Why is nobody talking about the new RX 5000 line?

Attached: 66056_10_amd-announces-next-gen-radeon-rx-5000-series-graphics-cards.jpg (620x346, 18K)

Because it's underwhelming since the Radeon VII is their flagship and barely keeps up with the 1080 ti.

>Why is nobody talking about a product that isn't commercially available yet?
wait till july

5600 pls

Attached: 1557610109098.png (753x479, 114K)

Because it will be a failure, 101%.
So we focus on the good thing instead (the CPUs from AMD).

Gonna bet the 1080ti will still be like 25% faster in gaymes than this overpriced flagshit

It isn't the time yet.

Attached: a fortnight.gif (688x46, 534K)

To talk about it we need a price.
For AMD to sell volume, the 2070 alternative needs to be $399. Unrealistic, but one can dream.
Otherwise people won't care for 10% perf above the 2070 and will just pay $50 more for RTX memes.
Let us hope Wang got the touch.

Attached: 2013-09-24_00016-100056137-orig.jpg (1920x1080, 301K)

hi

Attached: ln.png (657x527, 45K)

Navi will get Meme Tracing, too.

FPBP
They could very well have released Poolaris++ and it wouldn't have had any less of an impact. AMD's gotta pull their shit together with RTG.
RDNA my ass.

you think they embedded something in the arch to speed up DXR and alternatives? doubt.

because it's shit.

Intel will have better GPUs than even Nvidia currently does.

what do we even know about it besides it beating the 2070 by 10% in 1 test

there's not even enough room on the die for it

new arch (massive CU redesign)
has an L0 cache

>2070 performance
>$399
>dream
holy fucking shit are you retarded
$399 is massively overpriced for a 2070 alternative
anything more than $200 for it is overpriced
you got adjusted to the inflated nvidia pricing
we need to go back to the normal pricing

sure, but we have absolutely no details on what they changed about it

there is no need for special hardware for it. it's an nvidia gimmick to sell cards, just like gsync
DXR is a DirectX extension, it's perfectly conventional; if they optimize for it, normal cards can run it fast
vega 56 already runs reflections fast in that recent demo.
youtube.com/watch?v=1nqhkDm2_Tw

this, it's a mid-range chip

VII was a stop gap

RDNA isn't compute-focused anymore though, so I doubt it

That's bullshit, mark my words. RDNA is just GCN by good ol' AMD ReBrandeon Technologies Group

at this point i'll be happy to get mid range for $400. the chip is small though, maybe AMD comes to its senses; it shouldn't cost more than $400 at that size.
$200 is being very unrealistic considering the die size.

Waiting for details at E3.

i hope rdna is gcn with 8 shader engines

so not gcn

lisa said that navi would be in the datacenter, enterprise, compute, enthusiast, etc.
they did it once with gcn, the chances of amd trying to make a jack of all trades again are not 0%

we don't know shit about how real this RDNA deal is. rumors prior to the reveal suggested it's a substantial change, at least it got better rumors than vega did.
We must not forget navi goes in the PS5. that's a huge deal, contrary to everyone's beliefs.
GCN defined graphics for the last 6 years by being in the PS4. So there must be more to it than just an overclocked GCN.

no she clearly said:
vega stays for datacenter for a while longer
navi is gaming oriented.

oh
that's what I get for trying to watch the stream at 5 AM while still sleepy

Should of called it the RX-79

Attached: avatarchar.jpg (100x75, 16K)

AMD isn't in a position to undercut nvidia's offerings. Their investors want profits. Those ridiculous prices will not change unless nvidia lowers them, and that's a good thing for AMD. If nvidia didn't charge so much they wouldn't be able to sell their polaris gpus (well, to gamers at least).
Their prices are inflated, but it's a given that AMD's offering would be slightly cheaper for the same performance.
Since prices on new products will not drop anytime soon, I think the best decision is to buy the big navi at its EOL when discounts drop. That's if you have a decent enough card to wait. Otherwise grab a vega56 or a used 1080ti; rtx and navi are going to be inflated shit.

this is exactly the time to undercut. while they have some trust accumulated with ryzen strategies.

>of

Why are you lying? AMD has made it clear Navi would be mid-range cards and the VII was their flagship, which they decided to release first.

Not like this is a English class.

Attached: cheekycunt.jpg (349x413, 28K)

This, sadly. In order to match the profits of previous years, they must ask more money for those stoves because there's less demand.

want to see actual benchmarks and some info on RDNA.

it's not a flagship you retard
navi is a mid-range line with performance around the RTX 2070

is this the new intel cope?
>a-at l-least we'll have GPUs i-intelbros
I've seen it in other threads as well

graphics need a third player.

>Navi would be mid-range cards
yes
>VII was their flagship
no

Not necessarily.
The technology is sufficiently complex that it's extremely expensive to enter the market for very dubious returns.
What market share does Intel hope to capture?
Nvidia has an iron grip on scientific computing and machine learning due to CUDA, which everyone de facto standardized on.
The gaming market is likewise heavily slanted towards Nvidia due to their rabid fan base. Even when AMD/ATI was consistently better for about a decade most people still bought Nvidia. And that's ignoring the elephant in the room that is game optimization. How exactly does Intel intend to optimize their drivers for a decade of backlog in games? They won't gain much market share if their GPUs perform like crap in every game older than, say, 2018.
I just can't see any room on the market for Intel, or ANY third company for that matter. Some markets are like that - compare the browser market where it's either Firefox or Chromium clones because of the insane complexity of modern web standards and no gap in the market to target.

>Only nvidia can sell gpus for money!!!
Suck my dick, I will buy Navi even if it costs that much.

Because they're the second-best graphics company. Everyone knows this.

>For AMD to sell volume, the 2070 alternative needs to be $399. Unrealistic, but one can dream.
I was thinking 300$ myself and would accept 350$.
At 399$ it becomes very questionable for me.
In any case I will wait for shit ton of benchmarks on games and real world prices to make a judgement.

>why do you hate nvidia
>THE PRICES ARE OUTRAGEUS, THEY SHOULDNT SELL THEM THAT HIGH!
>so youd be willing to buy products from AMD that are priced similar?
>YES OF COURSE, AMD NEEDS TO MAKE MONEY TOO!!

It's scary that there are people out there who think like this.

but if sapphire didn't lie, amd sells them $100 cheaper

Because Almost-As-Good-As-What-The-Competition-Has-Sold-For-6-Months is not particularly exciting?

100$ cheaper than the MSRP of a card that came out over 6 months ago. While having no ray tracing tech.
Again, by the time navi releases, I will compare the prices of gpus on the market available to me.
And I hope they will release a COMPETITIVE MSRP for navi.
Because here shops buy cards and pay plenty of money for them, so they don't put them on discount, EVER; even the 970 is still selling at MSRP here right now.
So if they price navi high, but "we will lower the price later if needed", then essentially it will always be pricey for me and I will have no reason to get it.

Just once, JUST FUCKING ONCE. I wish AMD didn't do their bullshit dance, and just gave us a 2070 at 300$ so everybody could just buy them and they would gain massive popularity and PR boost.

But we already have AMD fanboys defending them
>muh amd needs to make mad profits too
>nvidia does it they are bad
>but if amd does it it's fine
and so on.

>While having no ray tracing tech.
you say this like anybody actually cares.
agree on everything else.

The ATI/AMD merger was a mistake

It's amazing to see there are still people who fell for the meme tracing bullshit. I wonder whether the same people also believe DLSS is great, despite the fact that it doesn't work.

>you're next Huang
is this real? if so then this is a very bold claim

>you say this like anybody actually cares.
AMD does, considering they mention all the time how they will include it in their navi. Not the navi that's coming out now, because they didn't manage it, but the navi that is coming out IN A YEAR.

Don't know why they'd even bother. Should instead call out the bull.

>Don't know why they'd even bother. Should instead call out the bull.
So nvidia is the bull, and the customer is the woman. Does that make AMD an impotent cuck in this story?

>NAVI
HEY LISTEN!

I don't hold any loyalties. AMD would easily become the bull if they ever got ahead.

yeah, any day now.... I have been waiting for them to make a ridiculously good GPU for years now so that everybody could just buy it.
Not the same performance 2 years later at a slightly lower price or some shit like that.

that doesn't seem right
didn't they say navi has 25% higher performance at the same clocks / 50% for the same power?
sounds to me like vega is already semi-deprecated
they are also on the same node from the same manufacturer, yields and frequency shouldn't be too far off

Not enough info about the new architecture yet. On June 10 we will get an idea of how different RDNA is from GCN, whether the rumor of 8 shader engines is true, and whether the rumor that Navi is Super-SIMD is true.

At first glance they probably are true, because those are the changes you'd need to make to gain 25% performance per clock plus 20% frequency at ISO power on 7nm to get a 50% perf/watt improvement.
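
A back-of-the-envelope check of how those two rumored numbers combine into the claimed figure (all inputs here are the rumored/claimed values from this thread, nothing confirmed):

```python
# Sketch: combining the rumored RDNA gains into a perf/watt figure.
# Both inputs are rumors/marketing claims, not measured numbers.
ipc_gain = 1.25    # ~25% more performance per clock vs GCN (claimed)
clock_gain = 1.20  # ~20% higher frequency at the same power on 7nm (claimed)

perf_gain = ipc_gain * clock_gain
print(f"perf at iso power: {perf_gain:.2f}x")              # 1.50x
print(f"perf/watt gain:    {(perf_gain - 1) * 100:.0f}%")  # 50%
```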

We don't know shit about it except the "RDNA" uarch that might be just GCN. AMD will reveal full details at E3 on June 10th which is too soon for even the greediest tech youtubers to make videos including extravagant claims.

i don't get it, if rdna has 8 shader engines they should be able to make a 2080ti competitor or at least a 2080 competitor. don't know why they decided to start with midrange

because 90% of the sales are in that part of the market you nvidia fanboy retard

never had a nvidia gpu but sure

Fabbing on TSMC 7nm HP isn't cheap, so I assume they wanted to make and sell lots of smaller mid-range chips before going for a big chip design.

i know mid range cards make more money, but i thought they would make a real flagship for marketing purposes even if it's gonna go out of stock immediately

Because it will be shit like the Vega lineup: oversized heaters that perform like shit and cost too much for what other heaters cost.

>investors want profits
All the raw profit to be made already has been; their stock is not going to double in value short term at this point. They are the only large tech company that does not pay dividends. They are attempting to grow revenue to the point where their historical debt balance is an acceptable floating expenditure (an absolutely fine business tactic given their growth and opportunity, but still). AMD has nowhere to go but 10% up YoY until Intel gets their shit together, and regardless of the public sentiment saying otherwise, Intel is going to get their shit together and hit back brutally in the next couple years.

Investors want profit, no fucking shit. How is that continued growth that attracted everyone going to hold in the next 3 years? How much of the billions of debt will they pay down? How much of the profit are they forced to split with their foreign holdings? Will they ever share dividends to stockholders?
The biggest question of all:
How viable are they in competition 3 years from now, considering historically they've sunk their own boat in spectacular fashion every time they've had an advantage?

Athlon64 stagnated, Phenom was broken, they lost their fabs (and were then disastrously beholden to the company they sold them to), they purchased dead GPU IP, they gambled and lost hard on CMT, and were only competitive with Nvidia from early to late 2011.

Zen has only been good because Intel shit the bed pushing 10nm too hard and too risky.
After Intel spends their ludicrous cash reserves fixing their bullshit and releasing that new arch on 7nm (10 is being abandoned and relegated to small chips) it's going to be exactly like Core/Core2 stomping on AMD while AMD went stagnant.

Exactly how much of that Intel cash reserve has been spent in the past 5 years on marketing, fruitless corporate "enrichment" programs, and said 10nm money pit? I doubt that either Intel or AMD has much cash in reserve after the nearly decade-long decline of the desktop silicon market.

Intel held 4 times as much raw cash as AMD's total revenue in 2018.

>10nm too hard and too risky.
not a silver bullet, can't win vs chiplets.

>desktop
who cares. datacenter is gigantic still growing market.

>Intel is going to get their shit together
When their entire performance gap was caused by sloppy as fuck quick hacks that busted their entire series from first to current gen wide open?
I doubt it.

but there's less demand precisely because they're asking for more

>and regardless of the public sentiment saying otherwise, Intel is going to get their shit together and hit back brutally in the next couple years.
with what? 10nm is a failure, 10nm"+" will actually be a 12nm analogue with worse clocks than their hyper-refined 14nm+++ node, their current architecture is a swiss cheese of massive mistakes. It takes 3-4 years for a new architecture to come to market and they're just doubling down on "refined pentium 3" by all accounts for the next several years.

It's just a replacement for the rx580, but at the price and performance of a 2070. When it should be the performance of a 2070 at the price of a 580. Amd dun goofed.

Well I don't give a rat's ass about the cards themselves, but a new uarch is always interesting. However, since we know nothing about it there's not a lot to talk about yet.

However I might tune in to E3 for the first time in years again.

I'm dying to know whether the 8 shader engines rumor is true. It would be a really big deal if it's true.

It's a new name on a modified old architecture.

Compute cut out.
Smaller die.
Power efficient.
Lower clocks with better performance.

There, I summed up the 5000 series until the E3 conference.

instead it will be the performance of a 2070 with the price of a 2070, so still not great, but it's another option

>Lower clocks with better performance.
this implies major uarch changes though

It's being heavily eclipsed by Ryzen.
The RX5000 is like a "eh, it's a nice option to nvidia", while the Ryzen is like "HOLY SHIT, IT BLEW THE UPPER PART OF INTEL OFF!"

businesswire.com/news/home/20190529005766/en/PCI-SIG®-Achieves-32GTs-New-PCI-Express®-5.0

>PCI-SIG Achieves 32GT/s with New PCI Express 5.0 Specification

AYYMD POOVI HOUSEFIRES GARBAGE, OBSOLETE EVEN BEFORE RELEASE

It already does, just cause they are not gaming GPUs don't mean they ain't real. Intel is too late in the game.

>GPU use nvidia pricing
>CPU uses Intel pricing

Why is amd being a Jew now that they have market share?
>gain market share for being cheaper and providing more
>have marketshare fuck being cheaper just provide more.

I ain't upgrading from my 1070 until something similarly priced gives me at least 50% more frames, which this new RX series most likely won't.

Attached: 1518687503862.webm (1280x720, 2.72M)

Ah yes, let's buy the Nvidia and Intel GPUs that support PCIE 5.0 alr... Oh yeah, they also don't. Nobody currently has any equipment on the market for PCIE 5.0.

Also, let's be real. GPUs don't even saturate PCIE 3.0 x16 yet. The reason you might want PCIE 4.0 is that you can limit your gpu to 8 lanes so you can use the other 8 lanes on something else. And there still aren't any consumer boards that support 4.0 except for X470 and B450 boards, which I coincidentally already possess.

I only need to upgrade my cpu to get PCIE 4.0, whereas I need to wait years for any 5.0 equipment to become available and would need to buy a new motherboard and cpu to get it.
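
For reference, a rough sketch of why 4.0 x8 is essentially the same pipe as 3.0 x16, and what 5.0's 32 GT/s works out to (back-of-the-envelope numbers, ignoring protocol overhead):

```python
# Approximate per-direction PCIe payload bandwidth.
# 128b/130b encoding applies to PCIe 3.0 and newer; protocol overhead ignored.
def pcie_bandwidth_gbs(gt_per_s: float, lanes: int) -> float:
    """Payload bandwidth in GB/s for a given transfer rate and lane count."""
    return gt_per_s * lanes * (128 / 130) / 8  # bits -> bytes per lane, x lanes

print(f"PCIe 3.0 x16: {pcie_bandwidth_gbs(8, 16):.1f} GB/s")   # ~15.8
print(f"PCIe 4.0 x8 : {pcie_bandwidth_gbs(16, 8):.1f} GB/s")   # ~15.8, same pipe
print(f"PCIe 5.0 x16: {pcie_bandwidth_gbs(32, 16):.1f} GB/s")  # ~63
```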

This news truly is the end for AMD.

I hope Navi is good for ETH mining.
My 2x 12GPU 290x rigs are getting pretty bad at this point.
Those 290x are making the most horrible coil whine I've ever heard
I need something like a rx580+30% hashrate at lower TDP with a $200 price tag or so.

Attached: 1557830445709.jpg (992x565, 173K)
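
A minimal sketch of what that target works out to in MH/s per watt, assuming a stock-ish RX 580 does roughly 30 MH/s at around 130 W on ETH (rough community figures, purely an assumption here):

```python
# Hypothetical mining target from the post above: RX 580 hashrate +30% at lower TDP.
# Baseline numbers are assumed, not measured.
rx580_hashrate = 30.0   # MH/s, assumed
rx580_power    = 130.0  # W, assumed

target_hashrate = rx580_hashrate * 1.30   # "+30% hashrate"
target_power    = rx580_power * 0.80      # "lower TDP", assume ~20% less
print(f"target: {target_hashrate:.0f} MH/s at ~{target_power:.0f} W "
      f"({target_hashrate / target_power:.2f} MH/s per watt vs "
      f"{rx580_hashrate / rx580_power:.2f} for the 580)")
```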

>tfw no rx 5800 because it is faster than vii.
>the city of vega

Or just more emphasis on shader performance instead of trying to balance the ALUs for compute.

>everyone thinks the new arch is actually just GCN with a new name
>we don't know jack shit about it
>the only thing they showed at computex was some radeon favoring software where it barely kept up with a 2070
There's really nothing to talk about until E3 when they give more info. I don't think anyone but the biggest fanboys are actually going to be excited about it though. To AMD's credit it isn't easy to fight a war on two fronts against two companies that are bigger than you, and they managed to get ahead on the CPU front so that alone is commendable. But chances are the GPU is going to be hot garbage.

If Intel actually goes forward with GPU R&D, it might be better for AMD to just shut down their Radeon division so they can concentrate on CPUs.

Dumb. All they need is an MCM GPU that's power efficient. They've been pouring money into it since even before Raja left.

Honestly it's really sad seeing the state of nu-Jow Forums. There appears to be no one left capable of applying basic logic, inference and analysis to publicly available information anymore.

Anyone trying to talk about how good or bad Navi performs that doesn't even try to engage with the central question of how many CUs it uses to get that performance, at what clocks, at what TDP, at what price is just wasting space.

*IF* Navi is 40CU, *IF* Navi is 225W TBP, *IF* Navi is 2070+ performance, *IF* Navi is $399, then it's a huge step forward for RTG and proves they have achieved their claimed 1.5x perf/watt.

To whatever extent Navi uses more CUs, more TBP, has worse than 2070 performance, and costs more than $399, then it's far less of a step forward.

But really, the most important question for the future of the consumer GPU market is whether it has 8 shader engines or not. *IF* it has the rumored 8 shader engines, then RTG will be able to get back into the high end consumer GPU game next year (Because 8 shader engines means the ability to throw 80+CU on 7nm HP at the problem in the worst case scenario).
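
One way to sanity-check that 1.5x claim against those *IF*s is to put them next to Vega 64 (roughly 295 W TBP). The relative-performance figures below are loose assumptions for illustration, not benchmarks:

```python
# Hypothetical check of RTG's claimed 1.5x perf/watt using the *IF*s above.
# Assumptions: Vega 64 at ~295 W TBP, RTX 2070 roughly 10% faster than Vega 64
# in games, and Navi at 2070+10% for 225 W TBP. None of this is confirmed.
vega64_perf, vega64_tbp = 1.00, 295.0
navi_perf,   navi_tbp   = 1.00 * 1.10 * 1.10, 225.0   # (V64 -> 2070) * (+10%)

gain = (navi_perf / navi_tbp) / (vega64_perf / vega64_tbp)
print(f"implied perf/watt vs Vega 64: {gain:.2f}x")    # ~1.59x with these guesses
```

How close it lands to 1.5x depends almost entirely on how far ahead of Vega 64 you assume the 2070 actually is.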

Why is nobody talking about the new RX 5000 line?
Because my RX480 is good enough and I couldn't afford an upgrade regardless. Their CPUs are more interesting anyway.

Not enough info to go on.

The 8 shader engines matter so much less than a redesigned CU that it's laughable you think that's the point of this gpu.

And what exactly would you expect those redesigned CUs to accomplish on 48+ CU designs when they'd remain hopelessly front-end bottlenecked by the 4 shader engines? Breaking the 4 shader engine limit of GCN is a prerequisite for taking advantage of any improvements later in the rendering pipeline on high CU count cards.
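
To illustrate the front-end argument: the commonly cited figure for GCN is roughly one primitive per shader engine per clock, so the geometry ceiling is set by shader engine count and clock speed, no matter how many CUs you bolt on (the figures below are that rule of thumb, not official specs):

```python
# Rough illustration: GCN's front-end primitive throughput is set by
# shader engines x clock (~1 primitive per SE per clock, rule of thumb),
# so it does not move when CU count grows.
def primitive_rate_gprims(shader_engines: int, clock_ghz: float) -> float:
    """Approximate peak primitives per second, in billions."""
    return shader_engines * 1 * clock_ghz  # 1 prim/SE/clock

for cu_count in (40, 64, 80):
    # Adding CUs raises shader throughput, but the front-end ceiling stays put.
    print(f"{cu_count} CUs, 4 SEs @ 1.8 GHz: "
          f"{primitive_rate_gprims(4, 1.8):.1f} Gprims/s ceiling")
```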

Yep. Fucking boring as shit. The 1080Ti is 16nm, over 2 years old, and still just as fast as AMD's best 7nm GPU ...

VII is actually pretty solid if you undervolt it