Navi's RDNA architecture

>gpureport.cz/info/Graphics_Architecture_06102019.pdf

Of the 1.5x perf/watt improvement over Vega, 60% is attributed to the new architecture rather than the node shrink.
As in, if Navi were still on 16nm at the same clocks as Vega, it'd have around 30% better perf/watt than Vega.
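The arithmetic behind that claim, as a quick sketch (assuming the 60% share applies to the 0.5x delta of the 1.5x figure; AMD's slide only gives those two numbers):

```python
# Sanity check of the perf/watt attribution above.
total_gain = 1.5                  # Navi vs Vega perf/watt (claimed)
arch_share = 0.6                  # portion credited to RDNA, not 7nm (claimed)
arch_only = 1 + (total_gain - 1) * arch_share   # architecture-only multiplier
print(arch_only)                  # -> 1.3, i.e. ~30% better on the same node
```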

It is likely that launch drivers will leave a lot of performance on the table, as they'll need newly optimized shader compilers.

RDNA CUs are more different from Vega's ("GCN 1.4") than Turing SMs are from Pascal's, and Turing SMs changed quite a lot.

Attached: ApplicationFrameHost_2019-06-12_02-06-17.png (2048x1146, 1.82M)


it was worth the wait

Don't worry, it's still terrible compared with the 2070.

>inb4 it goes toe to toe with the rtx 3080 in a few years

I look forward to Just Wait™ing until the drivers are good.

You must have never gotten to experience FineWine™
>Buy 7970
>It's okay at launch. Much better than Fermi. You just couldn't know then that it was the best GPU ever made, relative to the time it was released.
>Kepler comes and is competitive, but then drivers make the 7970 you already got better than it and Kepler refresh
>More years later, it still runs all the new stuff great, thanks to improved drivers, games becoming more compute heavy, and it having enough VRAM, which Nvidia cards always skimped on
>Sell it to a miner for 1/3rd of what you paid, 6 years later

>Buy Ryzen 1000 series
>At launch, just okay. Better than $250 for 4c/4t stutterlake, even if averages are sometimes worse
>Next year, microcode upgrades greatly improve memory compatibility, and also improve IPC about 6%. Now it's even better
>Another year later, Windows Scheduler gives another up to 20% improvement
>Don't even need to upgrade to Zen 2 like you thought because it FineWine™'d

It's not that it's bad on launch. 7970 wasn't bad on launch, it was still better perf/$, perf/watt, even if drivers had weird bugs. Ryzen 1 wasn't bad on launch. It just gets even better with time.

Windows update "allegedly" fixed 1% fps drop while gaming. Wait for real tests and not just MS shilling.

Here comes the wait-and-see bullshit again...
Anyone remember Vega? I actually bought a card and put my money on the line. Yeah, the wait never paid out. Drivers never came. Features were never enabled. Just a half-assed, power-inefficient architecture that delivered sensible middle-ground performance and nothing more.

You are definitely meming if you think Vega didn't deliver.
V56 is like 1070 Ti performance and, when undervolted, runs at around 200W or less.
V64 is good too and trades blows with the 1080 depending on the game.
Yes, they do draw more power, but their performance was good.

Updated my X470 chipset, latest BIOS and the Windows 10 May Update. My 2700x is absolutely a monster. Everything is snappy and smooth and it already was very nice. Glad to see MS partnering more with AMD. New consoles are Zen 2 and Navi. Second hand market is really good right now and AMD is luring Nvidia into a price war. The future is bright!
Not to mention SSDs and RAM are getting cheaper! Can't decide on the 3800x or the 3950x.

10 Advanced Micro Dollars have been deposited into your account.

RDNA uarch is based

Attached: 84efe3a9-9957-44d7-903c-e2f269817c0f.png (1200x673, 862K)

I actually kinda wonder whether the performance/power improvements they gave in the presentation were for Navi, which is supposedly a GCN/RDNA hybrid, or for the future full RDNA architecture. I'd say it's a valid question because the slide with those numbers was about RDNA in general, not about Navi specifically.

Meanwhile I still use my win7 installation I did on 3570k now on my r5 1600 and everything was snappy all along. Too bad win10 requires so much tinkering, probably still laggy compared to win7.

and already the "just wait, fine wine" has started before the new GPUs have even released. Shareholders must be really worried about AMD stock prices.

They go with the Fine Wine™ bullshit while at the same time bashing Nvidia for bad drivers. How shit must AMD's drivers be if they leave that much performance on the table at launch?

Cope. :)

considering the CUs now use IF, it's gonna take a lot more time and NOT just drivers, sadly

this effectively means that each dual CU acts like a core and they're all basically in a crossfire situation

in a tl;dr version:

this is a Bulldozer GPU that is somehow faster

RDNA appears to have made optimizations and improvements over GCN across the board. I don't get why the power requirement is still higher than that of the competing GPU made on an inferior process.

What anandtech managed to get from RTG about the primitive shaders on vega:
twitter.com/RyanSmithAT/status/1138561780244869121

Attached: Graphics_Architecture_06102019-27.png (1920x1080, 1003K)

SIMD & Wave execution:
GCN: CU has 4x SIMD16; a Wave64 executes on a SIMD16 over 4 cycles.
RDNA: CU has 2x SIMD32; a Wave32 executes on a SIMD32 in 1 cycle.

LDS:
GCN: 10 Wave64 on each SIMD16, 2560 threads per CU; 2560 threads (1 CU) share 64KB LDS.
RDNA: 20 Wave32 on each SIMD32, 1280 threads per CU; 2560 threads (2 CUs) share 64KB LDS.

Shared cache:
GCN: 4 schedulers & 4 scalar units (4CU) share I$, K$
RDNA: 4 schedulers & 4 scalar units (2CU) share I$, K$
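Those occupancy numbers check out; a quick sketch from the figures above (these are slide numbers, not measurements):

```python
# Threads resident per CU, from the wave/SIMD counts listed above.
gcn_threads = 4 * 10 * 64      # 4x SIMD16, up to 10 Wave64 each -> 2560 per CU
rdna_threads = 2 * 20 * 32     # 2x SIMD32, up to 20 Wave32 each -> 1280 per CU

# Cycles for one wave to issue across its SIMD:
gcn_issue = 64 // 16           # Wave64 over SIMD16 -> 4 cycles
rdna_issue = 32 // 32          # Wave32 over SIMD32 -> 1 cycle

print(gcn_threads, rdna_threads, gcn_issue, rdna_issue)  # -> 2560 1280 4 1
```

Same 2560 threads per dual-CU pair either way; the difference is single-cycle wave issue.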

because AMD won't ditch the hardware scheduler no matter what (there is no way of emulating async compute, especially under a heavy load, on a software scheduler; Nvidia in WWZ is a prime example of it)

remember how the power dropped like a stone from maxwell to pascal?
the 780 Ti has a similar TDP to Navi, too

Could've gotten the same perf/w back in 2016 with the GTX1080

anyone know how anti-lag works?

Are you retarded?

nvidia.com/en-us/geforce/news/rage-2-game-ready-driver/

Nvidia OBLITERATES AMD in WWZ now with a single driver release

kek you fell for raja's poor volta
vega is trash, there is no defending it. v56 is mostly fine, but mediocre. v64 is too slow to command its msrp when the gtx1080 exists.

160+18% is still SLOWER than amd

Attached: p1rdeui66vs21.png (872x920, 62K)

>No HDMI 2.1
>No VirtualLink
>No Variable Rate Shading
>No Ray Tracing
>225W HOUSEFIRES on 7nm finally reaching GTX 1080 performance from 3 years ago

OH NO NO NO NO NO NO NO NO NO NO NO

AHAHAHAHAHAHAHAHAHAHAHAHAHAHA

>no vrs
AMD had a patent on it long before Nvidia even had a working prototype of it
freepatentsonline.com/y2019/0066371.html

pretty sure navi will have it

anandtech.com/show/14528/amd-announces-radeon-rx-5700-xt-rx-5700-series/2

>With a single exception, there also aren’t any new graphics features. Navi does not include any hardware ray tracing support, nor does it support variable rate pixel shading

KILL YOURSELF FAGGOT :^)

what is all this meme shittery you pretend to care about

That's ok user, all those are useless and shit.
Next year when AMD finally starts to support them they'll be groundbreaking must-have features though.

>+18% for 2080ti
>+14% for 2070
>+12% for 2060

oh yeah they barely reach amd let alone beating them LOL

>doesnt support it

that's weird considering it's a software feature; I wonder if the drivers support it. He'll just have to eat his hat.

>225W HOUSEFIRES on 7nm finally reaching GTX 1080 performance
the only valid point here, jensen.

If Poovi supported it, AYYMD would have stated it in the PDF. It doesn't, and it's not software; it's required to be supported in hardware.

devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/

I'm more curious about their mid/low-end tier GPUs desu. I have an Nvidia GTX 1050 Ti and I want to see if AMD can match its performance with a new GPU.

are they even going to make a mid / low end?

250mm2 used to be mid-range for $300-330.

what the fuck made you buy that over an rx 570? shitty prebuilt/low profile?

it was cheaper; the 1050 Ti cost 140 while the RX 470/570 was almost 100 bucks more expensive, IIRC. It's been so long, but money definitely was the deciding factor.

hello retard, it's me, reality
AMD is lacking in SOFTWARE

Attached: Untitled.jpg (2560x3300, 1009K)

so what does this mean?

And give nvidia my money? Why would I do that?

You are a retard and it shows

Navi does not support VRS period

>navi cant do pixel rasterization on hardware

jesus fucking christ, Jow Forums doesn't have any sense of technology

Because I would've had the performance your shit company is trying to sell me now for 3 years.
You won't be receiving dividends from me, Jow Forums AMD shill.

Navi does not support VRS and this has been confirmed by many tech websites

Can't get any simpler than that but you insist on arguing about it like a faggot

Super SIMD when?

Navi LACKS software, you moron.
The HARDWARE is there; rasterization of pixel shaders has existed since the 2xxx series.

>As in, if Navi were on 16nm still at the same clocks as Vega, it'd have around a 30% improved perf/watt over Vega.
They should have shoveled out tons of them cheap using 12nm or 16nm. A Vega 64-sized chip would probably be decent and still cheaper than 7nm.

>Could've gotten the same perf/w back in 2016 with the GTX1080
In reality the 1080 is 7% slower than the 2070, and the 5700 XT is 6% faster than the 2070. The 5700 XT is Vega 64 +15% perf.

Attached: GTX1080.png (1324x1664, 71K)

Based. Keep dabbing on the AMDrones here, most everyone hates these cards.

For real though, I think navi is gonna be pretty okay desu. This is coming from a GTX 1070 owner who wants to upgrade.

If we take a look at the cheaper $379 36 CU Rx 5700 you'll notice how it wipes the floor with the RTX 2060 by about 10% on average. Does that put it closer to the RTX 2060 or 2070?

Because if it gets really close to the 2070, say 95% as fast, then it properly competes with it in my book. That's not taking the driver updates into account, like OP said, btw.

Attached: amd_radeon_rx_2-100798966-orig.jpg (1999x1119, 221K)

Personally I really look forward to essentially a ~doubling of FPS at 1440p with the 5700XT from my 1070 without having to pay the RTX tax of the 2080.

Attached: relative-performance_2560-1440 (2).png (500x810, 47K)

>hurrrrr double the framerate
what is the actual improvement in frametimes though

Also remember the $499 price tag was confirmed fake; the Rx 5700 XT will be $449. That's a pretty sweet deal if it gets within ~95% of RTX 2080 performance, which retails for ~$800 on average.

wccftech.com/amd-radeon-rx-5700-xt-7nm-navi-rdna-gpu-official-launch/
nowinstock.net/computers/videocards/nvidia/rtx2080/

I'll be honest here: I EXPECT AMD to fuck up day 1 drivers, but that's okay in my book too because this is what Polaris and Vega were like at launch. AMD does all the hard grinding with the hardware but then buffs everything out later with drivers.

I've been waiting 3 years for an upgrade and I don't mind waiting another few months after I get my 5700 XT for frametime problems to be smoothed out (probably just the power limit, which can easily be fixed with a slight UV).

>~95% within RTX 2080
more like Radeon VII

Which is 90% as fast as the 2080.
The 2070 is 80% as fast as the 2080.
So throw 5% on top from driver updates and you have 95% of 2080 performance in the end. Not bad.

>1070 user
>upgrade to 5700
well, suit yourself. i don't see a reason to do so. 1070 is fine for 1440p (exactly my setup right now) but not high refresh rate

the only game that runs bad on it right now is shadow of war on ultra
everything else is pretty good at 1440 on 1070

>The 5700 XT is Vega 64 +15% perf.
At practically the same price of Vega 64.
Horrible pricing.
349.99 would have been a lot better.

Nah, the 5700XT. I have some IC Diamond tubes lying around, which I plan to use to lap the heatsink and then as TIM, to OC as high as possible. My 1070 only went up to 1720 MHz, but maybe I'll get a lucky 100-200 MHz OC with Navi.

Also seeing how big of a gap there is between the base and boost, there seems to be a lot of untapped performance on this little demon. Just needs more juice and better cooling, OC will just be icing on the cake.

Attached: 1560200869974.jpg (1000x561, 151K)

That's what Nvidia has been doing since Maxwell. That's why Radeon cards were mostly slower than Nvidia's unless the game was optimized for the GCN arch.

>7970 wasn't bad on launch, it was still better perf/$, perf/watt
It was literally so bad at launch, and getting its ass kicked by the 680 a few months after, that AMD pushed out the factory-OC GHz Edition refresh; that's exactly where the WHY DOES AMD OVERVOLT THEIR CARDS, IT'S OUT OF ITS COMFORT ZONE meme comes from.

If you want to know how GCN should actually be clocked look no further than 7970 vs 7970 ghz edition

I see the slides; AMD didn't lie, this time they really changed the architecture.
No wonder 40 CUs are outperforming 56 CUs, and not only because of the higher clocks.

I told you guys it was a wavefront saturation problem

More like 20% if it outperforms the 2070 by 10% which by itself is 10% faster than V64. Right?
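Compounding those two claims multiplicatively does land around 20% (both 10% figures are the posters' estimates, not benchmarks):

```python
# Chaining the two relative-performance claims from the posts above.
xt_vs_2070 = 1.10     # claimed: 5700 XT ~10% over RTX 2070
r2070_vs_v64 = 1.10   # claimed: RTX 2070 ~10% over Vega 64
xt_vs_v64 = xt_vs_2070 * r2070_vs_v64
print(round((xt_vs_v64 - 1) * 100))  # -> 21, i.e. roughly 20% over Vega 64
```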

navi is power locked like nvidia now. it won't go above 225w TDP spec.

vrs is shit.

I doubt it

Attached: amd-rx-5700-xt-die-size.jpg (1100x619, 120K)

I ignored everything gpu since vega, what is VRS?

AMD bad drivers?

At least I don't have to step into my DeLorean and accelerate to 88MPH over 30 fucking seconds, just to change some settings.

Source? Because aren't they releasing a 2GHz XT with a "225W TDP", or was that fake?

Attached: Wd9ylDhYzwU1xRZW.jpg (2165x1126, 286K)

>Vega 56 - 495mm2 = $399
>5700XT - 251mm2 = $449
hmmmm.

Attached: 31-Deep-life-quotes-that-will-make-you-think.jpg (960x720, 89K)

better-binned limited edition for $500; also, it's 1980 MHz

ah yes, the source is GN; they asked AMD directly about unlocked BIOS, it's all locked down now
I think it's a good thing, retarded journos won't show you 500W power draws anymore

Yeah, Navi is fucking epic. I bet nvidia doesn't even bother reacting.

it shades parts of the screen at a lower rate to achieve higher fps

It is everywhere. That is really sad, because overclocking Radeon is really fun. Power tables, modded BIOSes: all of it is now gone.

shut it down, don't let the goyim know!

Attached: prices for 5700XT.jpg (666x174, 58K)

fuck that. nvidia already makes games look worse with aggressive culling

The only thing wrong with Navi is the price.

What about wattman's power limit offset thingie? Not even like 10%. I'm starting to reconsider navi desu. I just wanted rtx 2080 perf without the rtx price tag.

GTX 970 - 398mm2 = $329
GTX 1070 - 314mm2 = $379
RTX 2070 - 445mm2 (120-130mm2 of it is cheap tensor cores) = $499

Attached: 952.jpg (1280x720, 145K)

>thinking the yield would be 70%
You people just blindly parrot made up bullshit from AdoredTV. The Zen2 chiplet won't even have yields that high, let alone a fucking 200mm2+ dense logic IC.

AMD goes full SIMT, this solved a lot unused capacity, and making better for future Raytracing hardware.

Coming from a 970, how big of an upgrade would the 5700 XT be?

x2

The 5700 XT is massively faster than Vega 56, and ~15% faster than Vega 64 which launched at $499.

There would be no reason to put an 8 pin plus 6 pin on the 5700 XT unless they were going to allow significant power offsets in wattman. They had to lock the BIOS for DRM reasons.

it's plain greedy to raise the price for increased performance between generations, as that improvement is expected to happen
however, as all the nvidiots cheered for that, I say it serves them right not getting the benefits of a price war
right, there might be some massive OC headroom:
>75 Watts: None
>150 Watts: One six-pin connector
>225 Watts: Two six-pin connectors
>300 Watts: One eight-pin + one six-pin connector
>375 Watts: Two eight-pin connectors
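That list is just the usual PCIe budget of 75W from the slot plus 75W per 6-pin and 150W per 8-pin; a sketch (these are spec ceilings, not what cards actually draw, and `board_power_limit` is a made-up helper name):

```python
def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """PCIe-spec power ceiling: 75 W slot + 75 W per 6-pin + 150 W per 8-pin."""
    return 75 + 75 * six_pins + 150 * eight_pins

print(board_power_limit(2, 0))  # -> 225, two six-pins
print(board_power_limit(1, 1))  # -> 300, the 5700 XT's 8+6-pin layout
print(board_power_limit(0, 2))  # -> 375, two eight-pins
```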

>Using yield numbers from a literal scam artist who makes shit up to grift NEET bux on Patreon
Lmao

>chiplets half the size of the previous ones, which had insane yields, somehow won't have the same or even better yields now

Jow Forums ladies and gents

7nm is more expensive than 14nm.

>underage retard kid thinks he understands the semicon industry
14LPP is not 7HPC, you retard. 14LPP is not an immersion process; TSMC's 7nm node is. It has radically more litho masks and more multi-exposure, and its complexity is exponentially higher.
Die size alone does not determine defect density, you retarded little kid. TSMC's 7nm lines will never produce high yields.

>Another clueless dumbfuck believing Scottish NEET Patreon bux shill lies
Stop listening to people who make up everything they say to scam money out of dumbfucks like you.

what's your point?
anyway, AMD priced the 5700 XT and 5700 so badly that it looks like intentional sabotage. I have no idea why they think they can be greedier than fucking Nvidia, who are hated far and wide for their greedy margins

>>thinking the yield would be 70%
First off, that is the worst-case scenario, not the best; all those components are taken at their highest possible prices.
Second, it's not AdoredTV, dingus.
>The Zen2 chiplet won't even have yields that high, let alone a fucking 200mm2+ dense logic IC.
You have no idea what you are talking about. Then again maybe you do and just have stocks in AMD.

so? the 970 was massively faster than the 770
and it didn't cost an arm and a leg
the 390 was barely faster than the 290 and they still asked 970 price plus a premium.
it didn't cost arm and leg
390 was barely faster than 290 they still asked 970 price plus premium.