AMD Radeon Navi leak with complete specs

videocardz.com/80966/amd-radeon-rx-5700-xt-picture-and-specs-leaked

discuss

Attached: AMD-Radeon-RX-5700XT-Navi-Specifications[1].jpg (1917x900, 372K)

Looks like some incremental shit

At least it's something new, but I don't care, and it wasn't worth waiting for

no one gives a shit unless the price is confirmed
if the RDNA architecture doesn't get proper software support it's going to get buried

Attached: 586977212046114821.gif (75x75, 271K)

Saved you a click

Attached: Screenshot_20190609_144013.jpg (1080x1338, 375K)

AdoredTV said that one card would be 40 CU

thoughts?

Attached: 2019-06-09 15_41_40.jpg (1982x1371, 287K)

that's really bad. Like, I'm gonna be real: I've been waiting for Navi for over 2 years, man.

>15% vega 64 upgrade for $250
[x] doubt

It's an RTX 2070 competitor like everyone expected. Hope the power draw and price are low or it's bad news for AMD going forward.

Wow, it's nothing

So basically Full Navi has less Tflops than GTX 1080

Well that's pretty underwhelming

The slides look incredibly fake.
Stop falling for clickbait

GTX 1080 is 8.8 TFLOPS. Also, that's not the biggest Navi chip.

>RX 5700 XT FP32 compute: 9.75 TFLOPS
>RTX 2080 FP32 compute: 8.92 TFLOPS at base clock, 10.06 TFLOPS at boost clock.

I know floating-point throughput is not the be-all and end-all, but it's a rough estimate of raw performance. Vega 64 also had ~10 TFLOPS FP32, but it's more in 2070 territory in terms of actual performance
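For anyone wanting to sanity-check these figures: theoretical FP32 throughput is just shaders x clock x 2 (one fused multiply-add per shader per cycle). A quick sketch, using the shader counts and clocks from the leak and public spec sheets:

```python
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    # 2 FLOPs per shader per cycle (one fused multiply-add)
    return 2 * shaders * clock_mhz * 1e6 / 1e12

# RX 5700 XT per the leak: 40 CUs x 64 shaders = 2560, 1905 MHz boost
print(fp32_tflops(2560, 1905))  # -> 9.7536, matching the leaked 9.75 TFLOPS
# GTX 1080: 2560 CUDA cores, 1733 MHz rated boost
print(fp32_tflops(2560, 1733))  # -> 8.87296, the ~8.8 figure quoted above
```

The same formula reproduces the Vega 64 number too (4096 shaders at its ~1250 MHz base gives ~10.2 TFLOPS), which is why raw TFLOPS alone don't predict gaming performance.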

it will get support, since the new Xbox and PS5 will both be AMD-only, and we all know most games are ported from consoles, so yeah

>it's literally just a polaris++

fuck you amd

The fuck is "game clock"?

This is honestly the make-or-break moment. The raw FP32 perf is clearly there, but Vega's main downfall has been the difficulty of extracting as much of it as possible in an actual game. That, and Vega just blows at MSAA.

Attached: 674858568.jpg (1265x1416, 752K)

A more honest version of boost clock. Still fairly meaningless, because how high the clocks go depends on the game.

it means they had no choice but to clock it up as much as they can, despite the inefficiency, to stay competitive.

As in, at boost clock it'll likely go over 300W if TDP is 185~225W. Sounds worse than my 1080Ti.

My 144hz monitor is G-sync, pointless to use an AMD card?

>40CU 120W for Vega56 performance
unless you can undervolt the shit out of that card it's pretty lame, at least for me. It won't even be near 2070 performance even with an overclock (if there's any headroom left, given the already-high boost clock of 1905MHz).

So most likely there'd be a 5900XT and XTX.

Would be nice to pair with a Ryzen FX 16-core processor.

The throwback names actually work. A shame that both Intel and AMD shit on the Athlon and Pentium names. Can we have them back in the premium range again?

>AMD TFLOPs=Nvidia TFLOPs
yeah nah, can't compare those right now. Maybe AMD catches up with Navi, but I'm not very optimistic

It's not just Dirt 4, btw; WWZ was able to extract that much FP32 from Vega as well using the Vulkan API. Not sure what AA method was used, though.

Attached: p1rdeui66vs21.png (872x920, 62K)

I think this is the year I go full AMD. Goodbye Lynnfield and Fermi.

Those numbers are awful. It's 2060-tier if we believe the 1.25x performance boost. Also blower-style... at least Nvidia did a decent reference design

>Nvidia
>decent reference design
Literally a rip-off, $100 more expensive than MSRP, so they look good in the benchmarks.

super keen for this to come out so i can finally afford an rx 580

Thank you for the info user.

wow, so their new card MIGHT have underclocked 1080Ti performance.

Congrats, AMD. You're really sticking it to Nvidia.

all that time wasted.

>8GB cards
Are we still in 2016?

>1080p

TFLOPS are TFLOPS, you dumb fuck.

Similar story at 1440p, vega is just wiping the floor with RTX.

Attached: World-War-Z-2560x1440-Vulkan.png (805x809, 53K)

In my country the v56 can be had for $150-200 cheaper than the 2060, depending on what autismo RGB card you get.

Gaymers who spend months at their shitty retail jobs saving up for an overpriced build may want AMD to compete at the top end, but AMD is a much better deal at the low and mid end.

I use a cheap old gpu that is good enough for my image editing but I'm not sure why low to mid range AMD aren't more popular. The only explanation I can see is that the market is flooded with retarded zoomers.

The 2070 has literally 8 gigs and it works, and since the 5700 is designed to compete with the 2070, it's kinda stupid to put in 12/16/whatever just because

>All AMD Setup is bullshit.

>All that power from the 3950X wasted on being bottlenecked by the bloody 5700XT

I'm glad I didn't wait for navi and got a VII this past Feb

this.
The main difference is how those TFLOPS translate to real-world gaming performance. Vulkan seems to really like AMD GPUs, but in every other scenario Nvidia has the advantage

Vega 56 is a based video card for smart buyers. That thing can reach almost V64 performance with only a few tweaks; no need to go full retard and draw 400 watts to actually reach V64. Where I live I can get a Vega 56 for $240-260.

Well, I don't see how my GPU is going to be a bottleneck if I'm not a gaymer. If I were a gaymer I wouldn't be buying the 3950X in the first place.

>What is power consumption and price

>It's kinda stupid to push technology forward
Typical Intel/Nvidia mindset. 4GB is already obsolete and 8GB will follow soon.

>those prices
>actually believing anything adoredtv says after his ryzen 3000 "leaks"

The only Nvidia GPU better than the VII is the 2080Ti, but the price is Apple-tier

Have you tested at 4K resolution (native or superscaled) without AA? AFAIK Vega did unusually well there because of CMAA

>hurr durr it's not worth it if it isn't the fastest card on the market
are you actually retarded? as long as it beats nvidia in terms of price to performance it's a worthwhile product, even if it's just a midrange card

>UP TO
>UP TO
is this a joke? They are advertising the maximum performance possible instead of what it will get on average?

most of his price estimates have been super optimistic

>The only explanation I can see is that the market is flooded with retarded zoomers.
correct
most people think that if a company's top-end SKU isn't the best one, the entire brand is irrelevant.

>Tflops don't matter when vega surpasses memetx cards
>But it matters now!

weren't they showing off the 5600 during computex?

>b-b-but, muh gaymes!

Attached: oh no no no.png (867x344, 30K)

so a 590 refresh, great. And here I was hoping for something that could easily beat Vega 64 performance at a $250 price.

>super fake
fixed that for you

Lmao, slower than a Vega 56, 1080, and 2060, and barely faster than the OC'd 590 it's replacing
Where's the top-spec 14+ TFLOP stuff?
Fuck this, I'ma just softmod my 2080Ti to 15+ TFLOPS until 202x

Attached: 90FFBDEDC4D247D7B9F76BFAEB01662E.png (653x726, 49K)

but those werent gaming Tflops, dont you get it?

You can't give an average though, can you, since different GPUs will boost differently based on multiple factors: silicon "quality", VRM/PCB/power delivery, and temps that depend on your cooler, ambient, and case, among other variables.

Why the fuck would they go with a blower reference card when the Radeon VII reference cooler has multiple fans? Seems fake.

you'd lose G-sync, but it's still a 144Hz monitor then

How is it slower than the Vega 56 and 1080, you dumb retard? Could you please try to make a cohesive post for once in your life? Oh wait, you can't, because your IQ is below 60.

checked

>price and branding aren't subject to change

see
Not about the TFLOPS race; more about making it easier for game devs to squeeze out more FP32, even in the ancient DX11 that more than half the games out there still cling to.

yeah, nobody claims Vega is bad at compute.
It's games where it's terrible compared to Nvidia, both in performance and power efficiency.
So if your argument is to remove the key flaw in the architecture that is Vega, then sure, it's good. Just not at gaming.

where's the PCI power one?

It's slower than a 2070; it's barely midrange.
Eh, I don't really see these cards selling unless they are stupid cheap.
I hope they do, because Nvidia's low end is a joke and its midrange is pointless atm.
At the high end I don't expect Radeon Technologies Group to compete till 202x, when RT/DXR is actually doable at decent speeds, not hacked-together shit like RTX

so you can advertise up to X performance, but actually release a housefire and blame people for not cooling it enough?

I am so happy I bought a used 1080 Ti for cheap. Best decision I've made in terms of hardware.

After Ryzen 3000's mediocre pricing I have zero hope that this will be a sub-$400 GPU.

>what is an i9?

So how do you feel that AMD is literally doing what intel does then?

>$500 card
>midrange
Where did it all go wrong?

The pricing is similar to the previous gen(s). Did you unironically think they'd be giving away 8 cores at sub-100 bucks?

Same here; already put about 150 hours on it, which is a lot for me. Just didn't want to spend the extra $200 for a new Radeon VII

it's at least 30% faster than a 590 in terms of raw TFLOPS. Gaming performance per TFLOP will hopefully be higher due to the new architecture. Of course it's not gonna beat a fucking 2080Ti, because that's not the goal of this product; plus it's gonna be a fraction of the 2080Ti's price
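Rough check of that "at least 30%" claim, reusing the shaders x clock x 2 rule of thumb (RX 590 figures from its public specs, 5700 XT figures from the leak):

```python
# theoretical FP32 throughput: 2 FLOPs per shader per cycle
rx590_tf = 2 * 2304 * 1545e6 / 1e12  # RX 590: 2304 shaders, 1545 MHz boost -> ~7.12 TFLOPS
navi_tf = 2 * 2560 * 1905e6 / 1e12   # RX 5700 XT per the leak -> ~9.75 TFLOPS
print(navi_tf / rx590_tf)            # ~1.37, i.e. ~37% more raw FP32
```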

They aren't outright lying, bribing people, blackmailing, or any of the other shady shit; so... not literally like Intel. Nice try though.

Soooo it's a Vega 56/64

see
All AMD has to do is make it easier for game devs to crank more FP32 compute out of the cards without resorting to costly optimization work with low-overhead APIs like Vulkan. That's what Navi is trying to fix. If implemented correctly, a 40CU Navi would on average be on par with an RTX 2070 for like ~$300.

yeah, these new cards are probably going to be as disappointing as the Radeon VII: same price as their Nvidia equivalents while being 15% slower and having a higher TDP

>costly optimizations
>cost
>implying devs actually care enough to properly optimize their shitfests

>bought my 970 for $300 in 2015
>1070 was $379 on launch
>2070 was $500 on launch
just fuck my shit up

Attached: 1559861151944.gif (250x250, 458K)

even if it were 40% faster than an RX 590, that would make it barely RTX 2060-level performance on average

>Ryzen 3000 mediocre pricing
$500 12 core that beats intel's $1200 12 core isn't what I'd call mediocre but okay

So they are just doing some of the things you hate Intel for, just not all of them.
And you still love and praise them. Okay, lad.
No need to start coping and calling everybody who points out that AMD is doing something distasteful an Intel supporter.

which is perfectly fine if it's cheaper than an rtx2060

No - I have Nvidia and G-sync but I never turn it on because it adds latency. Whoever invented this shit is retarded. The best thing you can do is turn off all buffering, V-sync, G-sync, and FPS limiters, and make sure your refresh rate is running at 144+ Hz. AMD's GPUs would work just fine for me.

so basically a Vega 56
400MHz more
obviously at least 200-300MHz more headroom
TDP is questionable, but who knows
if they finally managed to fix color compression(why else would they use gddr6..) this shit could be really nice

Pretty much. Since devs be like that, AMD has to do a lot of hand-holding at the hardware level. Nvidia just figured that out earlier than AMD.

gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-AMD-Radeon-VII/4026vs4035
gpucheck.com/compare-multi/nvidia-geforce-rtx-2080-vs-amd-radeon-vii-vs-nvidia-geforce-rtx-2080-ti
oh shit, what are you doing?

so what about the 3800X

15% increase in perf for the same price isn't exactly exciting.
Sure, it's miles better than what Intel has been doing, but this sort of shit CAN be better.

This is, of course, without taking current market pricing into account.
You can easily get a 2600 for $150 nowadays.

>lets compare it to Intel

Even a turd is better when compared to a fucking Intel.

Been reading AMD leaks here for years. They are always fake. Stop encouraging journos to make up fake leaks

>No - I have Nvidia and gsync but I never turn it on because it adds latency.
The amount is really small, as I remember. Absolutely nothing compared to V-sync.
I really wonder what kind of setup you have where G-sync (or FreeSync) adds enough latency to feel perceptible.

you're right, that one seems pretty overpriced. probably not worth buying over a 3700x unless it overclocks really well

>if they finally managed to fix color compression(why else would they use gddr6..) this shit could be really nice
Vega has shit color compression?

AHAHAHAHAHAHAHAHAHAHAHAHAHA
The only one trying to indirectly shill for something here is you. Seethe harder.
This is shill tricks 101. First you deny everything wrong you yourselves do. Then you say your opponents are "literally" the same as you. And then you go on to say you never did anything wrong and your opponents were the monsters all along.
All companies do what AMD is doing. But not all companies, including AMD, do what your pathetic, lying Intel overlords do.

Vega felt like a stopgap to me:
the pro cards had most features enabled,
the gaming cards barely had them at all

i just hope that navi is actually good

>i just hope that navi is actually good
Me too, user, me too. I really want to support AMD, but I don't want to buy their stuff just because they're the underdog in GPUs.
I just wish they'd give me a good deal as a gamer.

Not exactly. Nvidia basically sandbags their figures.

tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572.html

>The GPU Boost rate is 1733MHz, and we’ve taken our sample up as high as 2100MHz using a beta build of EVGA’s PrecisionX utility.
>as high as 2100MHz

That's well over 10TF. Pascal's turbo pushes it well past the rated boost when the card is cooled sufficiently. Custom-cooler cards will probably sustain around 2000-2100MHz if they get good airflow. It's 2019 and I can't believe people still have no idea how Nvidia cards work.

Pascal was about 35% faster per core ("IPC") than Polaris: the 1070 is about 35-40% faster than the 590, which has similar raw throughput to a mildly turbo'd 1070. Vega was a little better than Polaris, sometimes(?). Navi/RDNA can potentially come close to Pascal, but Turing increases "IPC" over Pascal, so there's still a gap between RDNA and Turing.

tomshardware.com/reviews/nvidia-geforce-rtx-2080-founders-edition,5809-11.html
>1905/1815
In the best case a 2080 will turbo an extra ~11% over its base boost, to around 11.16 TF
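Putting rough numbers on that (assuming the 2080's 2944 shaders and taking its 1710MHz reference boost as the "base boost", with the 1905MHz observed clock from the tomshardware figures above):

```python
shaders = 2944        # RTX 2080 CUDA cores
rated_boost = 1710    # MHz, reference boost spec (assumed as "base boost" here)
observed_peak = 1905  # MHz, observed turbo clock quoted above

uplift = observed_peak / rated_boost - 1          # headroom over the rated boost
tflops_at_peak = 2 * shaders * observed_peak * 1e6 / 1e12
print(f"{uplift:.1%}")           # ~11.4% over rated boost
print(round(tflops_at_peak, 2))  # ~11.22 TFLOPS, in line with the ~11.16TF above
```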