Daily reminder that when a game is coded properly VEGA 64 is nearly as fast as a RTX 2080

Attached: untitled-1.png (722x909, 42K)

yeah but game devs are lazy, which is why so few of them use more than 4c/4t

Stop kvetching and buy an RTX card already, faggot.

Attached: gay-frogs-all-along.png (500x492, 386K)

whoah...

Attached: relativeperformancegames19201080.png (500x1450, 95K)

>turn on gaytracing
>your 2080Ti is now a 1050Ti
THANK YOU BASED NVIDIA
I WILL START CARRYING SOME WOODSCREWS IN MY LEATHER JACKET AS A SIGN OF MY UNWAVERING FAITH IN THE WAY IT'S MEANT TO BE PLAYED

>5ghz boost
gee i wonder why it's faster

Maxwell really isn't aging well.

i think it's just being limited by its memory in this game. look at how much faster the 390x is compared to the fury x

also 12gb titan xp > 11gb 1080 ti > 8gb 2080

>properly coded
>fury x is getting dumpstered by everything short of actual garbage
HAHAHAHAHAHAHAHAHAHAHA

Fury X's competition was the 980ti, and it's within 5% of it in that bench.

So you're saying that the 980 Ti, which the Fury X was a direct competitor to, is actual garbage? Given that it's only 2fps ahead and all. Which is even more embarrassing when you consider that the game clearly wants more than 4GB of VRAM and the 980 Ti has it, whilst the Fury X doesn't.

/thread

The lower clocked 2600 with its nearly 50% slower IMC beats the 4c/4t 8350k there.
And if people were smart enough to show 0.1%/1% minimums, it'd be an absolute BTFO on the stutterlake i3.

>woodscrews
I haven't thought about those in a looong time

Why did the Fury line age so badly? All three used to be above a 390, which was more or less the same as a 480. Hell, Fury X was benched vs the 1070 on Doom using Vulkan and it won. Now they are all below a 470?

Bad memory controller; unfinished arch.
Other GCN cards with GDDR5 and only 3GB of VRAM handled a game using 4GB of VRAM better than Fury did.

They fixed it with HBCC in Vega.

vega 56 is the gift that keeps on giving

Attached: 1460623005667.png (1127x685, 37K)

So even in the best case scenario for GCN 5 it still gets beaten by a Turing card with less raw performance and less bandwidth... that's really bad.

>$350 Vega 56 matches the $600 2070
>that's bad because the Vega card is more of a monster at general compute and not exactly optimized for children's games
Hmm.
>b-but you can get a 2070 with failed yields that doesn't perform as advertised for $100 cheaper!!!
The state of cope.

>R2700 pretty much as fast
OH NO NO NO

>a monster at general compute
If by monster you mean slower than a 1080ti and 2080ti.

You go on about prices but AMD's probably losing money on Vega because of how fucking big it is.

>390x literally shitting on 980ti

Attached: 1523213963380.gif (216x190, 1.2M)

$350? Where?

see: $350 plus it includes $180 of game preorders.
>4x the price:performance of the 2070

Why doesn't AMD throw money at studios to develop for their cards first? Even if it doesn't pan out at the high end, their low and mid range cards would perform better and have a better price/performance ratio. Winning the low and mid range markets is more feasible than winning the high end.

Leftists tend to be better at programming

Attached: 1541092274545.png (1281x802, 1.39M)

Attached: an.png (877x526, 50K)

>Winning the low and mid range markets is more feasible than the high range market.
That's exactly what AMD is doing. Which should be obvious from the fact that both the PS4 and Xbone run on AMD hardware nowadays.

As for Vega, the major issue is that actually utilizing its ridiculously good FP16 performance introduces a lot of code complexity. Not to mention FP16 simply isn't suitable for all workloads, which causes even more headaches when optimizing to avoid FP16->FP32 conversions. Nvidia simply provides better performance for the path of least resistance.
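The FP16 headache that post describes is easy to demonstrate even off the GPU. A minimal sketch, using NumPy's float16 as a stand-in for GPU half precision (an illustration of the precision and promotion pitfalls, not Vega's actual packed-math path):

```python
import numpy as np

# FP16 halves memory traffic (and doubles throughput on packed-math
# hardware like Vega), but only has ~3 decimal digits of precision,
# so code has to be restructured to tolerate the rounding.

a32 = np.float32(10000.0)
b32 = np.float32(0.1)
print(a32 + b32)          # ~10000.1, fine in FP32

a16 = np.float16(10000.0)
b16 = np.float16(0.1)
print(a16 + b16)          # 10000.0 -- the 0.1 is lost entirely,
                          # since the FP16 spacing near 10000 is 8.0

# Mixing precisions silently promotes back to the wider type,
# which is exactly the FP16->FP32 conversion overhead mentioned above:
mixed = a16 + b32
print(mixed.dtype)        # float32
```

The point being: to actually bank the FP16 speedup you have to keep every intermediate in half precision *and* prove the rounding is tolerable, which is the extra engineering work most studios won't do for one vendor's cards.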

makes sense based on the FP32 performance, but AMD can't just twiddle their thumbs hoping developers will put the effort in, because they don't have the marketshare to make it worth it.
So unless they make it way easier to optimize for AMD gpus or start actually selling some cards they're still gonna be behind in most games

Attached: 1537717781246.gif (288x377, 1.83M)

>AMD
>throw money
>AMD
>tonnes of cash reserves

Are you retarded?

GCN 5 in Vega has a raw performance advantage, but more often than not it doesn't translate into results, in gaming or in general compute; Turing is better "architecturally" in nearly every way possible.

Ah yes at the price of a 2nd hand car

Attached: Nvidia engineer.jpg (1200x900, 100K)

Wait For Naviā„¢

>tfw Vega 64
THANK YOU BASED AMD
THANK YOU BASED DICE

List of properly coded games?

*intel processors have no security patches applied, amd processors are fully patched. subtract 30% from intel performance to get the patched results.

isn't this card 3 years old? you play the newest triple-A game on ultra at ~60fps, seems decent to me

>Fury X's competition was 980ti
>980 Ti, which the Fury X was a direct competitor to
Wasn't the fury x $200+ more than the 980ti, and it competed against the titan of the time?

No. They were both around $600