Remember when Jow Forums hated the 5700 XT and said navi wasn't a new arch? Those were the days... *sip*

Attached: 1530993406100.png (380x349, 77K)

ok

Indeed, I have NEVER seen noVidya get curb stomped so hard before. RIS is one hell of a trick up AMD's sleeve.

Attached: Screenshot_20190720011802_Firefox.jpg (1080x1423, 460K)

thanks for the bump, kid

Attached: 1535921637763.gif (281x215, 19K)

this can't be right lol

It's not just BFV, very popular racing games like Forza Horizon 4 take real advantage of the new RDNA architecture too. Once you move to DX12 benchmarks it even consistently keeps up with the RTX 2070 Super.

BUT that's absolutely nothing, RIS completely crushes nvidia by allowing 1080p frame rates on a 1440p display with 90-95% of the image quality of rendering the game natively at 1440p.

The newest Navi 23 is meant to absolutely decimate the RTX 2080 Ti Super (is that out yet?) in 4K via RIS (i.e. 1440p frame rates on a 4K display).

Attached: 0b408abd-6469-4bc1-8762-d30f38ad6fe1.png (757x605, 36K)

?

There's no RTX 2080 Ti Super that I know of. That would kill off the Titan as well

>$400 beating $1200 card
Nvidia literally can't compete!

It’s almost as if AMD knows what it’s doing.

Uh oh

Attached: relativeperformance_38402160.png (500x1050, 56K)

Attached: relativeperformance_25601440.png (500x1050, 55K)

It is right. The game doesn't use Nvidia GameWorks, nor is it built on Nvidia-partnered Unreal Engine with GameWorks technology baked in.

THAT'S why the 5700 XT is such a massive anomaly. The 1080p performance.

It's the PERFECT 1440p card because of RIS, and it outperforms even a 2080 Ti at 1440p in every game because of it.

Attached: relative-performance_1920-1080 (1).png (500x970, 51K)

this

t. getting one soon

>i plan to use image sharpening to upscale 1080 to 1440p
Retarded

performance on GTX 1070
makes me cry every time

RIS is actually one of the most interesting HW-accelerated CAS (contrast adaptive sharpening) scalers implemented to date. 90-95% of the original image quality is actually pretty palatable. Especially compared to the dogshit smear filter (i.e. DLSS) Nvidia has on like 3 games.

Attached: 1548598224638.jpg (1920x1080, 1.16M)
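
For anyone curious what "contrast adaptive sharpening" actually means, here's a rough NumPy sketch of the idea behind RIS / FidelityFX CAS. This is NOT AMD's actual shader (the real one works on a full 3x3 ring in HLSL with tuned constants), just the core trick: measure local contrast via a min/max neighborhood, then blend in a negative ring weight that backs off near high-contrast edges so it doesn't ring.

```python
import numpy as np

def cas_sharpen(img, sharpness=0.8):
    """Contrast adaptive sharpening sketch (grayscale, values in [0, 1]).

    For each pixel, look at the 4-neighbor cross, measure local contrast
    via min/max, and apply a sharpening weight that is strongest where
    contrast is low, so hard edges don't get overshoot halos.
    """
    p = np.pad(img, 1, mode="edge")
    c = p[1:-1, 1:-1]                     # center
    n, s = p[:-2, 1:-1], p[2:, 1:-1]      # north / south neighbors
    w, e = p[1:-1, :-2], p[1:-1, 2:]      # west / east neighbors

    mn = np.minimum.reduce([c, n, s, w, e])
    mx = np.maximum.reduce([c, n, s, w, e])

    # headroom to black/white limits how hard we can sharpen; eps avoids /0
    eps = 1e-5
    amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / (mx + eps), 0.0, 1.0))

    # developer-tunable sharpness maps to a negative neighbor weight
    peak = -1.0 / (8.0 - 3.0 * sharpness)
    wgt = amp * peak

    # weighted sum of center + ring, renormalized
    out = (c + wgt * (n + s + w + e)) / (1.0 + 4.0 * wgt)
    return np.clip(out, 0.0, 1.0)
```

Flat regions pass through untouched (the ring weight cancels in the renormalization), which is exactly why CAS doesn't turn gradients into noise the way naive unsharp masking does.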

släšš gee

This is all at higher resolutions, though. That user is right. 78% of 4k will look like native 4k with a great sharpening filter like RIS. However, 50% of 4k still looks like dogshit, which is what DLSS is running at.

When you lower the resolution, the ability to upscale decreases. 1440p is ~78% more pixels than 1080p. You fucking cannot scale that high with decent image quality, even at 4k.

Right, which is what makes RIS so state of the art. 78% scale of 4K is 2996x1685 (~5.0 mil pix), while native 4K (8.3 mil pix) is ~64% more pixels. Coincidentally, 78% scale of 1440p (3.7 mil pix native) comes out very close to 1080p (2.1 mil pix). Though that means about 10% more pixels, any quality loss can be offset by just cranking the scale to 83% and taking ~10% lower FPS. It's not known exactly how well RIS can fill in missing information, but at least according to most tests it can work at 75-80% scale and keep most details intact.
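
Back-of-envelope check on those numbers (assuming both axes are scaled by the same per-axis factor, which is how the driver does it):

```python
def pixels(w, h, scale=1.0):
    # total rendered pixels at a given per-axis render scale
    return int(w * scale) * int(h * scale)

native_4k  = pixels(3840, 2160)         # 8,294,400
ris_4k     = pixels(3840, 2160, 0.78)   # ~5.04 million (2995x1684)
native_qhd = pixels(2560, 1440)         # 3,686,400
ris_qhd    = pixels(2560, 1440, 0.78)   # ~2.24 million
native_fhd = pixels(1920, 1080)         # 2,073,600

print(f"native 4K vs 78% 4K:   {native_4k / ris_4k:.2f}x")    # ~1.64x, i.e. ~64% more pixels
print(f"78% QHD vs native FHD: {ris_qhd / native_fhd:.2f}x")  # ~1.08x, i.e. ~8% more pixels
```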

No HW-accelerated CAS filter that adapts between resolutions THIS seamlessly has ever been made before. That's what makes it special, plus the fact that you can use it on virtually any game out there.

You missed the point of the higher resolutions. 78% of 4k will look a hell of a lot closer to 4k than 78% of 1440p will look like 1440p. There's just so much more of the information you need in those millions of extra pixels. Do not expect good results if upscaling to anything less than 4k.

And no, 78% of 1440p is nowhere close to how low 1080p is. 1080p is only 56% of 1440p's pixels. The difference is greater than you think.

But 78% scale of 4K is only ~61% of 4K's pixels, so the difference isn't that big, and increasing the scale to 83% and losing 10% performance isn't that much of a sacrifice anyway. Still better than rendering natively at 1440p.
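
The thing everyone in this argument keeps tripping over: the per-axis render scale gets SQUARED when you count total pixels, so the quoted percentages aren't the pixel fractions.

```python
# per-axis render scale vs. fraction of native pixels actually rendered
for s in (0.78, 0.83):
    print(f"{s:.0%} per-axis scale -> {s * s:.1%} of native pixels")
# 78% per-axis scale -> 60.8% of native pixels
# 83% per-axis scale -> 68.9% of native pixels
```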