THANK YOU BASED NVIDIA

hothardware.com/news/3dmark-variable-rate-shading-test-performance-gains-gpus

THANK YOU BASED NVIDIA

Attached: 3DMark-VRS-Test-Results-NVIDIA-gpus.png (708x356, 19K)

Other urls found in this thread:

nvidia.com/en-us/geforce/news/nvidia-adaptive-shading-a-deep-dive/
benchmarks.ul.com/hwc/tmp/3dmark-vrs-feature-test-screenshot-vrs-on.png
benchmarks.ul.com/hwc/tmp/3dmark-vrs-feature-test-screenshot-vrs-off.png
devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/
youtube.com/watch?v=dADD1I1ihSQ&t=442s
nvidia.com/content/dam/en-zz/Solutions/geforce/news/gamescom-2019-geforce-game-ready-driver/metro-exodus-nvidia-freestyle-sharpen-filter-comparison.png

>Turing only

>WAAAH HOW CAN TECHNOLOGY PROGRESS ITS NOT FAIR

poorfag

So what? We can expect to see this in games in, like, 2 years? 3?

It's already in at least one game.

>variable-rate-shading
so games can look even worse now? yaaay!

which is?

Wolfenstein 2

The two newest Wolfenstein games have it.

Attached: BOOM.gif (290x326, 1.75M)

nvidia.com/en-us/geforce/news/nvidia-adaptive-shading-a-deep-dive/

Nvidia Turing supports the more advanced Tier 2 of VRS in hardware, while 3DMark's test currently only exercises the inferior Tier 1; Tier 2 support is coming in the future
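For anyone who doesn't know what VRS actually does: Tier 1 lets you set one coarse shading rate per draw, Tier 2 adds per-tile control via a screen-space shading-rate image. Here's a toy Python sketch of the per-draw case; the `shade`/`render` names are made up for illustration and this is not any real API:

```python
# Toy model of variable-rate shading: instead of running the pixel
# shader once per pixel (1x1 rate), run it once per NxN block and
# broadcast the result, trading detail for fewer shader invocations.

def shade(x, y):
    # Stand-in pixel shader: some deterministic per-pixel value.
    return (x * 31 + y * 17) % 256

def render(width, height, rate=1):
    """Render with a uniform shading rate (1 = full, 2 = 2x2 coarse)."""
    invocations = 0
    image = [[0] * width for _ in range(height)]
    for by in range(0, height, rate):
        for bx in range(0, width, rate):
            color = shade(bx, by)              # one shader run per block
            invocations += 1
            for y in range(by, min(by + rate, height)):
                for x in range(bx, min(bx + rate, width)):
                    image[y][x] = color        # broadcast across the block
    return image, invocations

full, n_full = render(8, 8, rate=1)       # 64 shader invocations
coarse, n_coarse = render(8, 8, rate=2)   # 16 shader invocations, 4x fewer
```

The image loses detail inside each 2x2 block, but the shader runs 4x less often, which is where the benchmark gains come from.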

4k+ displays are popular now and there's no good reason to shade every pixel at native resolution on them. It's a worthwhile image-quality trade-off.

Sure looks like MASSIVE gains to me.

Attached: Wolfenstein.png (3973x2234, 594K)

Are you retarded? VRS is not turned on by AYYMD Unboxed

>textures look worse
>even worse than bad mipmaps
>game runs faster
wew lad, who could have thunk.
also wasn't ayymd supposed to be able to support this too? what's going on with that?

>benchmarks.ul.com/hwc/tmp/3dmark-vrs-feature-test-screenshot-vrs-on.png
vs
>benchmarks.ul.com/hwc/tmp/3dmark-vrs-feature-test-screenshot-vrs-off.png
jesus christ, lmao

Attached: this is not zoomed in.png (278x90, 37K)

DLSS already looks like shit, now this

truly the nvidia experience

Attached: 1438059713600.webm (350x350, 611K)

yikes

>reduce quality
>it works faster
wow

Just because it looks worse, doesn't mean it's not superior.

VRS, temporal AA and checkerboard rendering are the future. We'll never achieve 4K gaming without abandoning this "sharpness" autism.

Attached: amy_annoyed2.png (241x267, 66K)
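Since checkerboard rendering keeps coming up: each frame shades only half the pixels in a checkerboard pattern and fills in the other half from the previous frame. A toy Python sketch of the idea (names are illustrative, and it skips the motion reprojection real implementations need):

```python
# Toy checkerboard rendering: each frame shades one of two alternating
# checkerboard fields and reuses the previous frame for the other,
# halving the per-frame shading cost.

def shade(x, y, t):
    return (x + 2 * y + t) % 256  # stand-in shader, varies over time

def checkerboard_frame(width, height, t, previous=None):
    frame = [[None] * width for _ in range(height)]
    shaded = 0
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == t % 2:      # this field is shaded this frame
                frame[y][x] = shade(x, y, t)
                shaded += 1
            elif previous is not None:     # other field: reuse last frame
                frame[y][x] = previous[y][x]
            else:
                frame[y][x] = 0            # first frame has no history
    return frame, shaded

f0, n0 = checkerboard_frame(4, 4, 0)
f1, n1 = checkerboard_frame(4, 4, 1, previous=f0)
# Each frame shades only 8 of 16 pixels; f1 is still fully populated.
```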

are you blind? that shit looks awful. if you think this is the future then maybe you should go buy a ps4.

So it's DLSS again, but even worse. Thank you so much, leather jacket man.

Holy puke, Batman! So much detail loss and so many jaggies.

They got caught cheating in benchmarks all the time, and now nvidia brings that tech to the mainstream. Such innovation.

AYYMD is the one caught cheating and reducing image quality to win benchmarks

FP16 to FP10 demotion is a very blatant cheat
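For context on what FP16-to-FP10 demotion means: the 10-bit floats used in R11G11B10-style render targets keep only 5 mantissa bits versus FP16's 10, so colors get quantized much more coarsely. A rough Python illustration of the precision loss (simplified: it ignores subnormals, sign handling and exponent-range clamping):

```python
import math

# Quantize a positive float to a given number of explicit mantissa
# bits, mimicking the precision of small GPU float formats.

def quantize_mantissa(value, bits):
    """Round a positive float to `bits` explicit mantissa bits."""
    if value == 0.0:
        return 0.0
    mantissa, exponent = math.frexp(value)   # value = mantissa * 2**exponent
    scale = 1 << bits
    return math.ldexp(round(mantissa * scale) / scale, exponent)

fp16 = lambda v: quantize_mantissa(v, 10)   # half-precision mantissa
fp10 = lambda v: quantize_mantissa(v, 5)    # demoted 10-bit float

color = 0.7123
# fp10 has only ~32 steps per power of two, fp16 has ~1024, so the
# demoted value lands much further from the original color:
err16 = abs(fp16(color) - color)
err10 = abs(fp10(color) - color)
```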

Next to RIS, nvidia really is running out of magic tricks. First it was gay tracing, then a dogshit smear filter. pic related

Attached: 1548598224638.jpg (1920x1080, 1.16M)

devblogs.microsoft.com/directx/variable-rate-shading-a-scalpel-in-a-world-of-sledgehammers/

AYYMDPOORFAGS don't understand the technology and try to spin it as usual since they're losing in performance. It can't get any more pathetic than that, but that's the life of a perpetual loser for you

>Reducing quality to save on costs while having to sit on your ass raking in the shekels
and that's a GOOD thing

see
it looks like dogshit, and you know full well this is a haphazardly rushed response to RIS, which is butchering the entire 2070/2080 lineup in every game right now.

And AMD tried to promote it with Polaris first, but lacked hardware support

Wrong, that is Tier 1 rendering

Nvidia GPUs support the more advanced Tier 2 rendering, as mentioned in the DirectX blog post, but you AYYMDPOORFAGS don't read, as usual

And even Microsoft agrees you can't tell the difference

It is not even a response to sharpening, are you retarded?

It's totally different technology

See what I mean? AYYMDPOORFAGS don't understand GPU technology at all

Damn, that sharpened one looks nice. Some spots look even better than native 4k.
How can DLSS even compete?

It doesn't look awful compared to rendering at a lower res.

Pic related. You can't possibly believe the 720p version looks better.

Attached: future.gif (928x530, 672K)

good bait

By ripping off AMD of course.
youtube.com/watch?v=dADD1I1ihSQ&t=442s

>cutting corners and butchering quality is suddenly now innovation
>but having a SW-like HW accelerated CAS filter with state of the art scalers comparable to SW ones that gives you ~1440p FPS at fucking 4K res WITH HDR retention is NOT
>"b-buh microshart was paid to say: it's all good guys, go home ha-hhha"
lmao

AYYMD Unboxed doesn't know shit, and it really shows

Looks like DoF off/on

Prove it

Had an nvidia driver post up earlier / now hard-defending this shit tech
The Nvidia shill isn't even trying to hide it anymore

>BUTTMAD AYYMDPOORFAGS WITH NO VRS GPU DETECTED

ENJOY YOUR INFERIOR PERFORMANCE AND INFERIOR SHARPENING

Let's not forget that this feature is a DirectX 12 EXCLUSIVE.

Sorry, Vulkancucks. Between this and raytracing, it looks like Microsoft is going to reign supreme for at least another generation. Looks like the Linux gaming meme is back to square one.

Attached: velma_laugh.png (469x498, 206K)

nvidia.com/content/dam/en-zz/Solutions/geforce/news/gamescom-2019-geforce-game-ready-driver/metro-exodus-nvidia-freestyle-sharpen-filter-comparison.png


Are you blind? VRS is implemented in the Wolfenstein games, which only run on Vulkan

nvidia.com/en-us/geforce/news/nvidia-adaptive-shading-a-deep-dive/

Good, we don't want that shit. We'll take low-level APIs like DX12/Vulkan that boost FPS without cutting corners any day of the week.

VRS will only work properly on a fucking VR headset with eye tracking anyway.

Attached: Screenshot_20190720011802_Firefox.jpg (1080x1423, 460K)
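That's the foveated-rendering argument: with eye tracking you can build a Tier-2-style shading-rate map that's full rate at the gaze point and coarse in the periphery, where the eye resolves less detail. A toy sketch of such a map (tile size and distance thresholds are made up for illustration):

```python
# Toy gaze-driven shading-rate map: full shading rate near the gaze
# point, coarser rates further out. Rate N means shading once per
# NxN pixel block in that screen tile.

def shading_rate_map(tiles_x, tiles_y, gaze_x, gaze_y):
    """Return a per-tile rate: 1 (1x1), 2 (2x2) or 4 (4x4)."""
    rates = []
    for ty in range(tiles_y):
        row = []
        for tx in range(tiles_x):
            # Chebyshev distance from this tile to the gaze tile
            dist = max(abs(tx - gaze_x), abs(ty - gaze_y))
            if dist <= 1:
                row.append(1)   # fovea: full shading rate
            elif dist <= 3:
                row.append(2)   # near periphery: quarter the work
            else:
                row.append(4)   # far periphery: 1/16th the work
        rates.append(row)
    return rates

rates = shading_rate_map(8, 8, gaze_x=4, gaze_y=4)
# The tile under the gaze is full rate; corner tiles are coarse.
```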

It's always fun to see an Nvidiot shit their pants and run away crying when presented with an actual technical argument. Really characterises their userbase nicely.

Attached: Nvidiotâ„¢.jpg (512x800, 218K)

See: >a new graphics feature available only on DirectX 12
>only on DirectX 12

It doesn't matter if some indie game implemented it using Vulkan. Real game engines will rely on industry-standard APIs like DX12.

Attached: tails_crush.png (1234x1070, 868K)

>AYYMD INVENTED SHARPEN FILTERS

Amdrones everyone

See
Basically they took a thing that could normally only be done well on the CPU and ported it over to the GPU with minimal quality loss. It's like if NVENC had the same quality as the slow preset of a 10-bit HEVC software encoder (it doesn't; it's more like the ultrafast preset).

>Just because it looks worse, doesn't mean it's not superior.

Attached: 1326996041993.jpg (576x435, 18K)

/co/mblr are brainless

That's exactly what Nvidia's variant is

>We

Big yikes and filter-pilled

It's a good sharpening filter, but it's literally a pixel shader and there's nothing hardware-specific about it. You could run it just as fast on any GPU. Slightly worse GPU sharpening filters have been around for 10 years.
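To illustrate that point: a sharpen filter is just a tiny per-pixel kernel. Below is the classic 3x3 sharpen convolution in Python. Real CAS/RIS is adaptive (it scales strength by local contrast), so this is only the non-adaptive baseline, but the per-pixel structure is this simple:

```python
# Classic 3x3 sharpen kernel: identity plus a Laplacian edge boost.
# The weights sum to 1, so flat regions pass through unchanged while
# edges get their local contrast amplified.

KERNEL = [[ 0, -1,  0],
          [-1,  5, -1],
          [ 0, -1,  0]]

def sharpen(image):
    """Apply the sharpen kernel to a 2-D grayscale image in [0, 1]."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky in (-1, 0, 1):
                for kx in (-1, 0, 1):
                    # clamp-to-edge sampling at the borders
                    sy = min(max(y + ky, 0), h - 1)
                    sx = min(max(x + kx, 0), w - 1)
                    acc += KERNEL[ky + 1][kx + 1] * image[sy][sx]
            out[y][x] = min(max(acc, 0.0), 1.0)  # clamp to [0, 1]
    return out

flat = [[0.5] * 4 for _ in range(4)]
# A flat 0.5 image comes back unchanged; a hard edge gets pushed
# toward 0 on the dark side and 1 on the bright side.
```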

>Just because it looks worse, doesn't mean it's not superior.

Attached: 1558529119230.gif (379x387, 71K)

Right, but this has near-SW-quality scalers (i.e. better than bilinear AND bicubic), which is really important if you're scaling at awkward ratios like a 78% zoom. You can't get that kind of quality if you just do fast approximate bilinear scaling, or even bicubic (neither handles a 78% zoom well).

That's why RIS quality looks so close to native 4K.
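To make the scaler comparison concrete: the baseline everyone is comparing against is linear interpolation, which works at any ratio but only blends the 2 nearest samples per axis; bicubic and Lanczos use wider kernels, which is what matters at awkward non-integer ratios like 78%. A minimal 1-D linear resampler in Python, purely for illustration:

```python
# 1-D linear-interpolation resampling at an arbitrary ratio: each
# output sample maps back to a fractional source position and blends
# the two nearest input samples. This is the per-axis building block
# of bilinear image scaling.

def resample_linear(samples, new_len):
    """Resize a 1-D signal to new_len samples by linear interpolation."""
    old_len = len(samples)
    out = []
    for i in range(new_len):
        # map the output index back into the source, edges aligned
        pos = i * (old_len - 1) / (new_len - 1) if new_len > 1 else 0
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

ramp = [0.0, 1.0, 2.0, 3.0]          # a linear ramp survives exactly
shrunk = resample_linear(ramp, 3)    # non-integer "78%-style" ratio
```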

checkerboard is already obsolete
get on the bus, grandpa

>AMD unboxed

The Fox of Wet Blankets enters the thread.

Instead of participating in thread discussion, this fox just wants to point out that you're currently at the edge of the board and about to die. =^_^=

Attached: Laura.png (189x274, 75K)