DLSS = dumb larp super smudge

youtu.be/3DOGA2_GETQ
So you're either locked to 1080p, 1440p, or 4K with DLSS/DXR forced on
Smudges the fuck out of the image, looks worse than TXAA + SMAA + FXAA, and even worse than the worst AA ever in FF15
Lol nvidiots waited 6 months for absolutely useless garbage.
Granted, RTX AO looks interesting, and real-time RT/DX12 DXR on consumer cards is impressive on a technical level, but it's just absurd in real applications, especially games.
Fucking bizarro, this is Nvidia's biggest blunderbunglebuggerballsup I've seen, and I saw the GeForce 3, 4, 5 and Fermi fiascos

Attached: Screenshot_2019-02-19-19-32-10-33.png (1920x1080, 646K)

You seem poor. You can’t afford an RTX card like I can.

GTX 1080, I don't need anything more for 2K-res simulators and the occasional FPS

It's amazing that Nvidia's advanced AI and supercomputer manage to deliver an image that's significantly worse than just rendering at a lower resolution and upscaling. Truly a level of incompetence that AMD would be proud of.

This gen is such a letdown. 1300 euro for a 30% increase in performance vs my 1080 Ti, which I got for a still-expensive 750. If AMD could get their shit together we might have some actual competition again

>nvidia fucks up
>its amds fault
Kek

Lmao

You don't want competition, you want nvidia to drop prices so you can buy nvidia

Eh, I'm fine with the 1080 I got after launch for 500 USD, it's aging fine, I don't really go over 2K resolution

Competition is about more than just price; feature innovation is important. Sometimes you get gimmicks, other times you get a game changer. Early adopter costs are high but the tech filters down. The desktop GPU segment is making glacial progress compared to smartphones. Apple, Samsung, and Qualcomm are pushing the envelope. Wtf is Nvidia doing

I've been on 1440p 144Hz for 3 years. Where the fuck is my 4K 120Hz big-screen gaming display they've been wheeling out at every convention for the last 2 years?

> fermi
Oh no, it ran hot. It was still the best, dickweed.

>than just rendering at a lower resolution and upscaling
That's literally what it's doing, though. They're rendering at a lower resolution and using the tensor cores to run a neural net upscaler. It's quite astonishing, in a bad way, that they managed to ship a neural net upscaler that produces worse image quality than traditional, old upscalers.
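To put that in concrete terms, here's a rough sketch of the general "render low, upscale" idea in Python. To be clear, this is not DLSS or anything Nvidia ships: the TinyUpscaler model, its layer sizes, and the frame sizes are all made up for illustration, and the weights are untrained, so its output is junk until you train it on real low-res/high-res frame pairs.

# Toy illustration of the two approaches described above: render a frame at a
# lower resolution, then either (a) upscale it with a classic filter (bicubic)
# or (b) push it through a small convolutional "super-resolution" network.
# NOT DLSS -- an untrained SRCNN-style sketch of the general idea only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Hypothetical toy net: bicubic upsample first, then a learned correction."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.scale = scale
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=9, padding=4),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 16, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 3, kernel_size=5, padding=2),
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        # Classic upscale to the target size, then add a learned residual.
        base = F.interpolate(low_res, scale_factor=self.scale,
                             mode="bicubic", align_corners=False)
        return (base + self.body(base)).clamp(0.0, 1.0)

if __name__ == "__main__":
    # Stand-in for a frame rendered below native res (tiny here for speed).
    # Values in [0, 1], NCHW layout.
    low_res_frame = torch.rand(1, 3, 270, 480)

    # (a) Traditional upscaler: plain bicubic interpolation.
    bicubic = F.interpolate(low_res_frame, scale_factor=2,
                            mode="bicubic", align_corners=False)

    # (b) Neural upscaler. With random, untrained weights the output is junk;
    # the entire value of a DLSS-like approach lives in the trained weights.
    model = TinyUpscaler(scale=2).eval()
    with torch.no_grad():
        neural = model(low_res_frame)

    print("bicubic:", tuple(bicubic.shape), "neural:", tuple(neural.shape))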

I've been stuck on a 27" 1440p 120hz IPS panel for 8 years now...welcome to how long it takes for the computer display industry to change, enjoy your long stay...

And it was late to market, overpriced, and replaced by the 580 (the card the 480 should have been) in a very short time period.

Bugger 4K 120Hz when a 1399.99 USD GPU can't hold a 60fps minimum in last year's games desu
Where's my fucking JOLED, XLED, quantum dot new panel tech? I'm sick of the same TN/IPS/VA garbage for the last 20 years
The best at sucking arse, just like Turing reference cards
I'm stuck on a 29" ultrawide 2560x1080 75Hz panel from 8 years ago, I know your pain.
They are still selling the same screen too, reee

>just wait for the 580
>now just wait for the 680
>now just wait for the 780
>whoops now just wait for the 980

And so it goes on. I could wait forever but instead I did the 480 -> 480 SLI -> 780 Ti -> 1080 route. I'll skip this generation and probably jump on the 3080 Ti or equiv.

Maxwell and Kepler sucked.
Pascal was meant to be a stopgap to tide gamers over but was too damn good.
Wish we'd just get Turing on 7nm without the RTX/RT shit that's nowhere near ready and is wasting precious die space, yields, and $$$
Nvidia may take it out like they did with Fermi's hardware-based async compute scheduler

I can't even understand what you're trying to say, it's like an AI threw together a bunch of /v/edditor shitposts.

Really deeps my thinks

>That's literally what it's doing, though. They're rendering at a lower resolution and using the tensor cores to run a neural net upscaler. It's quite astonishing, in a bad way, that they managed to ship a neural net upscaler that produces worse image quality than traditional, old upscalers.
Did they even think the tensor cores were a good idea for graphics in the first place? Or was it a case of them wanting tensor cores for commercial AI stuff on the same GPU dies and then trying to come up with a way to sell them to consumers for games? Obviously they would have sacrificed graphics performance by spending transistors on hardware that ordinary rendering can't use.
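For reference, the primitive a tensor core accelerates is a small fused matrix multiply-accumulate (Nvidia documents the FP16 mode as D = A x B + C on 4x4 tiles with FP32 accumulation), which maps straight onto neural-net inference and only awkwardly onto shading. A rough numpy sketch of that operation, purely as an illustration of the math and obviously nothing like the actual silicon:

# Rough numpy sketch of the primitive a Turing-style tensor core accelerates:
# a fused matrix multiply-accumulate, D = A @ B + C, with half-precision
# inputs and single-precision accumulation.
import numpy as np

def tensor_core_mma(a_fp16: np.ndarray, b_fp16: np.ndarray,
                    c_fp32: np.ndarray) -> np.ndarray:
    """One 4x4 FP16 multiply-accumulate tile with an FP32 accumulator."""
    assert a_fp16.shape == b_fp16.shape == c_fp32.shape == (4, 4)
    # Half-precision inputs, widened before the multiply so the
    # accumulation happens at single precision.
    return a_fp16.astype(np.float32) @ b_fp16.astype(np.float32) + c_fp32

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.standard_normal((4, 4)).astype(np.float16)
    b = rng.standard_normal((4, 4)).astype(np.float16)
    c = np.zeros((4, 4), dtype=np.float32)
    print(tensor_core_mma(a, b, c))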

They wasted 25% of the Turing die on it, so yes, they did sacrifice quite a lot
Doesn't matter when AMD is flapping in the wind with its ancient, front-end-bottlenecked GCN shit

I'm running RE2 on ultra at 4K, 60fps steady, no problem