What a fucking joke DLSS is. Even 1440p looks better while getting better performance than 4k DLSS

What a fucking joke DLSS is. Even 1440p looks better while getting better performance than 4k DLSS.
How does Nvidia get away with this shit? This should be false advertisement.

Attached: file.png (1280x720, 1.37M)

Port Royal is a nice example of a good DLSS implementation, but it's also probably a best-case scenario for a machine learning upscaling algorithm: it's the same frame displayed every time.
I really hope Nvidia can deliver a better experience with DLSS very soon; at the moment it's absolutely useless and quite rightly gets taken apart by reviewers.
I hope we get more info from Nvidia soon, every day without improvement just makes it worse for them!

Attached: ahhhhhhhh.jpg (600x776, 114K)

Not with this hardware. Like you also just mentioned, it's only useful for something that runs the same frames over and over, not games.

it jewst werks

Wtf are you smoking, the right one looks way better

It honestly looks worse, like someone jizzed all over the battlefield and used that to polish everything.

You also have to realize that not only does scaled 1440p look better and run faster, but 1600p looks much, much better still and runs exactly as well as 4K with DLSS.
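Rough pixel-count math behind that claim, as a quick Python sketch. The assumption that "4K DLSS" renders internally at 1440p comes from what reviewers reported for BF V, not from an official Nvidia spec:

# Back-of-the-envelope pixel counts; the 1440p internal render for "4K DLSS"
# is an assumption based on reviewer reports, not an Nvidia spec sheet.
native_4k = 3840 * 2160
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,   # ~3.7 MP, roughly what "4K DLSS" reportedly renders
    "1600p (2560x1600)": 2560 * 1600,   # ~4.1 MP
    "4K    (3840x2160)": native_4k,     # ~8.3 MP, the DLSS output resolution
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / native_4k:.0%} of native 4K")

1600p is only about 11% more pixels than the 1440p internal render, which is why it can match "4K DLSS" framerates while actually rendering every pixel.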

You spent thousands of dollars on a 4K monitor and top-end silicon for this?
Computers are for work. If you want to play video games, buy an old CRT and a modded Wii and play on that; you'll have free games for your entire life and more (actual games, not the unfinished feminist bullshit they sell today).

Looks the same desu

It really doesn't dude.

Here's a good example of why 4K still isn't worth buying into. On top of that, Nvidia's latest cards are all first gen of some new tech, which is also something to always avoid.

>Nvidia's latest cards are all first gen
So, let me get this straight. The power rankings should go like this after Geforce 30?

GTX 780 Ti - GTX 980 - GTX 1060
GTX 980 Ti - GTX 1070
GTX 1070 Ti - RTX 2060
GTX 1080 - RTX 2070
GTX 1080 Ti - RTX 2080 - RTX 3060
RTX 3070
RTX 2080 Ti - RTX 3080
RTX 3080 Ti

>Gaymz

I can't make a definitive decision based on a screenshot; maybe in person it's different.

How am I supposed to compare the differences on a 1080p monitor?

Wow, that looks like someone smeared the textures. Good fuck, Nvidia, you ok?
The 3070 would be at the 2080 level, one tier below where you put it, at least for GAYMIN. I couldn't speak about other areas of performance, but I think it would be close for those too.

DLNS, not DLSS. There's no fucking super sampling involved in any of this. It's fucking upscaling, not rendering something at a higher resolution. DLNS at 1440p with what is basically TAA is what Nvidia's marketing calls "DLSS". It's all normal sampling at 1440p, and then tensor cores inference a projected "4K" frame from a single trained model. The fact that the final frame loses HUGE AMOUNTS OF DATA compared to the original frame is proof that it's a shit implementation and a pure marketing effort to trick dipshits into buying RTX GPUs.
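For what it's worth, here is a minimal Python/PyTorch sketch of the pipeline that post describes: sample normally at 1440p, then let a trained network infer the remaining pixels up to 2160p. The ToyUpscaler below is a made-up stand-in for illustration, not Nvidia's actual DLSS model:

# Toy sketch of inference-based upscaling: render 1440p, infer "4K".
# This network is hypothetical; it only illustrates the shape of the pipeline.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyUpscaler(nn.Module):
    def __init__(self, scale=1.5):               # 2560x1440 -> 3840x2160 is a 1.5x scale
        super().__init__()
        self.scale = scale
        self.refine = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, frame_1440p):
        # Plain interpolation up to the target size...
        up = F.interpolate(frame_1440p, scale_factor=self.scale,
                           mode="bilinear", align_corners=False)
        # ...then a learned residual guesses detail that was never rendered.
        return up + self.refine(up)

frame = torch.rand(1, 3, 1440, 2560)             # one rendered 1440p frame (N, C, H, W)
fake_4k = ToyUpscaler()(frame)                   # inferred 3840x2160 output
print(fake_4k.shape)                             # torch.Size([1, 3, 2160, 3840])

Whatever detail the network adds is inferred, not sampled, which is the post's point: the true 4K signal never exists anywhere in the pipeline.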

you can clearly see in this image that the nvidia dlss is making the graphics way better. you're honestly blind if you think 1440p is better than 4k lmao. get a fucking life and stop fanboying on the internet kid

I'll always be a Nvidia fanboy, I'm just disappointed and slightly mad that I was promised more.

How about.... no?

yeah

>every day without improvement just makes it worse for them!
And better for us. Nvidia shitting the bed and having low sales can only lead to better competition in the gpu market.

>the 3070 would be at the 2080 level
You are probably right. The 2060 is unusually powerful for a card of its tier, so they might make the next iteration proportionally weaker.

The 2060 is really throwing its weight around, I think nvidia got a little scared.

Why not play native 4k instead, maybe with some fxaa if the game has grass.

wtf, I need a new GPU and both NVIDIA and AMD offerings suck. WTF HAPPENED.

Maybe Intel can make them scared again, and then they won't go easy on 7nm.

wtf is dlss? they look the fucking same. does dlss perform better?

"no"

>doesn't have an fps number in either picture

fucking retard

This man speaks the uncensored truth.

Still better than Radeon VII

why has no one mentioned the fact that we are comparing 1440p and 4k on a 720p screenshot? How is this representative of anything?

We aren't comparing the picture; it's just there as a topic banner. You're free to look up your own comparisons; this thread isn't here to educate you on the matter but to discuss it.

DLSS 2x is going to be the future of AA

>DLSS 2x
"4k" upscaled from 720p?

plenty of people in this thread are using the image in question as proof.

Someone actually link a proper image then. What's the point of a thread without the actual image in question to compare?

Native resolution with DLSS. No upscaling

This should be where the image is from.

youtube.com/watch?v=3DOGA2_GETQ

Attached: 1533045066716.png (1087x979, 200K)

why is this guy's highest res still 1080p?

4K render
8K ground truth

You can still clearly tell that one is less detailed than the other. Basically like supersampling at 1080p.

30XX will be on 7nm, so more potential for improvement

Buy a 1080 Ti or Vega 64 (and undervolt + overclock), whichever you can find cheaper. There's still a shitton of them under warranty until 2020-2021.

Of course, but the chip on the 2080 Ti is so huge they will most likely shrink it first rather than take it to its limits.

why can't /v/ stay in their own shit board?

>The 2060 is really throwing its weight around, I think nvidia got a little scared.
We're probably getting close to the limit of what raster graphics can do; either graphics stagnate, removing the need for high-end GPUs, or fully ray traced/path traced engines take over. You don't even need a GPU for those engines, just spam enough CPU cores on a chip and you're good; x86, ARM or RISC-V, it doesn't really matter which. Nvidia loses its monopoly here, since the patents they use to prevent competition relate to raster-based GPUs. Raytraced engines just want as many CPU cores as they can get; they don't require specific graphics hardware.

You see raytracing on GPUs at the moment because they are somewhat better suited to it than the average PC CPU: they don't need cache coherency or strong cores. But GPUs contain a lot of stuff other than the compute sections, so they will lose at raytracing against CPUs optimized for raytracing.
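A toy Python sketch of why that scales with plain CPU cores: every pixel is an independent ray, so the frame splits cleanly across however many processes you have. The single-sphere "scene" below is made up for illustration, not a real engine:

# Embarrassingly parallel ray casting: each pixel is traced independently,
# so a process pool spreads the work across every available CPU core.
from multiprocessing import Pool

WIDTH, HEIGHT = 640, 360

def trace_pixel(args):
    x, y = args
    # Map the pixel to a ray direction (dx, dy, 1) from origin (0, 0, 0)
    # and test it against a unit sphere centred at (0, 0, 3).
    dx = (x / WIDTH) * 2 - 1
    dy = (y / HEIGHT) * 2 - 1
    a = dx * dx + dy * dy + 1.0           # d . d
    b = -6.0                              # 2 * d . (origin - centre)
    c = 8.0                               # |origin - centre|^2 - radius^2
    disc = b * b - 4 * a * c
    return 255 if disc >= 0 else 0        # white where the ray hits the sphere

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with Pool() as pool:                  # one worker per core by default
        image = pool.map(trace_pixel, pixels, chunksize=WIDTH)
    print(sum(p == 255 for p in image), "hit pixels out of", len(image))

Nothing in that loop needs cache coherency between pixels or any graphics-specific hardware, which is the argument the post is making.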

>full ray traced/path traced engines take over
Would you like your frames by mail order or on an hourly subscription?

Huang really overpromised on this one.
The keynote painted it as a free 2x performance boost that even has better quality than the same resolution in native mode.
Either they completely failed at neural network image filtering design, or they fucking lied shamelessly, or both.

Attached: the way toothless guys are meant to laugh at you.png (942x359, 88K)

>a good DLSS implementation, but it's also probably a best-case scenario for a machine learning upscaling algorithm: it's the same frame displayed every time.
Yeah, but it might be more of the latter than the former. I agree that BFV seems underwhelming, but it's not clear DLSS can be much better. Maybe it's just a fucked design and it would be better to make a completely different upscale/AA filter.

In-between generation; 7nm from Nvidia and 7nm plus a new architecture from AMD should be a better upgrade point.

Because DLSS is shit even when downscaled. That makes the situation even worse because downscaling should be generous to its flaws.