ITT: What's your take on the freshly revived "AMD vs Nvidia" image output quality?

www.youtube.com/watch?v=3fmvxgCVzhY This video blew up and people are once again dragging this shit into discussion, even though you obviously need to zoom in quite heavily to notice any discrepancy between the two (especially if your eyesight is shit).

Quick fix for any quote unquote "issue" on Nvidia's side is to go into the Nvidia Control Panel and select "Use my preference emphasizing: Quality" in the image settings menu. That way the image will be literally identical on both sides at the cost of roughly 2 to 5 percent performance.

Me personally: I can't be bothered for the life of me. I've had both AMD and Nvidia, and no matter the monitor, I just can't get butthurt over it. I have to squint and get up close to notice anything at all; the moment the AMD card is out of the comparison, the Nvidia image looks pristine on its own and not bad or shitty in the slightest. After all, you literally need to zoom in like crazy to get anything to look different.

Attached: muh 0.057x better image quality qq.png (1896x957, 1.47M)
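
If you actually want to quantify "literally identical" instead of squinting, here's a minimal sketch that computes PSNR between two same-resolution captures (nvidia.png and amd.png are placeholder filenames, not from the video). As a rough rule of thumb, anything much past ~50 dB is invisible in motion:

```python
# Minimal sketch: measure the difference between two captures instead of
# pixel-peeping. Requires: pip install pillow numpy
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # bit-for-bit identical frames
    return 10 * np.log10(255.0 ** 2 / mse)  # standard PSNR in dB

print(f"PSNR: {psnr('nvidia.png', 'amd.png'):.2f} dB")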

bump

Attached: ref.png (1538x813, 1.09M)

Attached: the same picture.png (500x561, 398K)

autist

>Quick fix for any quote unquote "issue" on Nvidia's side is to go into the Nvidia Control Panel and select "Use my preference emphasizing: Quality" in the image settings menu. That way the image will be literally identical on both sides at the cost of roughly 2 to 5 percent performance.

Something tells me this is Nvidia's way of "cheating" in benchmarks.

>ATI gets caught literally cheating out the ass in Quake 3
>fanboys never live it down and literally make shit up the rest of their lives to get nvidia back

based schizoposter

It's not "cheating" so much as it is saving frames per second by throwing away quality that, unless we're cats with overclocked brains, we won't be able to see anyway.

This "image difference" is only perceptible after pausing, taking a deep breath, and zooming in. It's not noticeable in live action. Not even slightly.

People are just being utterly stupid trying to find reasons to ditch one and go with the other.

I literally see no difference on a 1080p panel when comparing a 2070 to a Vega 64, not even when getting up close. And at 4K? No chance. Physically impossible to notice a difference, calling it right away.

ATI got caught cheating in Q3? Wow, source?

And yeah, it's quite the fad to make Nvidia sound/look worse when all they do is skip rendering where it's not needed (or people now perceive images as softer when they aren't, and blame it on overly aggressive delta color compression).

>schizo
You have to go back, jew.
Get that propaganda shit off Jow Forums you fucking sub-human.
You and your fucking Discord trannies.
>hurr durr let's stir up Jow Forums

youtube.com/watch?v=-0LSZJyA0F8

>even though you obviously need to zoom in quite heavily to notice any discrepancy between the two (especially if your eyesight is shit).
>protip: 99% of Jow Forums wears glasses
>protip: 99% of Jow Forums are autists
Shit nobody cares about, but everyone is gonna start shitting on the "worse" brand, whichever it is.

www.youtube.com/watch?v=R1IGWsllYEo

>BUH BUH MUH AMD SO MUH BETTUR LUUUL ROFLMAO ROFLCOPTER ANIME

>it's literally a combination of misrendering and glitches as well as optimizations
>everybody is deluded

fucking go to sleep you dumb virgins. Both GPU designers produce the same shit, and it all comes down to what each decides to do with their render engine.

>amd does better in one game, nvidia does better in two games
>they are a draw in every single other aspect

fucking hell

A lot of people don't understand the number of driver-side hacks required to get a lot of engines and specific titles working properly, which is where I'd wager most of the fairly small differences come from.

>adoredtv
I'll pass

>omega biased virgin nerd with at least one contact who works at AMD
>constantly bashes Nvidia/Intel
>"he's not biased guys, trust me. Guys?"

Yeah, no.

This is certainly true. Nvidia has a much more active driver team, and it's probably because of that they've been caught far more often using these hacks. A recent one I can remember is the texture loads in Doom (2016) taking a few ms longer than AMD's, causing some objects to be blurry for a few frames. Similar techniques are baked into Unreal Engine, which was used heavily to improve performance on consoles last generation.
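
Not the actual driver or engine code obviously, but a minimal sketch of the general technique (rendering with a low-res placeholder mip while the full texture streams in, which is exactly why things look blurry for a few frames) — all names and timings here are made up:

```python
# Toy sketch of mip-based texture streaming: sample a cheap placeholder mip
# until the full-res texture finishes uploading on a background thread.
import threading
import time

class StreamedTexture:
    def __init__(self, name, load_seconds):
        self.name = name
        self.resident = "64x64 placeholder mip"  # cheap mip, always in memory
        # Full-res upload happens off the render thread.
        threading.Thread(target=self._load_full, args=(load_seconds,),
                         daemon=True).start()

    def _load_full(self, load_seconds):
        time.sleep(load_seconds)           # simulate a slow PCIe upload
        self.resident = "4096x4096 full mip chain"

    def sample(self):
        return self.resident               # whatever is resident right now

tex = StreamedTexture("wall_albedo", load_seconds=0.05)
for frame in range(5):                     # object looks blurry for ~3 frames
    print(f"frame {frame}: sampling {tex.sample()}")
    time.sleep(0.016)                      # ~60 fps frame time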

The Nvidia one has detail in the roof, but it has the effect of making it look cheaper, whereas the AMD one displays less detail but actually looks slightly more realistic because it doesn't have black dots drawn all over it.

I don't know about image output quality, but AMD's encoder sucks ass. It doesn't even come CLOSE to NVENC.
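
For anyone who wants to test that themselves: ffmpeg exposes both hardware encoders as h264_nvenc and h264_amf, so you can push the same clip through each at the same bitrate and compare the outputs. A minimal sketch (input.mp4 and the 6M bitrate are arbitrary placeholders; assumes an ffmpeg build with both encoders on a machine with both cards):

```python
# Sketch: encode the same clip through NVENC and AMD's AMF at a fixed bitrate,
# then eyeball (or PSNR-compare) the results.
import subprocess

for codec in ("h264_nvenc", "h264_amf"):
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", codec, "-b:v", "6M",   # same target bitrate for fairness
         "-an",                         # drop audio, video-only comparison
         f"out_{codec}.mp4"],
        check=True,
    )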

>different angle of shot causing anisotropic filtering and mipmaps to look different
Literally bait, I bet it's a pajeet doing it for views. One more Indian channel for the blacklist.

Attached: Untitled.gif (968x928, 310K)
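
On the anisotropic filtering/mipmap point above: which mip level the GPU samples depends on how fast the UVs change per screen pixel, so a different camera angle genuinely changes the output. A simplified numpy sketch of the standard LOD computation (roughly what the GL/D3D specs describe; the derivative values are invented for illustration):

```python
# Isotropic mip LOD selection: lod = log2(max(|dUV/dx|, |dUV/dy|)).
# A grazing viewing angle stretches the UV derivative along one axis,
# pushing the sample toward a blurrier mip (which anisotropic filtering
# then compensates for).
import numpy as np

def mip_lod(duv_dx, duv_dy):
    rho = max(np.linalg.norm(duv_dx), np.linalg.norm(duv_dy))
    return max(0.0, np.log2(rho))

# Texture seen head-on: ~1 texel per pixel on both axes -> mip 0.
print(mip_lod(np.array([1.0, 0.0]), np.array([0.0, 1.0])))   # 0.0
# Same texture at a grazing angle: 8 texels per pixel vertically -> mip 3.
print(mip_lod(np.array([1.0, 0.0]), np.array([0.0, 8.0])))   # 3.0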

>filename
0.057 times better actually means about 17 times worse (1 / 0.057 ≈ 17.5).

wew fucking hell.

back to plebbit nigger

Image output quality is about how the shaders were programmed; the hardware doesn't matter.

Yes, and both nshitia and ayymd also fucking write shader replacements out the ass for everything, and it wouldn't shock me if the novidya shaders are garbage-fire hacks to win meme benchmarks.

The only correct answer in this thread. Jow Forums is mostly filled with brainlets, it seems.

Attached: D7BA3A2C-B1CF-440A-BF6C-753EC9657F9A.png (420x540, 47K)