What's the difference between virtual super resolution and SSAA?

Attached: images.jpg (512x287, 26K)

nothing if my understanding is correct

however, not every game will supersample
vsr works in the driver to make the game think it's a higher resolution screen, so it lets you get supersampling aa even if the game doesn't have ssaa as an option.

That's pretty cool. Does it perform better than native SSAA?

no, you are running the game at a higher resolution. let's say you had a 1280x720 screen, you would be able to vsr to
1920x1080
2560x1440
3840x2160
and possibly some in-between resolutions. the game would run at those higher resolutions internally, but the driver would scale that internal image back down to your screen's resolution, much like supersampling would do outside the driver... that said, there may be cases where a game's ssaa implementation is so fucked that you get a performance boost just running it natively. god knows most of rainbow 6 siege's aa modes were fucked.
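for reference, here's a rough back-of-the-envelope sketch (plain python, nothing driver-specific, just the resolutions above) of how much extra work each of those vsr modes means relative to a 1280x720 panel, since render cost scales roughly with pixel count:

# rough pixel-count comparison: render cost scales roughly with pixel count
native = (1280, 720)
vsr_modes = [(1920, 1080), (2560, 1440), (3840, 2160)]

native_pixels = native[0] * native[1]
for w, h in vsr_modes:
    factor = (w * h) / native_pixels
    print(f"{w}x{h}: {factor:.2f}x the pixels of {native[0]}x{native[1]}")

# prints 2.25x, 4.00x and 9.00x, which is why ~4x is usually the practical ceiling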

So depending on the application, you may want to use their SSAA (if you were going to use SSAA anyway) or VSR?

almost always you would go with the application's ssaa. there are some games that fuck it up profoundly enough that it may be worth going outside the application, but let's put it this way: if you are on 1080p, you are either going to go with something about 2 times as demanding or 4 times as demanding. civ6 apparently will do 8k and 16k supersampling, but generally 4x is the most you'll see benefits from.

Nice to know. Thanks user

Can virtual supersampling fuck up my monitor? I'm trying to run skyrim at 1440p but my monitor is 1080p

oh, just as a note, not everything labeled ssaa is actual ssaa.

msaa is also a form of supersampling, however instead of supersampling the entire frame, it only supersamples the edges. I have also seen a game where the 'ssaa' was really just fxaa that they called ssaa so they could put it on the box, since for people who know what it means it's a potential selling point.

really, if the game has real ssaa, it's almost never something they fuck up profoundly, however fucked-up implementations of msaa are far more common, so yea... if you run into a game that performs disproportionately badly with aa, go driver-based over application.

no, and it should be able to handle it
look up a video about it so you can see what you are doing

framerate

HDMI 1.4 cables can overheat from doing that, otherwise you're fine.

no it cannot fuck up your monitor
>this dumb fuck question
>I'm trying to run skyrim
don't you have finals to study for?

Alright then, thanks. It's just that when I run skyrim at 1440p I can hear a horrible coil whine

google coil whine
generally it's not bad, the problem is it's coming from a part we try to keep as quiet as possible, so the noise can be a problem.

also watch your gpu, you are putting a far larger load on it than normal. there is a chance that if it's dusty or a few years old, you may push something to a breaking point.

VSR uses your GPU to do the scaling, so no. Your monitor just sees a normal 1080p signal. A custom resolution will make your monitor do the scaling if it accepts the input, which can give you higher quality, but even then the worst that can happen is it just won't accept it.

How can you tell if it's not real ssaa? I've pretty much just been avoiding aa altogether but ssaa kinda interests me

Virtual Super Resolution can fuck up game font sizes when the UI and the actual 3D scene use different render targets. It also limits the refresh rate on 144 Hz panels. SSAA has been around since Windows 98 times and is therefore a well-established and supported feature. For these reasons I prefer SSAA.

Use an FPS limiter to limit coil whine. Skyrim's engine breaks anyway past 60 fps.

>HDMI cables can overheat from that

Attached: 1457921761338.jpg (528x792, 66K)

1/2
Here's a thorough breakdown of what you're asking.

All non-shader forms of AA (and some shader types, but not all: FXAA is basically a fast blur filter, and temporal AA is basically blurring the last x frames of motion together) essentially draw and then "sample" each pixel multiple times, the multiple being whatever you've set in your options or GPU control panel.
For example, if I were to force a Radeon GPU to SSAA 4x with an 8x tent sample, the GPU would draw the frame at an internal resolution 4 times higher than the set resolution, and then sample every single pixel in 8 different places. Then the GPU would compare the samples, calculate an average pixel color, combine that data to create a visually smoother image, downscale the entire image to your set game resolution, and then send it out to the monitor.

Sampling takes patterned collections of sub-points from a multiple of the native resolution, compares each sample against the others, and then recombines the image, averaging out the color differences between pixels to blur the staircase effect. Then the GPU downscales all of this data back to the output resolution of your monitor.
In effect SSAA renders every single pixel of the game multiple times, sampling different "mini-pixels" within every given pixel, and recombines the result to average out the aliasing.
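To make the "render big, then average down" step concrete, here's a minimal numpy sketch of a 4x SSAA resolve (2x per axis) using a plain box filter. A real driver uses tent or rotated-grid sample patterns rather than a straight 2x2 average, so treat this as an illustration of the idea, not any vendor's actual implementation.

import numpy as np

def ssaa_resolve(frame, factor=2):
    # average each factor x factor block of the oversized render
    # down to one output pixel (simple box filter)
    h, w, c = frame.shape
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# pretend the game rendered internally at 2560x1440 for a 1280x720 output
internal = np.random.rand(1440, 2560, 3)   # stand-in for the oversized frame
output = ssaa_resolve(internal, factor=2)
print(output.shape)                        # (720, 1280, 3)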

2/2
VSR, or the equivalent DSR for Nvidia, allows the GPU to internally render at a higher resolution and then downscale that image to your monitor's resolution. It's very similar to SSAA, but you can choose a scaling factor that is less than a multiple of 2. What this does, though, is introduce scaling-induced blur into the frame. VSR/DSR work best when using multiples of the monitor resolution, the upside being that your GPU isn't performing the extra work SSAA requires (upscaling, multiple sampling, comparing, recombining, then downscaling). Instead, you're already drawing at that higher resolution to begin with, and so the hardware only needs to downscale the render to your monitor's resolution.

And to top it all off, none of this is true anti-aliasing. SSAA is a point-sample approximation technique which is less computationally expensive than the multi-filtered forward-and-inverse Fourier transform operation required to achieve true anti-aliasing.

Think of SSAA as creating a complex 3D grid for every single pixel of the image, comparing all of those points within every single pixel, calculating the best result to blur together pixels that create a jagged line (also using pixels that aren't part of the stepped/jagged line), and then recombining it all back into a smoother image, before scaling it down and sending it out to the screen.

VSR/DSR can provide the same 2x or 4x scaling/anti-aliasing effect using less GPU time because it is natively rendering at that higher quality and then simply scaling it back down.
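A rough way to see why non-integer factors blur (hypothetical numbers, not how any particular driver actually resamples) is to map each output pixel back onto the source grid and look at what it covers:

def source_footprint(out_w, src_w):
    # for each output pixel, show which range of source pixels it covers
    # when a row of src_w pixels is downscaled to out_w pixels
    scale = src_w / out_w
    for x in range(out_w):
        print(f"output px {x}: source pixels {x * scale:.2f} .. {(x + 1) * scale:.2f}")

# 2.0x per axis (e.g. 4K down to 1080p): every output pixel covers exactly
# two whole source pixels, so the average is clean
source_footprint(out_w=4, src_w=8)

# 1.5x per axis (e.g. 1620p down to 1080p): the footprints cut source pixels
# in half, so the resampler has to blend across pixel boundaries, i.e. blur
source_footprint(out_w=4, src_w=6)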

>HDMI 1.4 cables can overheat from doing that.

Please ram a knife into the wall power outlet and die.

This is probably the best explanation I will find on Jow Forums in a while. So 1080p to 2160p VSR will perform better than SSAAx4?

Yep!

VSR/DSR essentially lets you use 4K when you have a 1080p monitor. You'll effectively have the GPU performance you'd have if you were using a 4K monitor, but it is better than using SSAA to achieve a similar quality.

Oh cool, thanks Jow Forums guys
A while back I was testing VSR with furmark on my 1080p monitor, and 1440p performed worse than 2160p, so I found that pretty interesting

Don't use Furmark for performance testing. Its main purpose is to generate heat.

Oh lol what should I use?

Unigine Superposition, 3DMark, or games that have integrated benchmarks. Metro 2033 is especially good for benchmarking, as the results are very reproducible. Have a look at the Gamers Nexus testing methodology for their graphics card reviews to get some inspiration.

Unless devs are being generous/competent and have it in-engine, SSAA/SGSSAA is generally DX9-only

Attached: portal2 2018-05-09 05-22-31-94.jpg (1920x1200, 748K)

Just coming from someone who's been there and back, obsessing over AA...

Anti-aliasing, and all the technology that surrounds it, is only as useful as jaggies are bothersome to you. Yeah sure, you can tell the difference comparing screenshots, but how many times are you simply staring at a power line and saying, 'you know what would make that better: some AA and potentially a 20% decrease in performance!'

On that note, one popular form of AA is FXAA, which is usually critiqued for its tendency to blur the image. Again, however, this doesn't bother many people, and it requires little to no performance overhead.

If you have a decent enough card, it's usually easy enough to set the game to its highest preset setting and not give a fuck. It's when you have a low-to-mid-range card that you start obsessing over different AA and the performance hit. Or if you're already playing at 2k or 4k resolutions, you might not even need it.

Attached: 1270342945299.jpg (574x591, 17K)

that image is pure sex. dx11 was a mistake

Attached: witcher2_2017_05_04_13_25_43_074.jpg (3840x2160, 3.67M)

No
Consoles are the mistake
The majority of the technologies introduced this past decade exist because of the growing disparity in performance between consoles and PC gaming.
You can usually tell the difference between a console game ported to the PC and a PC game ported to the console simply from the number of available graphical settings. That, and the performance overhead due to shitty optimization
>Looking at you Ubisoft

What bothers me is when you get a situation like Mankind Divided, where the lighting makes jaggies super visible and the only built-in AA option is temporal AA, which results in massively noticeable and annoying artifacting on anything meant to be a computer display in the game. So there's no winning.

SSAA makes everything blurry for some reason

never mind the shitty texture resolutions console games have

OP here, does screen recording (software such as relive) with VSR record in 4k if set to 4k? And does SSAA record at the original resolution?

vsr makes ui elements blurry as fuck, ssaa does not. pretty easy to tell which is better.

The issue I have with it is that the resizing of the render down to the monitor resolution seems to be a bit shitty, at least with nvidia's DSR. You can see it with fonts and thin lines: at 4K on 1080p shit is a bit too blurry, and other resolutions (like 1440p or 1620p to 1080p) look like they were resized in paint.