What is the most sophisticated antialiasing algorithm?
higher resolutions
I don't know if you're trolling but that's unironically true. Supersampling is just "lol, higher res", whatever bs novidia tries to rename it to.
Yeah but that breaks a lot of games' GUIs. And it doesn't work on CAD programs for instance.
Wat. Supersampling is very mature and stable.
4x the resolution of your monitor. Or 16x, if you have the hardware capable of rendering such resolutions.
In my experience, TXAA, because it takes inter-frame changes into account.
Oh I meant downsampling
Supersampling sucks rat's ass compared to SMAA, both in quality and performance.
if (picture.isPixellated()) {
    dont();
}
>SSAA is worse than SMAA
What is the scientific basis of this claim?
digitalfoundry has a video on this
go check it
maki is dumb
Neural networks doing superresolution
Super sampling is the opposite of sophisticated. Computational power != sophistication. Why is Jow Forums so dumb?
SMAA is blurry as fuck, like most post-processed anti-aliasing.
Don't know and don't care, I always run at a higher resolution than most so I turn AA off.
It is if you tweak it wrong or depend on developers like the /v/babbies.
You could run a higher resolution AND use antialiasing.
Zoomers bullshitting ITT
okay monsterpill me
from what game ? Graphics looks comfy
>You could drain performance for a hardly noticeable sharper edge, if at all.
It's literally a detriment at 4K
your brains
Risen, friend.
why don't devs just draw normal curves rather than jaggy ones??
Because that's how GIMP works
can someone explain to me how supersampling works?
how can rendering an image at 1440p and downscaling it to 720p make the 720p image sharper if it's only on a 720p screen?
it confuses me.
I noticed in BF5 it helped, but BF5 has forced TAA that you can't turn off. With no TAA, would it make zero difference above 100% render scale?
It doesn't make it sharper. It's just blurred at the points where it had "steps" before.
It doesn't make it sharper. It's just blurred where it had a "staircase" before.
Supersampling.
Rendering at higher resolutions and then downsampling it.
Sounds like brute force, nothing tricky.
wtf are you talking about, TAA or supersampling?
I'm so confused. I play at a really low res because "esports", and games like BF5 having forced TAA is killing me... but supersampling helps.
If I can hack the game to turn TAA off, will supersampling still help or not? With its forced TAA, supersampling legit made things in the distance look like they had 2x/3x the pixels, not just on edges.
SSAA or DRS/VRS
>If I can hack the game
Yes, supersampling is the one true way of increasing fidelity. Antialiasing is just a workaround, a hack. TAA is one of the worse algorithms, at that.
I think tangram checkerboard upscaling is actually the most "advanced" AA. I think it's a PS4-exclusive technique, possibly even Decima-engine-exclusive.
It's a really complicated technique and renders the game in triangles, not 100x100 blocks or whatever.
It's fucking weird, I don't understand how it works, but upscaled it legit looks like native 4K with a bit of shitty AA on.
ANY antialiasing method does that. It takes the "stairs" artifact, created by drawing lines at an angle at a low resolution, and blurs it at points to create the illusion that it's now smooth. Depending on the method, the illusion is more or less effective.
If you want perfection, wait for 16K on 24'' screens. They won't need antialiasing at all.
oh, so if supersampling at 200% render scale was helping even with TAA on, with TAA off it would help even more...
I still don't understand how it works. It literally made things in the distance look like they were made up of more pixels. How the fuck does it do that if my screen is only 1024x768?
That's the way to do it. Everything else is "tricky", as it tries to hide the flaws instead of just rendering without them.
Adaptive supersampling is one way to do it with slightly better performance, as it only renders the edges at higher resolutions.
But Supersampling (rendering at higher resolutions and sampling it down) *IS* the definitive way to do it. We rarely used it before because we lacked the horsepower to do it at usable framerates in real time.
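Very rough idea of the "adaptive" part, as a toy sketch. The sample() scene function here is just a made-up stand-in for "render the scene at this point" so the example actually runs; it's not how any real renderer is structured.

# Toy adaptive supersampling sketch. sample(x, y) stands in for rendering the
# scene at one point; here it's an analytic circle so this is self-contained.
def sample(x, y):
    return 1.0 if (x - 16) ** 2 + (y - 16) ** 2 < 100 else 0.0

W = H = 32
# first pass: one sample per pixel, taken at the pixel center
base = [[sample(x + 0.5, y + 0.5) for x in range(W)] for y in range(H)]

def needs_more_samples(img, x, y, threshold=0.1):
    # crude edge test: does this pixel differ from any 4-neighbour?
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < W and 0 <= ny < H and abs(img[y][x] - img[ny][nx]) > threshold:
            return True
    return False

final = [row[:] for row in base]
extra = 0
for y in range(H):
    for x in range(W):
        if needs_more_samples(base, x, y):
            # only edge pixels get a 2x2 grid of sub-samples
            subs = [sample(x + ox, y + oy) for ox in (0.25, 0.75) for oy in (0.25, 0.75)]
            final[y][x] = sum(subs) / len(subs)
            extra += 1

print(f"{extra} of {W*H} pixels got extra samples")  # most pixels stay at 1 sample

The point is just that the extra cost is spent only where the first pass detects contrast, instead of over the whole frame like full supersampling.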
>how can rendering an image at 1440p and downscaling it to 720p make the 720p image sharper if it's only on a 720p screen?
At 720p in that situation you're sampling one point per pixel. At 1440p on a 720p monitor you're doing 4 samples per pixel blended together. Why wouldn't it be a more accurate representation?
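Rough sketch of that downsample step, assuming a plain 2x2 box filter (actual games and drivers may use fancier filters):

# Minimal sketch of a 2x supersampling resolve with a plain 2x2 box filter.
# hi: the high-res render (2x width, 2x height) as rows of grayscale values 0..1.
def downsample_2x(hi):
    out = []
    for y in range(0, len(hi), 2):
        row = []
        for x in range(0, len(hi[0]), 2):
            # each output pixel is the average of the 4 high-res samples covering it
            avg = (hi[y][x] + hi[y][x + 1] + hi[y + 1][x] + hi[y + 1][x + 1]) / 4.0
            row.append(avg)
        out.append(row)
    return out

# a hard diagonal edge rendered at double resolution...
edge = [[1.0 if x > y else 0.0 for x in range(8)] for y in range(8)]
for row in downsample_2x(edge):
    print(row)
# ...comes out with in-between values (0.25 here) along the edge instead of a
# hard 0/1 staircase, which is exactly the smoothing people see.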
anything that's not FXAA
Stop being inane. Supersampling is just another method of antialiasing; it just happens to have the best results.
If you want perfection, wait for 16K 24'' Monitors. They won't need antialiasing at all.
What about morphological filtering? That works great.
I'm going to test this later, but if I have a 1080p 144Hz display and an old game that I can easily push 144fps at 4K with, would I technically be able to downsample from 4K using my GPU drivers while still getting all 144 frames on screen, or would I be limited to seeing only 60fps due to the display cable bandwidth?
Obviously, you actually have more information to work with, instead of blurring the information you already have.
Supersampling is the way to do it. Obviously the monitor is at fault here: at 16K you're still basically supersampling, but you lose no information displaying it via downsampling.
Depends on the game. I can easily play many modern games at 4k downsampling to 1080p with minimal performance loss, because usually the game is shit and CPU bottlenecked or badly coded. Modern high end GPUs have tons of power in them.
I meant literal 16K 24'' monitors. They don't exist yet; not even 8K if I'm not mistaken.
All this downsampling happens inside the GPU, which finally pushes out a 1080p image.
how does it pick which of the 4 pixels to show for each one pixel of my screen though?
that makes no sense!...
it literally increased resolution on flat surfaces, not just edges. Like, a window frame in the distance, instead of being a blurry mess, was clearly made up of more data. I don't know if it was just powering through the shit TAA and making it look better, because I couldn't turn the TAA off.
with no TAA, would supersampling only make edges smoother, or would it actually show more data at longer distances?
HOW the fuck can it do that?!
same blurry shit, different name. I'd rather have jaggies.
Don't overthink it: it's just "zoomed out" from "very big" to "smaller", and now the "staircase" of some artifacts becomes a "blurred staircase".
It all becomes clearer once you realize all antialiasing methods (including supersampling) are an illusion; they just blur things.
Just pick the most representative samples and average them out; that's literally the simplest AA right there. Blurry, but it works.
yeah, I think if BFV didn't force TAA I would have understood it better. What's a good game to test it in? Arma?
just wondering if supersampling at 200% render scale could actually give you a competitive advantage for pixel-spotting someone, or is it just aesthetic?
Because at 720p you're rendering one pixel per monitor pixel, while supersampling you're rendering four, and when it gets downsampled back to 720p it takes the average of those pixels and places it on your low-resolution monitor.
If you render less jagginess to begin with, you end up with less jagginess on a low resolution monitor also.
>how does it decide which of the 4 pixels is more accurate to show on my 720p screen though?
They're blended together; it doesn't decide which of the 4 pixels is most accurate. Depending on the type of supersampling, there are different sampling patterns.
en.wikipedia.org
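For what "different sampling patterns" means, here's a rough illustration of two common 4x layouts. The offsets are sub-pixel positions inside one 0..1 pixel; the exact numbers are just the textbook ordered/rotated grids, not any specific vendor's pattern.

# Sub-pixel sample positions inside one output pixel, illustrative only.
ORDERED_GRID_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
# a rotated ("sparse") grid spreads the samples over more distinct rows and columns,
# which handles near-horizontal and near-vertical edges better
ROTATED_GRID_4X = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def resolve_pixel(px, py, pattern, sample):
    # average the scene function at every sub-sample position of pixel (px, py)
    return sum(sample(px + ox, py + oy) for ox, oy in pattern) / len(pattern)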
>that's the part I don't understand. it literally looked like it was showing MORE data not only on edges but flat surfaces etc.
That's because it was showing more data; it just showed that extra data using the same number of pixels.
The pixels you see after rasterization are just an approximation of what's actually being rendered. By increasing the sample count you increase the accuracy of your approximation, but you're still going to be limited by whatever your output resolution is.
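Toy example of why that also helps flat surfaces and distant detail, not just edge staircases. The feature() function is a made-up stand-in for something thinner than a pixel, like a far-away window frame:

# Toy sketch: how many samples "see" a thin feature that is narrower than a pixel.
def feature(x):
    # a vertical bar 0.3 pixels wide somewhere inside pixel 0..1
    return 1.0 if 0.55 <= x < 0.85 else 0.0

# 1 sample per pixel (at the center): the bar is either missed entirely or fully "on"
print(feature(0.5))                   # -> 0.0, the bar just vanishes at this resolution

# 16 samples per pixel, averaged: the bar shows up as partial brightness
samples = [feature((i + 0.5) / 16) for i in range(16)]
print(sum(samples) / len(samples))    # -> 0.3125, close to its true 0.3 coverage

With one sample per pixel, sub-pixel detail flickers between fully there and gone; with many averaged samples it settles into a stable, proportional value, which is why distant detail looks like it gained data.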
Supersampling/downsampling will ALWAYS give the best result over any other AA hack. Plus it enhances picture fidelity everywhere, not just at the edges.
I don't know about DSR, but VSR can downsample into 1080p while only rendering the game engine itself at higher resolutions; overlays like GUIs stay at the same scale as they would at native 1080p.
This can depend on the game too though.
>pixel-spotting someone
I mean, if we're talking 4K*200% here, the guy would have to be farther away than you can aim.
>just wondering if supersampling at 200% render scale could actually give you a competitive advantage
Maybe in like CS or something if a character's head is just barely peeking over an edge. The lost frames typically wouldn't be worth it though.
Easily, neural net tensor cores on the RTX 20 series cards.
Sounds cool, where can I buy some RTX 20 series cards?
> What is the most sophisticated antialiasing algorithm?
4K @ no AA. Imagine getting new hardware to get rid of sharp edges.
Stop being inane. Until there is 16K on 24'', antialiasing can be used.
Yes, it's advanced, but it's still a trick.
Supersampling is no trick, it just _is_.
Okay that sounds good, thanks.
Depending on the size and viewing distance, 4K screens can get pretty close already.
The problem with AA in general is that our monitors still suck. Supersampling only fixes the game-engine side of things; we also still need better monitors, and then we wouldn't need any AA.
2080Ti is way overpriced.
But, how can you argue that DLSS is not the most sophisticated AA?
The number of samples you can gather grows quadratically with resolution, so are you sure 8x SSAA on 4K is that much worse than native 16K?
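Rough numbers for that comparison, just counting raw samples per frame (ignoring filter quality and memory):

# raw per-frame sample counts, nothing else
uhd_4k  = 3840 * 2160          # 8,294,400 pixels
uhd_16k = 15360 * 8640         # 132,710,400 pixels

ssaa_8x_at_4k = uhd_4k * 8     # 66,355,200 samples, resolved down to 8.3M output pixels
print(ssaa_8x_at_4k / uhd_16k) # 0.5 -> native 16K still takes twice the samples,
                               # but it has to display them all instead of averaging them down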
Which is why I said 24''. For regular PC usage, 16K will be the point where antialiasing becomes useless (and therefore any higher resolution too).
Unless you stick your nose on the screen, which is irrelevant.
????
Why the fuck are half the posts here deleted?
Aspergers
You cannot add detail to anything. You can only change the pixels on jaggies to blur them in various ways in order to make them look more natural. The only true way to antialias is by adding resolution. Either way puts additional stress on the hardware.
who are you replying to or how does this answer OP's question?
Thanks
Of course, because a 4K screen physically displays several times fewer pixels (and thus less data) than 16K.
>PS4 exclusive
What about the Xbox 360? I heard it had some coprocessor for free MSAA or something.
Downsampling
DLSS
SSAA
Just another selective "upres" AA method that saves performance compared to a full supersample/downsample.
No AA, stable 144hz
they do, dumb dumb.
janky hardware just can't smooth it out.
honest question: does the proliferation of higher-resolution gaming (4K, 5K) kind of reduce the significance of anti-aliasing trickery?
why don't they just make pixels round?
DLSS
I tend to enable supersampling in games because I have a 1080 Ti and play at 1080p. Should I just disable AA then?
No, it just increases the effectiveness of AA (while hugely lowering its efficiency).
No, why?
>No, why?
because it's redundant maybe idk
Deep learning supersampling.
It's really not. Why would you think so?
Why would you supersample the GUIs???
because I know nothing about computer graphix
My GPU drivers choose what to supersample. Evidently, almost no games have a separate 2D screen buffer for GUIs.
I don't even use AA if I'm running 4K.
FullHD with quality AA looks better than 4K with no AA. Even at 4K there are jagged lines.
This. 4K is very difficult to supersample. Just slap on some SMAA and Bob's your uncle.
>SMAA is better than supersampling
the absolute fucking state of Jow Forums