RTX demos confirmed bullshit

youtu.be/eZgX4EfIiXc
Apart from reflections, RTX is basically a useless, shitty solution with a tonne of noise (grainy lighting, shadows, etc. due to the low raytrace sample count).
Based Tech Jesus shitting on nvidiots
Your meme deep learning AA is also a joke!

Attached: 1487720349817.png (900x600, 492K)

Why do all these Wojaks wear Mario hats?

Lazy fags, I guess.
We need an RTX ON/OFF leather jacket meme but everyone's too busy fapping to and drawing Princess Bowser

*Luigi hats
How young are you, faggot?

I have to say, I didn't expect it from curly hair. Someone finally said how artificial the Metro Exodus comparison is. The Shadow of the Tomb Raider one is also bullshit, just less visibly so.

That part about "denoising". Is that supposed to happen? Because the Metro Exodus and Control gameplay footage had very noisy shadows.

Surely that's not intended? It's like watching a movie or a picture where the camera ISO is set to 128000 instead of the normal 100, wtf

where's the fucking realism in noisy shadows?????

Just buy a second card and use it as a dedicated Phy- uh, raytracing card.

So that's why all the demos have been in slow-motion.

The Metro Exodus and Control demos had by far the fastest camera movement.

news.developer.nvidia.com/new-ai-imaging-technique-reconstructs-photos-with-realistic-results/
you fucking retard, Nvidia already has deep learning technology to fill in missing pixels, so the small number of rays in a scene isn't even a problem
they can run the deep learning software on the tensor cores and trace the light with the ray tracing cores, in parallel with the polygon drawing and pixel shading.

That's exactly the point, user: it isn't realistic at all. The only way to get rid of it is to run more samples, but it's a diminishing-returns kind of deal, hence why I think raytracing in games is a farce until we have like 100 TFLOP GPUs with sufficient headroom.
Nvidia massively jumped the gun on this.
Lol, nobody is falling for that when the cards are $1200 USD each.
Yep, it's an absolute joke. I can't wait for it all to come crashing down.
Someone hasn't seen the tech in action youtu.be/MMbgvXde-YA
Fucking nvidiot

younger than you boomer

>t. leather jacket man

youtu.be/HSmm_vEVs10
Here's how it looks filtered vs unfiltered.
No fucking way (AI) deep learning can get rid of all of it; it's literally impossible, it's just another bullshit Nvidia fuckup.
It's too early and the sample count is too low; the tech simply isn't ready for mainstream gaming.
PS5 and the rest of gen 9 likely won't have this either; it's just too hard for GPUs to process and it puts extra strain on the rest of the system, the CPU and memory in particular, hence why nobody is bothering to run any game demos or benchmarks with mid to low end CPUs. It's an absolute fucking lie.

youtu.be/P0fMwA3X5KI
He also explains it more here

even if rtx actually worked, it still looks like shit. go outside once in a while and you'll realize the world isn't that reflective, rtx makes everything look like fake high-gloss plastic. nobody ever gave a shit about realtime reflections in games until some corporation told them to, now they're spazzing out like it's going to make their mindless friendship simulator any more interesting or less depressing. try-hard zoomer brand-whoring has really reached pathetic new heights in the last 6 months. you faggots need to get an actual hobby instead of sitting around with your mouths hanging open wiggling a mouse around and shrieking at your screens.

Real life is fucked, user, we ain't all rich big dick chads

Attached: 1517808718577.jpg (700x630, 115K)

don't worry ahmed, amd will put out a card that can compete with the 1080ti in 5 years. maybe 3.

JUST BUY IT ALREADY, YOU FILTHY GOYIM!

They already have Vega 20; it's twice as fast as that, but it's enterprise-only and prohibitively expensive.

Hand rubbing intensifies

Attached: 1525078347619.jpg (1045x1433, 220K)

You would think Nvidia would've done that in their own demo, where each scene lasted for like 5 seconds rofl roflo rolforlof

Heh
Can't wait for them to have some actual competition in the consumer space, but Vega 20 is 400W+, so all these new RT cards with AI in 202x will be housefires, and that's on 7nm.

Attached: 1501249258394.gif (160x160, 2.87M)

>That part about "denoising". Is that supposed to happen? Because the Metro Exodus and Control gameplay footage had very noisy shadows.
It has to happen that way. In real life all light is ray traced, obviously, but there are no pixels and there's no "sampling".

When you want to trace rays on a computer, you have to take samples for each pixel in the image by sending sample rays out from it in random directions. When you've done that, you look at what the sample rays bounced off and how much light they picked up, and construct a sample value from that.

Naturally, this is extremely performance intensive, so Nvidia only takes one sample per pixel. When that one sample ray is sent out in a random direction, it can end up anywhere, which creates noise simply because you don't have enough samples. The common approach is to layer a denoiser on top of that, and Nvidia decided to use ML for the denoising. It still won't fix all the noise, but it looks a lot better.

Pic related: 1 sample per pixel in the top left, and each subsequent image uses 2^n samples per pixel (e.g. the last image is 2^15 = 32768 samples per pixel).

Attached: sampling.png (1024x1024, 1.3M)
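
If you want to see the noise for yourself, here's a throwaway Python/numpy sketch of the same idea (a made-up toy scene, obviously nothing like Nvidia's actual renderer): each pixel is a Monte Carlo average of random sample rays, and at 1 sample per pixel the estimate is basically pure noise.

import numpy as np

# toy "path tracer": every pixel's true brightness is 0.5, and each sample
# ray randomly hits the light (1) or misses it (0); averaging the samples
# gives the pixel value, and more samples per pixel means less noise
def render(height, width, spp, rng):
    hits = rng.random((height, width, spp)) < 0.5   # did each sample ray reach the light?
    return hits.mean(axis=2)                        # Monte Carlo estimate per pixel

rng = np.random.default_rng(0)
for spp in (1, 32, 2048, 32768):
    img = render(64, 64, spp, rng)
    print(spp, "spp -> noise (std around 0.5):", round(float(img.std()), 4))
# at 1 spp every pixel is either 0 or 1 (std ~0.5); the noise falls off as 1/sqrt(spp),
# which is exactly the diminishing-returns problem mentioned above

That 1/sqrt(spp) falloff is also why the pic needs to go all the way up to 2^15 samples before the grain really disappears.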

Yeah, RTX was always going to be shit, that's why Nvidia had to order the devs to delay it until November. Imagine buying a 1200€ card, and 1.5 months later you find out that it can't even do the one thing it was supposed to be able to do. The one thing they hyped up and which would supposedly justify the insane launch price.

too young it seems zoomer

He obviously didn't even watch the SIGGRAPH presentation and is getting roasted in the comments.

HAHAHAHAHAH PEOPLE ACTUALLY FELL FOR THE RTX MEME

Attached: FineZigzagDrongo-small.gif (480x228, 500K)

what devs... nobody is supporting this except some single-developer indie teams that jumped on the opportunity to get a couple of free graphics cards from Nvidia as compensation for using it in their shitty, awful premade-asset games

>what devs... nobody is supporting this
BFV and TR devs. It's a grand total of two, hence the plural

Oh yeah, that's another thing. I sure would like to drop over 1k on a GPU so that I can enjoy quality RT games, such as BFV, Tomb Raider, BFV, Tomb Raider and BFV. And you just know devs will rush to support this tech because there are so many GPUs out there that support it. Since RTX cards are only >1k€, I'm sure everyone is gonna get one

it's just tacked on
another form of their PhysX program that worked so well for them

the grain/noise is just there for the cinematic films. Just like when you get 24fps with RTX ON. It makes it more cinematic since Hollywood movies have grain (high camera ISO) and run at 24 fps

*cinematic feels

It's all still smoke and mirrors. Until I see actual fucking games running well with this shit and some serious commitments from devs, I ain't gonna believe shit. Nvidia rushed it all and fucked up just like AMD did with Vega.

So wait a second, they are judging the RTX tech by looking at videos? VIDEOS?

you are all retarded shills.

Fucking MORON. If the insane noise is visible through horrible YouTube-tier video compression, then that shit will be even MORE VISIBLE in-game.

>raytracing in games is a farce until we have like 100 TFLOP GPUs with sufficient headroom.

AGREEEEEEeeeeeeee

The leather jacket man may be heading in the right direction, but
it's too early for users to jump on board.

judging an entirely new tech by looking at pictures and videos.

HEHHEHAHAHAHHAHAHAHAHA

you fucking shills are something.

>Stop judging our new tech by looking at it or based on what developers say or in terms of performance numbers! In fact, just stop kvetching altogether and buy it!!!
What's funny is that you're the one who's completely tech-illiterate. You're probably one of those people who used to rage at Digital Foundry for measuring framerate from captured footage, without being able to explain why you think that's bad. Because you don't know. You just assume it must be. Because you're completely tech-illiterate.

It's going to be noisy regardless of what Nvidia does, unless they do many thousands of samples per pixel or something (which they cannot do in realtime because the cards are simply not that powerful)

See

see this you brainlets

youtube.com/watch?v=N9F3z8Fl0Nc
youtube.com/watch?v=YjjTPV2pXY0
youtube.com/watch?v=ofcCQdIZAd8

It's funny. I am a dev, and I don't judge things based on videos but on actual tests.

They completely skipped like half the Cornell box examples, and Steve couldn't even grasp why they changed the materials of the balls.
His "expert" seemed completely lost and failed to explain most stuff. Fucking Jensen did a better job at explaining the Cornell box. Also nope, Steve, you can't fake GI with moving lights.

Denoising is only going to get you so far. If your input is absolute fucking garbage then there are limits to how well your denoiser can work.

In Nvidia's case, they have a single sample per pixel. The denoiser isn't going to fix that completely, and will only do so partially.
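
You can see that limit with even the dumbest possible denoiser (toy sketch again, just a box-filter average over neighbours, not Nvidia's ML denoiser): it pulls the noise down, but a denoised 1 spp image is still far worse than a denoised 32 spp one.

import numpy as np

# same toy renderer idea as before: the true image is a smooth gradient and
# each sample ray is a noisy binary estimate of it
def render(h, w, spp, rng):
    truth = np.tile(np.linspace(0.2, 0.8, w), (h, 1))
    hits = rng.random((h, w, spp)) < truth[..., None]
    return truth, hits.mean(axis=2)

def box_denoise(img, radius=2):
    # "denoiser": average each pixel with its neighbours (wraps around at the edges)
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * radius + 1) ** 2

rng = np.random.default_rng(0)
for spp in (1, 32):
    truth, noisy = render(128, 128, spp, rng)
    before = np.abs(noisy - truth).mean()
    after = np.abs(box_denoise(noisy) - truth).mean()
    print(f"{spp:2d} spp: mean error {before:.3f} -> {after:.3f} after denoising")
# denoising helps both, but the denoised 1 spp result never catches up to the 32 spp one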

t. CD Projekt Red bitch boy

What part of "tech in development" don't you understand, you faggot? Did you watch the videos? At some point power and AI will converge and allow for an image indistinguishable from a ground-truth image rendered offline. Right now it's not perfect, but it's usable.

When I was shitting on Nvidia here about ray tracing, saying that they are just sampling one ray per pixel while a basic ray-traced scene requires at least 1000 rays per pixel, I was being called a shill...

That's why their demo was being run 20x faster, as per Nvidia?
Oh yeah, totally forgot about the impact on the TIME spent processing in the pipeline, eh.

>What part of tech in development you dont understand you faggot
I understand the fact that non-existent information is literally non-existent and can't be "recovered" because it doesn't exist, you fucking dumbass. If you have white noise and run a very clever denoiser on it, it's just going to create garbage patterns regardless of how good the algo is.

That's an extreme example, but it perfectly demonstrates the point.

>at some point power and ai will converge and will allow for an image indistinguishable from a ground truth image rendered offline.
At some point graphics cards will have enough computing power to just fucking sample properly and you don't need a god damn neural network to denoise it in the first place. Either you'll have enough SPP that you don't even need to denoise it or you'll have enough SPP that any given traditional denoising algorithm will do without any ML bullshit.

>will allow for an image indistinguishable from a ground truth image rendered offline
This is literally impossible. If the information is completely lost then there is also no way to restore it. No amount of AI research is going to change that, EVER.

Your argument is literally equivalent to claiming that an upscale can look as good as a rendered image at native resolution. It's not happening because information to do it perfectly has been LOST. Your result will NEVER be as good as the properly rendered image.

>Right now its not perfect but its usable.
It's not perfect because the raytracing is fucking garbage and the denoiser doesn't have enough raw data to work with to create an acceptable result. It's only ever going to be good when the sample count per pixel is increased.
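
The upscaling comparison is easy to show with a few throwaway lines of numpy (a toy pattern, not DLSS or anything Nvidia actually ships): render fine detail, throw away three quarters of the pixels, upscale it back, and the error never goes to zero because those pixels were simply never computed.

import numpy as np

# "native render": a high-frequency pattern with real per-pixel detail
def detailed_image(n=256):
    y, x = np.mgrid[0:n, 0:n]
    return 0.5 + 0.5 * np.sin(2.1 * x) * np.cos(1.9 * y)

full = detailed_image()
half = full[::2, ::2]                                          # "render" at half resolution instead
upscaled = np.repeat(np.repeat(half, 2, axis=0), 2, axis=1)    # nearest-neighbour upscale back to full size

print("mean error vs native render:", round(float(np.abs(upscaled - full).mean()), 3))
# the fine detail here is above the half-res Nyquist limit, so no upscaler,
# however clever, can reconstruct the rows and columns that were never rendered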

He just said it as it is: ray tracing is not for gamers, it is for developers, who will move to implement ray tracing in their games.

RTX is an amazing innovation that constructs a BVH of the entire rendered scene every frame and then allows rays to be cast against that BVH. This has been done in software for years but has never been fast enough; Nvidia finally took the leap and added this functionality to their hardware, and I hope they use more of their die space for ray-tracing-specific functionality in the future, because it will open up a whole new type of graphics.

So no, I don't expect you, as a stupid fuck who barely even knows what a ray is, to understand why it's important that every single graphics developer no longer has to roll their own BVH and raycast shader, and that someone in the industry is finally trying to standardize such a fundamental task as casting a ray against a triangle. AMD will add it soon too.

It's hardly Nvidia's fault that they have to try and explain stuff like this to fucking gamers; I don't even know why they try.
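
For the gamers in the thread, here's roughly what "rolling your own BVH" means, as a tiny Python toy (nothing like the actual RT core hardware or the DXR API, just the idea): a tree of bounding boxes, so a ray only gets tested against the handful of triangles whose boxes it actually passes through.

import numpy as np

class Node:
    # either an inner node with two children or a leaf holding triangles,
    # each with an axis-aligned bounding box (lo/hi corners)
    def __init__(self, lo, hi, tris=None, left=None, right=None):
        self.lo, self.hi = np.array(lo, float), np.array(hi, float)
        self.tris, self.left, self.right = tris, left, right

def hit_aabb(origin, direction, lo, hi):
    # classic slab test: find where the ray enters/leaves each pair of box planes
    inv = 1.0 / np.where(direction == 0, 1e-12, direction)
    t0, t1 = (lo - origin) * inv, (hi - origin) * inv
    tmin = np.minimum(t0, t1).max()
    tmax = np.maximum(t0, t1).min()
    return tmax >= max(tmin, 0.0)

def traverse(node, origin, direction, out):
    # skip entire subtrees whose box the ray never touches
    if node is None or not hit_aabb(origin, direction, node.lo, node.hi):
        return
    if node.tris is not None:
        out.extend(node.tris)   # leaf: these still need a real ray-triangle test
        return
    traverse(node.left, origin, direction, out)
    traverse(node.right, origin, direction, out)

# two leaf boxes under one root; a ray going straight up at x = -1 only ever
# touches the left box, so the right box's triangles are never even looked at
root = Node([-2, -1, -1], [2, 1, 1],
            left=Node([-2, -1, -1], [0, 1, 1], tris=["triA", "triB"]),
            right=Node([0, -1, -1], [2, 1, 1], tris=["triC"]))
candidates = []
traverse(root, np.array([-1.0, -5.0, 0.0]), np.array([0.0, 1.0, 0.0]), candidates)
print(candidates)   # ['triA', 'triB']

DXR/RTX basically bakes that build-and-traverse step into the driver and the hardware, so every engine doesn't have to write and debug its own compute-shader version of it.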

Indistinguishable is not the same as "perfectly recovered"; for example, most lossy compression produces indistinguishable results without perfectly recovering the input.

Yes, that is correct, but in Nvidia's case the raytracing simply does not produce enough information for any denoiser, regardless of how clever, to produce a result indistinguishable from a perfect one. In other words, until they do more samples per pixel in their raytracer, you won't have very good results.

See the sampling pic above: a clever traditional denoiser (not ML) should be able to work with the 6th image (2^5 = 32 samples per pixel) and produce a very good result. If you look at the 12th image, it's barely noisy at all (2^11 = 2048 SPP).

My point is that no matter how good the denoiser is, it simply cannot produce good results with only one sample per pixel, which is what Nvidia is currently doing. That is an unfixable problem since I get the impression that the 1 sample per pixel is a limitation they will never get past with 2xxx cards. Maybe in a few generations.

The Star Wars demo runs at 4K with imperceptible artifacts and movie-like quality on RTX (2080 Ti). How does Nvidia do this? Are they lying?

youtu.be/NXiV2kIsG0U

Yes, it's not "real" realtime raytracing like leather jacket man wants you to believe, it's just an approximation with a low number of samples and a denoiser with "deep learning", just buzzwords.

> it has to be brute force raytracing otherwise its not real raytracing

>1 SPP
What the fuck were you guys expecting? Real ray tracing uses hundreds or thousands of SPP to get realistic images.

but Jensen Huang said that it just works