Realtime raytracing is here

>realtime raytracing is here
>AMD is nowhere to be seen, absent, not even a single option
FUCK THIS

Attached: 1552890963953.jpg (1173x671, 143K)

>realtime raytracing is here
Where?

literally everywhere, download UE4 and see for yourself. Even broke indie devs have access to RT now.
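For reference, enabling it in UE4 (4.22+) is just a couple of project settings plus the D3D12 RHI and a DXR-capable driver. A minimal sketch of DefaultEngine.ini from memory, so treat the exact cvars as assumptions and check the current UE docs:

[/Script/Engine.RendererSettings]
r.RayTracing=True
r.SkinCache.CompileShaders=True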

Oh, it's nice. I'm so looking forward to running it on my 2200g.

Reminder that RTX is to real time raytracing what the Virtual Boy was to virtual reality

So Novidya creates extremely tessellated hairworks 2.0 so you HAVE to buy a new GPU again and you act surprised? OP is a homo.

kek

>hairworks
what does that have to do with anything you retard?
RTX is not the realtime raytracing tech, DXR is, and that is not vendor-specific

not on xbox, not on playstation, not on nintendo, not on laptops, not on phones, only on overpriced hardware-locked GPUs in a small sliver of computers.
You know everywhere.

>real time
>it's always a screenshot
every time

It's the same in that hairworks is also proprietary garbage that only one or two games are going to seriously use. Although hairworks is certainly shittier than this.

>what does that have to do with anything you retard?

AMD didn't do that shit either. For a reason. Do you even comprehend bro?

>It's the same in that hairworks is also proprietary
it isn't. realtime raytracing is a DX12 feature called DXR, made by microsoft.
the GTX 10-series cards have it as well, although not hardware-accelerated because there are no RT cores.
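Easy to verify: the capability query is plain D3D12, not an nvidia API. A minimal C++ sketch (assumes an already-created ID3D12Device; the helper name is mine):

#include <d3d12.h>

// Returns true if the driver exposes DXR at all. Turing reports it via RT cores;
// Pascal (GTX 10-series) can report it too through the driver fallback, it's just
// far slower because everything runs on the regular shader cores.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}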

AMD made their own hairworks alternative, you mongoloid. Also, realtime raytracing is not nvidia's thing

point taken, but until cheaper cards support it (hint hint, also AMD) it's going to stay a meme.

>but until cheaper cards support it (hint hint, also AMD)
yeah well that is exactly why I made the thread. AMD needs to fucking respond already. It's not rocket science to build some hardware acceleration for a specific task like this. I want to work on this, but I can't get myself to pay an arm and a leg for Nvidia's shit-tier hardware implementation.

>>realtime raytracing is here
It isn't. Nvidia's version of "raytracing" does like 2-3 samples per pixel and then uses a shitty AI to do an abhorrently terrible job of filling in the other 100,000-1,000,000 missing samples per pixel.

For example it looks nothing like pic related.

Attached: Glasses_800_edit.png (2048x1536, 2.9M)

Consoles.

you can't blame consoles for this when even pc has no good options.
t. tech illiterate
>nvidia's raytracing
no such thing

amd tressfx has been around for years now. it does the same shit hairworks did without the massive performance hit.

The point I'm trying to make is that Nvidia loves to add shit like this to force sales. The butthurt fanboyism ITT is far too stronk, so I'll just leave it with you.

That image does not have 1 million samples per pixel. A typical ray traced image uses under a thousand. No RTX application to date uses AI denoising. Low sample counts are useful for most effects as seen in every RTX game so far.

>"Ray tracing is expensive. Like really expensive and it gets progressively more expensive the higher the resolution of the resulting render and the number of shadow-casting lights and reflective/refractive surfaces. Even the most powerful graphics hardware will struggle to trace with more than 1 to 3 samples per pixel for a 1080p-frame at a rate of 30 to 60 times per second. There's only so many rays you can throw into the scene every second and you have to manage this budget really well, otherwise you'll end up with a useless noisy mess or a slide-show."

>"So as far as I see it, RTX (in case of real-time ray tracing) it's a sophisticated temporal point sample denoiser coupled with sparse (really sparse, like 1 SPP sparse) ray traced data designed to cut frame conversion time to values acceptable for real-time applications."

cgicoffee.com/blog/2018/03/what-is-nvidia-rtx-directx-dxr
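To put rough numbers on that quote, a quick back-of-envelope in C++ (my own illustration, not from the blog; real engines split the budget across effects):

#include <cstdio>

int main()
{
    // 1080p at 60 Hz with 1 sample per pixel, the kind of budget the blog describes.
    const long long width = 1920, height = 1080, spp = 1, fps = 60;
    const long long rays_per_frame  = width * height * spp; // ~2.07 million
    const long long rays_per_second = rays_per_frame * fps; // ~124 million
    std::printf("%lld rays/frame, %lld rays/second\n", rays_per_frame, rays_per_second);
    // An offline render at 1000+ SPP needs ~1000x more rays per frame,
    // which is why the real-time path leans on a denoiser instead.
    return 0;
}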

Point is, Nvidia's variant of ray tracing is by all subjective and objective metrics absolute DOGSHIT. Not only does it look nothing like actual 100K-1 mil SPP ray tracing, but even with a 2080ti you have to drop the resolution down to 1080p and still get less than 60fps, which ruins the entire experience every gamer expects. Combine that with the fact that 99% of users will get an RTX 2060 or weaker, and can you really blame them for not wanting to adopt such an assbackwards meme?

That one had over 100,000 samples per pixel and it's just what someone would see on a fucking countertop, let alone an entire room filled with furniture. Also YES IT DOES use denoising; 1 SPP without it results in so much loss of detail you might as well be blind.

Attached: 1543284106651.jpg (750x523, 33K)

Did you actually read that blog post? He comes away with a very positive impression of RTX as an incremental improvement on current real time graphics. Here's a simple fact. Control traces about 1 ray per pixel for reflections. The reflections look great. Are they distinguishable from reference? Yeah, probably. They're also substantially better than any combination of SSR and environment maps or even the very advanced and expensive SDF cone traced reflections in that game. It's a nice improvement that we would have to wait another 5-10 years for without hardware acceleration.

We'll get it when the next Console gen hits.
Because of course, MS is going to DEMAND DXR support for the APU in their next Xbox, and Sony is going to want Ray-Tracing as well for the PS5.
So AMD is going to have to bend over backwards to design, develop, test, and manufacture something that can do DXR/Ray-Tracing in the next gen of Console stuff, which is going to end up finding its way to dedicated GPUs.

The issue being, I don't think AMD was preparing for that tech. They were still experimenting too much between Vega, Fury, and now Navi, and they've got too much shit going on, and no way to bring it all together reasonably. What we'll probably see is some odd FPGA-like solution from AMD that won't do Ray-Tracing as well as nVidia's RT cores, but will be available to do compute tasks and more when it's not being used for ray-tracing, and AMD is going to tout how "FLEXIBLE AND COMPUTE POWERFUL OUR SOLUTION IS" and lord only knows how that shit is going to go. Maybe good, because then devs for consoles can use that FPGA for other shit in games if they don't need/want ray-casting, but at least PC devs will probably be able to flex those FPGAs into doing bizarre shit. Hardware emulation for perfect retro emulation, "AI" features like hardware-accelerated OCR/facial recognition/computer learning, better transcoding for streaming and shit, and more.

AMD already has a patent for a somewhat different way of doing RT in hardware. Hopefully it works out well. I think Microsoft in particular will be interested in having it.

It's nowhere near ready yet; nvidia had to learn this the hard way when they were practically FORCED to release 16XX Turing without the gay tracing cores because of how bad demand for RTX was. Nvidia jumped the gun and thought that people would suddenly be convinced they needed to desperately burn over a grand to play the latest cow-a-doody/fart night with slightly better than global voxel illumination at fucking 40 FPS at 1080p.

They won't. RTX was a disaster and they won't be able to acquire HW anywhere near that fast for even 1080p 40-60fps gay tracing for their consoles.

>RTX was a disaster and they won't be able to acquire HW anywhere near that fast for even 1080p 40-60fps gay tracing for their consoles.
Are you retarded? The RT cores added like 5% extra die area to the RTX cards over the non-RTX Turing 1660 Ti. Hardly a disaster.

The 16xx parts were not a reaction to the response to RTX. It takes longer than a few months to revise an architecture and manufacture working chips. It was planned from the beginning because obviously a card that slow doesn't benefit from ray tracing. The evidence I have for RTX being ready is that all the games using it already run at 60 FPS on most of the cards and produce good visual improvements. You might not agree that the graphical upgrade is worth the cost. That means you are not in the target market for these cards. People who care about graphics are. Here's another important point. RT cores and tensor cores only take up about 10% of the die area on Turing. If you assume a cost of $100 per die it's $10 for RTX. A $10 cost for 4x higher ray tracing and AI performance is a steal.

>they were practically FORCED to release 16XX Turing
You peabrained mongoloid. GPUs take upwards of 5 years to design. Their dies were planned at least 2 years in advance; they take months just to build up inventory and many months more to ship literal metric tons worth of product by surface shipping. The fact that you think nvidia can just randomly shit out GPUs in a few months shows how dumb and delusional you are. Idiot.

Not him, but it sliced deep into nvidia's profit margin when nobody wanted their 2080/2080ti cards and found that 1080p RTX performance on a 2060 was absolutely abysmal. The non-RTX Turing cards were going to be released anyway, but nvidia felt so sure RTX was going to have them rolling around in money that they only made 2 of them (1660s are failed 1660tis).

I think in the end even those who were dumb enough to get RTX cards realized what a failure the RTX meme was. This is why only a handful of games (mostly who?) even adopted this garbage technology.

So now you have unhappy customers who can't even use their meme RTX feature in the games they actually play, and game devs who refuse to adopt it. The damage is far worse than it seems on the surface.

Attached: 2019-04-16-image.png (1440x1000, 59K)

THE YIELD CURVE HAS INVERTED !!!!!!!!

If they sold Turing cards without RTX they would only be $10 cheaper. That's how much it costs them in manufacturing. No sales were lost based on $10. Some sales were made on the shiny RTX screenshots. It was a good business move. The reason Turing is not a large improvement on Pascal in raster performance is that the manufacturing process did not change so there were few improvements to be made.

Attached: cokedupkoreanspeech.png (1828x1740, 999K)

In my opinion it was just a scam that backfired on nvidia too much. Like you said, it was just 10% more die space on a $1,000+ card aimed at playing video games.

Graphics ignorants and AMD shills do not want you to know that this is what SSR and cube maps actually look like. They want you to believe that near perfect full scene reflections aren't a worthwhile improvement.

Attached: 19686-reflections2.jpg (1112x684, 65K)

>"that'll be $1,000+ tip"
>"on and you can only do this on 1080p and occasionally FPS will dip to 40"
Nah, missed me with that gay shit.

The chip on a $1000 card costs around $200 at most. That's $20 for a 4x improvement in ray tracing performance. It's $20 on a $1000 purchase for a technology never before seen in real time graphics. It's $20 for the first real innovation in graphics hardware since the compute shader.

If you don't want good graphics there's a wonderful card called an RX 570 and you're welcome to buy it.

>"JUST BUY IT!"

Except VXGI has been around for like a decade now, uses fewer resources, AND looks better than RTX. Face it, we all got clowned and punked by Nvidia. That's why I sold my RTX 2070 on ebay and got a Vega 56 instead. Yeah, I get 20% less FPS in some games but at least I don't feel HAD anymore.

Attached: vxgi.jpg (1000x589, 168K)

VXGI has never been implemented in a real game.

Exactly. RTX magically comes in to save the day except it turns out you need to buy a 2080ti to get 40-60 FPS at 1080p when VXGI global illumination can do the same on any GPU with better quality results.

Is it really any coincidence the only way to use it is through gimpworks? Nvidia knew what they were doing.

RIP innocent processor.

Yeah, that patent is why I think they'll end up with an FPGA-like solution. The hardware is going to be multi-role, but "intended" for Ray-Tracing support.

>RTX was a disaster
Look, I'm an AMD fanboy who bought an RX480 on launch day. But even I don't think RTX was a disaster, not even close. I don't think Ray-Tracing's re-re-re-introduction to gaming is going great, but the RTX series of cards themselves aren't bad.
That being said, I think AMD is working with MS/Sony to really, really crack down on real-time Ray-Tracing's inherent unoptimized state of existence, so their new consoles can have it. After all, they were the ones that spurred AMD into pioneering more and more advanced dynamic render scaling tech so home consoles could "run" 4K games. I think they'll definitely want to chase Ray-Tracing next gen, and they'll compromise on something that half-works, like they did with 4K render scaling, just so they can say that they have it.
Good devs will make it seem just as good as what PCs can do/better, since it'll be purpose-built for the hardware/software, but like always, PC tech will eventually "solve" the issue, then it'll get ported back to the "pro" versions of the next-gen consoles at some point for "full scene, high-volume Ray-Tracing" or something. With remakes of the first generation's games that had crappy Ray-Tracing, so they can make more money.

I'll go into the details on why no game has ever used VXGI. Voxel solutions are inherently limited by the fact that a linear increase in resolution or view distance multiplies memory footprint, trace time and voxelization time several times over; a dense voxel grid grows with the cube of its linear resolution. You can never use VXGI effectively in a large scene or one with a lot of dynamic geometry. The voxelization cost and memory use spiral out of control rapidly. You can never get a resolution better than about 10cm cubes in a practical scene. That means if you have a wall less than 10cm thick it will simply leak all the light through as if it was not there. Voxel cone tracing is not free. It is only moderately faster than hardware-accelerated ray tracing. Voxel cone tracing requires temporal accumulation or denoising just like ray tracing. In the end voxel cone tracing is usable for coarse GI and rough reflections, but only if you can tolerate or mitigate light leaks and don't care about coverage for dynamic objects.
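To put that scaling argument in concrete terms, a rough C++ sketch (scene size and bytes-per-voxel are made-up illustrative numbers; real implementations use clipmaps or sparse octrees, which soften but don't remove the cubic growth):

#include <cstdio>
#include <initializer_list>

int main()
{
    const double scene_extent_m  = 100.0; // hypothetical 100 m x 100 m x 100 m level
    const double bytes_per_voxel = 4.0;   // e.g. packed radiance + occupancy

    // Halving the voxel size multiplies the dense-grid footprint by ~8x.
    for (double voxel_cm : {10.0, 5.0, 2.5})
    {
        const double res   = scene_extent_m / (voxel_cm / 100.0); // voxels per axis
        const double bytes = res * res * res * bytes_per_voxel;
        std::printf("%.1f cm voxels -> %.0f^3 grid, ~%.0f GB\n",
                    voxel_cm, res, bytes / 1e9);
    }
    return 0;
}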