Is raytracing a meme? I can see the commercial need for raytracing and the genuine need for it (special effects, specialised uses such as flight simulators, etc), but it seems wasted in gaming. I seriously doubt gamers will give a fuck

Attached: raytracing.jpg (1280x720, 103K)

It will be good when everything is able to support it, instead of it being yet another TruForm/HairWorks.

Ray Tracing is the holy grail of lighting effects. We are on the cutting edge, the transition to a new standard has begun. People used to say the same thing about anti-aliasing in terms of performance hit, and now it's assumed to be there.

has anyone worked out what specs it would take to make graphics look like reality and avoid the uncanny valley?

it's a meme pushed by nvidia to sell their shitty cards with $10 asics slapped on

It's not a meme, but the technology is still too new. If you want to upgrade your card wait for the inevitable GTX 2080 release.

Buy an RTX in a few years.

it's just more realistic light and reflections, that's it


textures, draw distance and animations are far more important steps for "next gen" graphics

It depends on what type of gamer you are. Are you the competitive type that wants stable high fps and no eye candy to distract you from seeing what you need to see, or are you the 50-60fps G-Sync/FreeSync type that just wants the absolute max graphics and all the eye candy?

I'm a competitive fag, so all I'm seeing with gaytracing is fps loss. Even when I'm playing campaign, I prefer high, stable fps over all the pretty effects and eye candy.

Also, from a competitive fag perspective, I have more concerns with gaytracing and the future games that will support it: how will it affect frame times? Reflection bugs? Will game devs account for all the reflections, or do we have to beta test the game for a year after the "release" to fix all the broken/overpowered reflections that render common competitive fps tactics useless? etc.

>Ray Tracing is the holy grail of lighting effects
It's more like the end of the rainbow in how the closer you get to it, the further away it seems.
The closer we get to real time raytracing in levels of detail that were revolutionary 10 years ago, the more the same hardware allows for greater levels of detail with approximate lighting methods.
The approximate methods are getting so sophisticated these days that raytracing itself might just be a relic we'll grow past outside of the film industry

From a gaming standpoint, it sure is, at least for now.
Consoles can't do ray tracing very well and I doubt they'll do so even next generation. Right now consoles aim at 4K instead. And you have to admit, consoles are the ones pushing gaming forward and setting new standards. So I doubt anyone will be using ray tracing "properly" any time soon. If anything it's going to be an extra setting, similar to the different kinds of ambient occlusion.
Also the performance. It's just not good enough for mainstream gaming if Tomb Raider runs at 1080p with drops to 33fps on a 2080 Ti.
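To put those numbers in frame-time terms, here's a rough back-of-the-envelope conversion (the fps figures are just the ones quoted in this thread, not benchmarks of mine):
```cpp
#include <cstdio>

// Convert a frame rate to the per-frame render budget in milliseconds.
double frame_time_ms(double fps) { return 1000.0 / fps; }

int main() {
    // These fps figures are the ones quoted in the thread, not my benchmarks.
    std::printf("144 fps (competitive target): %4.1f ms per frame\n", frame_time_ms(144.0));
    std::printf(" 60 fps (typical vsync cap):  %4.1f ms per frame\n", frame_time_ms(60.0));
    std::printf(" 33 fps (reported RTX drops): %4.1f ms per frame\n", frame_time_ms(33.0));
    // ~6.9 ms vs ~30.3 ms: a ray traced frame can blow through four times the
    // budget a 144 Hz competitive player is working with.
    return 0;
}
```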

Someone just needs to release $300 GTX1070/1080 equivalent without ray tracing hardware for mainstream gamers.

tomshardware.co.uk/nvidia-rtx-gpus-worth-the-money,news-59055.html
Fuck you, buy the card

Attached: 886.png (600x707, 380K)

Except rasterization gets very close to raytraced results; it just takes much more work to achieve them, while with raytracing you just set the parameters and it works as intended most of the time.
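To give a feel for the "much more work" part: a rasterizer fakes even a simple flat mirror by reflecting the camera across the mirror plane and re-rendering the whole scene into a texture. A minimal sketch of just the reflection math (no graphics API, plane and camera values made up for illustration):
```cpp
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 scale(Vec3 v, double s) { return {v.x * s, v.y * s, v.z * s}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Reflect a point across the plane dot(n, p) = d, where n is a unit normal.
// A rasterizer has to do this to the camera, re-render the scene to a texture,
// and then sample that texture on the mirror surface, every single frame.
Vec3 reflect_point(Vec3 p, Vec3 n, double d) {
    double dist = dot(n, p) - d;          // signed distance to the plane
    return sub(p, scale(n, 2.0 * dist));  // mirror across the plane
}

int main() {
    Vec3 mirror_normal = {0.0, 1.0, 0.0}; // a horizontal mirror (e.g. a floor)
    double mirror_d = 0.0;                // plane y = 0
    Vec3 camera_pos = {1.0, 3.0, -5.0};   // made-up camera position

    Vec3 mirrored = reflect_point(camera_pos, mirror_normal, mirror_d);
    std::printf("mirrored camera: (%.1f, %.1f, %.1f)\n",
                mirrored.x, mirrored.y, mirrored.z);
    // -> (1.0, -3.0, -5.0). And this only handles one flat mirror; curved or
    // multiple reflectors need cubemaps, screen-space hacks, or extra passes.
    return 0;
}
```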

I'm willing to trade memetracing for more actual physics and polygons. I miss CellFactor and the Ageia-era tech, even if Ageia was just milking a gimmick.

Attached: nvidia.jpg (1024x785, 501K)

What is sarcasm.

How is G-Sync/FreeSync not a top priority for you if you're trying to be competitive, idiot?

Do you not know what real-time means?

>hardware raytracing
>$10 asics
lmfao you're clueless

???
Did you know about Gsync/Freesync fps range?

You're about as stupid as I'd expect from a 'competitive fag'. You know you don't need to use Ray Tracing, right? Just like how you turn everything to low when you play now. You provided literally 0 (zero) arguments against the technology itself.

>You know you don't need to use Ray Tracing, right? Just like how you turn everything to low when you play now. You provided literally 0 (zero) arguments against the technology itself.
The ray tracing hardware is the reason RTX GPUs cost this much. If you're just going to disable it, you're paying way more for what will most likely not be a massive performance jump even after 2 years.

The positive of ray tracing is that it's easy.
Want to create a mirror? Done.
Have a unique piece of furniture that needs to change colour based on the angle light hits it? Done.
The negative is that good quality real-time ray tracing isn't possible with current hardware or upcoming ones.
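For the mirror case, "done" really is about one line in a ray tracer: when a ray hits the mirror surface you reflect it and trace again. A toy CPU sketch (one mirrored sphere over a checkerboard floor, all values arbitrary; RT hardware just accelerates this same intersect-and-bounce idea):
```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };
Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 scale(Vec3 v, double s) { return {v.x * s, v.y * s, v.z * s}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { return scale(v, 1.0 / std::sqrt(dot(v, v))); }

// Scene: one mirrored sphere floating above a checkerboard floor at y = 0.
const Vec3 kSphereCenter = {0.0, 1.0, 3.0};
const double kSphereRadius = 1.0;

// Trace a ray and return a grey value in [0, 1].
double trace(Vec3 origin, Vec3 dir, int depth) {
    // Ray-sphere intersection (dir is unit length, so a = 1).
    Vec3 oc = sub(origin, kSphereCenter);
    double b = dot(oc, dir);
    double c = dot(oc, oc) - kSphereRadius * kSphereRadius;
    double disc = b * b - c;
    if (disc > 0.0) {
        double t = -b - std::sqrt(disc);
        if (t > 1e-4 && depth > 0) {
            // Mirror: reflect the ray about the surface normal and keep going.
            // This bounce is the whole "want a mirror? done" argument.
            Vec3 hit = add(origin, scale(dir, t));
            Vec3 n = normalize(sub(hit, kSphereCenter));
            Vec3 refl = sub(dir, scale(n, 2.0 * dot(dir, n)));
            return 0.9 * trace(hit, refl, depth - 1);
        }
    }
    // Floor at y = 0: checkerboard, so reflections are visible in the output.
    if (dir.y < 0.0) {
        double t = -origin.y / dir.y;
        Vec3 hit = add(origin, scale(dir, t));
        int check = (int)std::floor(hit.x) + (int)std::floor(hit.z);
        return (check & 1) ? 0.2 : 0.8;
    }
    return 0.5; // sky
}

int main() {
    const int w = 60, h = 30;
    const char* shades = " .:-=+*#%@";
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            Vec3 dir = normalize({(x - w / 2) / (double)h,
                                  (h / 2 - y) / (double)h, 1.0});
            double v = trace({0.0, 1.0, -1.0}, dir, 4);
            std::putchar(shades[(int)(v * 9.0)]);
        }
        std::putchar('\n');
    }
    return 0;
}
```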

Realistically, whether raytracing is widely adopted in games or not (remember that the consoles all use AMD tech which is incompatible with Nvidia's proprietary raytracing) the 20xx generation is almost certainly one to skip. The tech is still new, buggy, and resource inefficient. 7nm gpus will be here in a year and they will do everything the 20xx cards do and will likely do it much better. By then raytracing will have had time to mature in games and it will be clear if it's worth the investment or not.

> Is raytracing a meme?
Ray Tracing? More like Gay Tracing.

Is ray tracing Nvidia proprietary technology like PhysX, or can we expect it to be widely used?

If you have to ask, you are naive. Everything Nvidia does is proprietary.

In that case you can't expect it to become anything big. Pretty sure consoles are gonna stick with AMD, and that weighs a fuckton as far as where game devs are going to put their resources.

Currently? Yes. A 14 TFLOP GTX ASIC RISC AI meme can barely run reflections at 2K, let alone shadows, lighting and AO plus reflections. I can't see these first-gen cards doing it all, only parts. They also haven't explained how the new AA works with older games. Oh, that's right, it doesn't.

DLSS will have to be implemented through patch.

>Believing ASICs can't process RT
>Believing chips aren't dirt cheap to manufacture
Go be a r*dditor somewhere else.

So it's just the AI meme, temporal with moar samples.
I wonder if it's low latency and doesn't add any additional frame time to the render.
Also, when not using the RTX crap, does that free up the RTX part of the card to do other stuff?

This
7nm and 5-3nm is as small as we get before we start looking at other ways to get massive performance jumps with monolithic chips.

>what is input lag?
>what is gay-sync / fag-sync frame range?
>what is frame buffer?

>you didn't make any argument against gay-tracing so you're stupid for not making any argument against gay-tracing
>buy overpriced gay-tracing gpu that is overpriced because it is a gay-tracing gpu so you can turn off gay-tracing

Attached: 1533586683018.png (293x326, 36K)

Have people already forgotten that Nvidia grossly overestimated mining GPU demand? What better way to get rid of surplus 12nm chips before 7nm comes out than to slap an ASIC on them and promote another PhysX gimmick?

The RTX cards are good for regular rasterization if only because they use GDDR6 which has the same throughput as HBM2 at the cost of causing house fires.
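Rough numbers behind the "same throughput" claim: bandwidth is roughly bus width times per-pin data rate. The cards below are my own examples and the figures are approximate, from memory, not spec sheets:
```cpp
#include <cstdio>

// Memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
double bandwidth_gbs(int bus_bits, double gbps_per_pin) {
    return bus_bits / 8.0 * gbps_per_pin;
}

int main() {
    // Approximate figures from memory, not official spec sheets.
    std::printf("RTX 2080 Ti, GDDR6 (352-bit @ 14 Gbps): ~%.0f GB/s\n",
                bandwidth_gbs(352, 14.0));
    std::printf("Vega 64, HBM2   (2048-bit @ ~1.9 Gbps): ~%.0f GB/s\n",
                bandwidth_gbs(2048, 1.89));
    // GDDR6 gets there with a narrow bus clocked very high; HBM2 with a very
    // wide bus clocked low, which is also why HBM2 draws less power per GB/s.
    return 0;
}
```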

I think raytracing will start killing off regular rasterization by the time we get to HBM3+/HBM4, which will quadruple memory throughput over HBM2. We'll have photonic interconnects/interposers on the die by that point. Raytracing has better memory access patterns that will scale up faster than rasterization can keep up with on future iterations of hardware.

Attached: 989c22c0b3a008d8df419817989160bae3d25247f12687121bf8fb0c970d6f64.gif (300x202, 1.13M)

Nip girls are so cute.

Liar, you can't see her nips

Right now it's a meme yes. Anyone who buys one of these overpriced shitboxes now is a complete bugman. They are shilling to gamers and if gamers know anything it's that developers jump on the latest trends without fully understanding what they're getting into so we have at least a year of shitty performance from raytracing regardless of the hardware. These pieces of shit still can't even get Gameworks to work well.

>at the cost of causing house fires.
It uses less power than GDDR5, what are you on about?