Why do so many people shit on raytracing as a feature and denounce it as nothing but a meme, a stupid fad?
It's genuinely amazing to me that consumer electronics now have tech like this and can even produce very playable framerates in games with it on.

Attached: 519064-nvidia-geforce-rtx-2080-ti-founders-edition.jpg (810x456, 87K)

There's nothing wrong with the tech itself, just how it was marketed and at what price.

The industry was pushing higher resolutions, refresh rates and high-fidelity graphics. Now you gotta go back to 1080p 60fps with dips, in exchange for fuzzy reflections or more correct lighting and shadows.

Why?

So they can sell you a 1440p60 and 4k60 capable card again 3 and 5 years down the road.

the only amazing thing about RTX is how much Huang charges for it, the features are not there - neither in hardware nor in games
maybe in 2 years we'll get there

Price, """very""" playable framerates and implementation are questionable.
Little to no improvement over previous-gen GPUs, while being overpriced. Doesn't seem to run well on non-high-end cards. And why the fuck do people want to see realistic reflections in a 64-player multiplayer game?
Since consoles will jump on that train soon, this tech won't end up being a meme like PhysX, I guess.

I suspect it's the newest cutting-edge HI-TECH to sell to people. I think most gamers caught on that they don't need the most up-to-date graphics card to play their games at their monitor's native resolution, so they had to invent a new fad. I doubt any big vidya company was pushing for this. This isn't the golden days of Carmack, when Doom/Quake was driving gfx card development.

>Why do so many people shit on raytracing as a feature
Because AMD is shit at it.

>shit on raytracing
how many rays are metro and BF using?

/thread

>Why do so many people shit on raytracing as a feature and denounce it as nothing but a meme, a stupid fad?

because that's what it is currently. Realistic reflections aren't there, let alone photo-quality textures. GTAV photoreal mods look more realistic than raytracing lmao

Attached: darker image for half perfomance.jpg (2765x851, 1.22M)

Ray tracing was amazing when the redditfags were all pretending that they could do it on their amd gpus, last year or so.

What? The photo you linked looks pretty damn good.

It's a fat dip in fps, but I was able to play Metro Exodus with raytracing on high and push 60+ fps the entire game with a 2080.

>Why do so many people shit on raytracing as a feature and denounce it as nothing but a meme, a stupid fad?
cope

Attached: 1489985053956.png (1228x1502, 1.07M)

The problem with most of the raytracing implementations is that the devs don't disable the other effects, and the textures usually have some baked illumination on them.

That's why it still looks kind of off.

To make RT work, you need some very flat textures.

It's not shitting on raytracing, it's shitting on its implementation, its pricing, and how it fucked nvidia's top lineup's price/performance

I hope lisa su sees this photo

Man this raytraced gta pic looks really good! Oh shit wait no this is prebaked lighting that doesn't chop performance in half, *and it looks better* oh shit

Attached: raytracing!.jpg (1920x1080, 837K)

2 years is way too short
My prediction is 2 entire generations before it becomes standard

as much as I hate the pricing, I must admit nvidia killed console parity with RTX
they increased graphical fidelity without harming development cycles and made PC ports look better than consoles for the first time in 10 years
it's not FUD, it's real tech, just a bit raw. give it a couple of gens and it will be as common as HBAO. remember when AO took half of your performance? I remember.

They shit on it because it's not ray tracing.

Remember when we weren't getting charged extra for AO, tessellation or PhysX? I remember.

>Remember when we weren't getting charged extra for AO, tessellation or PhysX
No, can't say that I remember.

Because it's falsely advertised, overpriced and poorly implemented.

I don't. It started with Pascal. And seeing how Nvidia is losing revenue, it will stop soon.
Navi is going to flop hard, so AMD may come to their senses as well.

>Navi is going to flop hard, so AMD may come to their senses as well.
What do you mean? AMD knows that they can charge whatever they want as long as it's not as much as nvidia.

Was that at 1080p? If yes, then I might laugh.

And if not, then Intel will come and Conroe their greedy asses.

Leaks have said they are going to charge as much as Nvidia. You can see it in their Ryzen 2 pricing: they are not competing for marketshare, ever again. Lisa wants high margin low volume for profits.

I'm saying the same thing, like 2025 or so before we really see a decline in rasterization.

>Lisa wants high margin low volume for profits.
I remember the last time amd tried that.

It's not a feature, it's a collection of algorithms that has been around since the 60's. It's not something exclusive to Nvidia hardware.
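For the curious, the core of it really is old and simple. A toy Python sketch of the classic ray-sphere intersection at the heart of every ray tracer (purely illustrative, not anyone's actual implementation):

```python
import math

def ray_sphere(origin, direction, center, radius):
    # Classic ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
    # direction is assumed normalized, so the quadratic's 'a' term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None        # nearest hit in front of the ray

# A ray fired down +z hits a unit sphere centered 5 units away at t = 4.
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```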

It's literally impossible for any lighting to look better than Ray Tracing, because Ray Tracing is how light behaves irl

Wrong. They need more samples. RTX currently does only 1 sample per pixel with no bounces, then uses temporal filtering to denoise. Ray tracing only starts looking good at around 12 samples.
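Rough illustration of why 1 spp needs aggressive denoising: the error of a Monte Carlo estimate shrinks like 1/sqrt(samples). Toy Python sketch, where the shade function is a made-up stand-in, not any real renderer:

```python
import random, statistics

def shade_sample():
    # Stand-in for one stochastic ray-traced sample of a pixel:
    # true radiance 0.5 plus heavy per-sample noise (made-up numbers).
    return 0.5 + random.uniform(-0.5, 0.5)

def render_pixel(spp):
    # Average spp independent samples; the standard error shrinks
    # roughly as 1/sqrt(spp), hence the need for temporal denoising at 1 spp.
    return sum(shade_sample() for _ in range(spp)) / spp

for spp in (1, 12, 256):
    runs = [render_pixel(spp) for _ in range(1000)]
    print(spp, "spp -> stddev", round(statistics.stdev(runs), 4))
```

Running it shows the noise dropping from about 0.29 at 1 spp to about 0.08 at 12 spp, which lines up with the "starts looking good around 12 samples" point.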

Most people are idiots. Some of the not-idiots are trolls.

FX-9590?

Ray tracing casts rays from the viewpoint. Unless your eyes shoot lasers, this is not how light behaves in real life.
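Concretely, "casts from the viewpoint" means generating one ray per pixel starting at the camera. A toy Python sketch of that step (pinhole camera assumed, all parameters illustrative):

```python
import math

def camera_ray(px, py, width, height, fov_deg=90.0):
    # Map pixel (px, py) to a normalized ray direction leaving the eye,
    # assuming a pinhole camera looking down +z.
    aspect = width / height
    scale = math.tan(math.radians(fov_deg) / 2.0)
    x = (2.0 * (px + 0.5) / width - 1.0) * aspect * scale
    y = (1.0 - 2.0 * (py + 0.5) / height) * scale
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)

# Center pixel of a 1080p frame -> ray pointing almost straight ahead.
print(camera_ray(960, 540, 1920, 1080))
```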

Efficient hardware acceleration of them is exclusive to Nvidia, though.
Algorithms that can render shit overnight are one thing. Doing it in real time is completely different.

Well, that's one. AMD always hikes prices as much as they can whenever they think they have something the market will perceive as "better".

>I'm saying the same thing, like 2025 or so before we really see a decline in rasterization.
The problem is RTX cards are still raster-based, and the hybrid raster rendering with some raytraced effects sucks. You're better off waiting until you can abandon rasterization altogether. Nvidia probably hates the idea of pure raytraced engines, as their patents and tech are all raster-focused. I'm not even sure traditional GPU hardware is optimal for raytracing; conventional CPUs are very competitive per watt at raytracing, and GPU rendering became popular largely due to the lack of cheap high-core-count CPUs.

I don't really consider that a good thing; hardware should be implementation-agnostic. There are many other global illumination algorithms besides ray tracing, but of course people haven't heard of them because Nvidia hasn't marketed them.

>meme like PhysX,
I'm still mad. Ray tracing is the second-biggest possible improvement.
A common, affordable physics card can radically change gameplay, not just the looks.

Depends on the complexity of the scene. Minecraft unironically looks great.

Yeah, but early raytracing is only a small part of a bigger picture; current rendering quality isn't good enough yet for accurate light to matter. Not to mention that even with acceleration it cuts performance in half.

The current rendering methods for light are more than good enough. The real problem is that models and textures aren't photorealistic yet. LODs aren't there yet

who cares about memetracing lmao

For the price they were charging for the 2080 Ti, they should have shipped it with 4 stacks of HBM2 as well

>AMD knows that they can charge whatever they want
yeah, that worked great for them: from 40% market share down to 15%

at 1080p even a 1070 can run it at 35fps

fury, vega, 295x
all failures.

I am.

They had to do raytracing because in 10-15 years' time, raster performance would be more than sufficient for 99% of the population and integrated graphics would kill the dGPU vendors. Ray tracing raises the bar for 3d graphics and prevents the risk of integration for another few decades.

I do. After the recent announcements of CP77, WD Legion (which also means AC in 2020 will have it) and Control having it, I even considered a 2080 as a purchase.

>I'm still mad. Ray tracing is the second-biggest possible improvement.
>A common, affordable physics card can radically change gameplay, not just the looks.
youtube.com/watch?v=ZiIfSCuwiQw
So much promise and potential...

user, light rays are reversible

No, it's the fact that Nvidia locked it behind RTX for the 2000 series launch and has now unlocked it for every GPU they have, even GTX.
Meanwhile other GPUs, like AMD's, are also able to do it.
The special cores on the RTX GPUs only give a minuscule benefit for raytracing; it's shit because it's Nvidia tricking its own consumers into paying.

Nobody shits on raytracing otherwise. We've had Quake 2 with path tracing for a little over a decade now; the only stupid ones are the ones that get hyped about it now like it's something new.

Attached: attentiongrabbingimage.jpg (979x1099, 151K)

It is, if done on a huge supercomputer taking hours per specific viewpoint and then baked into the game, instead of in real time on a measly few billion transistors.
Huge difference, and only one of the two is actually viable.

I was wondering why they don't stop.

it's not even real ray tracing

>The current rendering methods for light are more than good enough. The real problem is that models and textures aren't photorealistic yet. LODs aren't there yet
Most of those issues are related to raster graphics themselves; raytraced renders don't care that much about polycount as long as it all fits in RAM. Many simple materials can forgo textures and UV maps altogether; the shader alone is enough when raytraced.
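For example, a raytraced material can just be a function evaluated at the hit point, no texture image or UV unwrap required. A toy Python sketch (hypothetical checker shader, purely illustrative):

```python
import math

def checker_shader(hit_point, scale=1.0):
    # Hypothetical procedural material: a 3D checker pattern evaluated
    # directly at the world-space hit position. No texture image, no UV
    # unwrap; the tracer just calls the shader wherever a ray lands.
    x, y, z = (math.floor(c / scale) for c in hit_point)
    return (0.9, 0.9, 0.9) if (x + y + z) % 2 == 0 else (0.1, 0.1, 0.1)

print(checker_shader((0.2, 1.7, 3.4)))  # colour at one ray-hit position
```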

>A common, affordable physics card can radically change gameplay, not just the looks.
But PhysX is still a thing. Pretty much every major game engine uses it for physics.
Developers just aren't bothering with all of the fancy effects nowadays.

>They had to do raytracing because in 10-15 years' time, raster performance would be more than sufficient for 99% of the population and integrated graphics would kill the dGPU vendors. Ray tracing raises the bar for 3d graphics and prevents the risk of integration for another few decades.
Ironically, their hybrid raster/raytrace system is mostly making people think raytracing is shit because RTX is vastly inferior to fully raytraced renders. It might end up backfiring.

Rape Tracing is butt fucking amd.

>But PhysX is still a thing
Ubiquitous affordable physics acceleration isn't. It doesn't matter what it's called. Therefore all we get from this is cosmetics, which can be safely turned off. It's like breaking into Fort Knox to steal a pen.
It could be much, much more. Non-scripted destructible shit is the most obvious, but far, far from the only thing.
But it must work on all the platforms, including relatively low-end AMD hardware. No sane developer will do it otherwise.

Lots of games use PhysX for physics.

actually AMD should be better at raytracing just because AMD cards are better at math
people really think Navi is going to be exactly how AMD advertises it, but we all know how Vega 56/64 turned out: a huge disappointment with a high price tag
the refreshed nvidia cards are going to make the 5700 XT compete against the 2060 Super and not the 2070

>actually AMD should be better at raytracing just because AMD cards are better at math
Your claim doesn't reflect real life.

Because we don't have the power to run it anything like properly, and won't have for several more generations of cards at least. Nobody is shitting on ray tracing as a concept. People are shitting on Nvidia's $1200 flagship product that can barely run a rough approximation of ray tracing, that often looks worse than the game's baked lighting, which they're selling to people right now.

>Non-scripted destructible shit is the most obvious
It's still possible to do it. I believe UE even has some tools for destructible stuff. And PhysX can work on the CPU just fine. I think it's mostly just devs that don't want to bother with it.

The only game where it's had a 'mind blowing' effect is Minecraft

Attached: 1531931358528.jpg (1192x670, 99K)

en.wikipedia.org/wiki/List_of_games_with_hardware-accelerated_PhysX_support
Small pool, and it's mostly cosmetics.
At what cost? Not everyone has a high-end processor, and that's assuming the workload is close to embarrassingly parallel, which I suspect it isn't.

because they do not have great products, only half-decent ones
some marketing analyst told them the same shit Jow Forums repeats ("cheap brands" yadayada), not realizing half-decent products should be priced accordingly

Red Faction already did it. It's a boring game; destruction physics do not make a game better.

>cope
except AYMD is gonna implement a non-HW-based solution, as per the Crytek demo and the next-gen console specs, much superior to Novideo's

>Small pool, and it's mostly cosmetics
That's GPU-accelerated PhysX.
You're being dumb, user.

en.m.wikipedia.org/wiki/PhysX#PhysX_in_video_games
>PhysX technology is used by game engines such as Unreal Engine (version 3 onwards), Unity, Gamebryo, Vision (version 6 onwards), Instinct Engine,[28] Panda3D, Diesel, Torque, HeroEngine and BigWorld.[19]

Looked up a video of 4K raytraced Minecraft, and certain shots really were very impressive. There was one of some buildings by a river where it looked more realistic than most photorealistic games released today, despite the Minecraft geometry.

Can RTX be used on game emulators?
Like modding old games with real-time ray tracing?

geforce.com/hardware/technology/physx/games
Ok, provide me an extended list, then.
So? My whole point is that only a batshit-insane developer will use it as an essential gameplay building block.
It will be something which can easily be turned off, like the hair, a cloak or some realistically falling jars.
It's never emergent gameplay stuff like "put some explosives near the lake to flood an enemy base at the right moment".

>Ok, provide me an extended list, then
Here
>As one of the handful of major physics engines, it is used in many games, such as The Witcher 3: Wild Hunt, Warframe, Killing Floor 2, Fallout 4, Batman: Arkham Knight, Borderlands 2, etc. Most of these games use the CPU to process the physics simulations.
Also the Arma series, DayZ, and many bus titles.
>So? My whole point is that only a batshit-insane developer will use it as an essential gameplay building block.
No. You're confusing (because you're dumb as fuck) the PhysX physics engine, which is a major building block, with the hardware-accelerated eye candy.
Many games use PhysX for physics as part of the engine, but don't necessarily implement the optional closed-source hardware-accelerated eye candy.
Just admit that you misspoke, and don't try to backtrack now.

you know what Nvidia should do is release a "pure RTX" card, which connects to your 10-series or 20-series card via the SLI header and works like an accelerator (kinda like early PhysX)

it would allow for a massive RT core that could easily do raytracing in modern games at proper framerates
even when just paired with a 1070

Attached: card-top.jpg (600x549, 78K)

And it would cost... ?

no more than $200
I'd honestly even consider it, especially if they manage to pair them with RTX cards in the future

The Fury non-X competed pretty well vs the 980. A Vega 56 with Samsung memory is a 64 with less OC headroom. Dual-GPU cards needed devs to implement tech which allows both cards' VRAM to be utilised (it's apparently possible, as per an AMD quote)

An RT core would need its own cache and memory and a wide bus, limiting the minimum size of the chip. Moving that much data to another card would pull more power, and the latency would kill performance. Putting the RT hardware on the same chip as the GPU is the only real solution for performance and efficiency.
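Back-of-envelope in Python (every number below is a rough assumption for illustration, not a measured spec):

```python
# Rough feasibility numbers for a hypothetical off-chip RT accelerator.
width, height, fps = 1920, 1080, 60
rays_per_pixel     = 2    # say, one primary ray + one shadow ray
bytes_per_ray      = 32   # packed origin + direction + payload (assumed)
bytes_per_hit      = 16   # hit distance, primitive id, etc. (assumed)

rays_per_sec = width * height * fps * rays_per_pixel
traffic_gbs  = rays_per_sec * (bytes_per_ray + bytes_per_hit) / 1e9
print(f"{traffic_gbs:.1f} GB/s of ray traffic")   # ~11.9 GB/s

# An SLI bridge moves on the order of a few GB/s, and PCIe 3.0 x16 tops
# out near ~16 GB/s one way; you'd be near saturating the link before
# even touching BVH data, and round-trip latency hurts more than bandwidth.
```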

Because every benefit that nvidia mentioned was already something that existed over 10 years ago. Reflections are an old and mastered art that nvidia wants to sell as a completely new idea.

>The Fury non-X competed pretty well vs the 980
nah, nobody bought it but the crypto cult
it's only competing when both products sell at least enough to notice; what RTG is doing is scraping by.
the rx480 was the only success they had in recent years.

>Reflections are an old and mastered art
imagine being this clueless

I would consider it if it did more than just RTX.
If it had other basic accelerators, then maybe.

name one game with accurate full scene reflections

I've seen a couple of examples of that, but they weren't really worth doing: Quake with ray tracing drops from 1000 fps to below 60 and then looks nothing like it was supposed to, and isn't really any better, just different at best.

at 1080p 1070 can barely get to 20fps with rtx lmao

youtube.com/watch?v=ZRAq9UefCNk
works, not ideal, but it works

An RTX 2080 can barely push 60fps at 1080p in Metro Exodus with RTX on.

Because it's just another marketing-led solution to "oh fuck, we're hitting a limit on performance increases without spending billions on new designs every two years, let's invent a new way to milk the epeening idiots out there for huge sums of money"

1440p 60

Attached: 7_metro-exodus-bench-rtx-1440p-ultra.png (759x545, 24K)

pretty pathetic if you ask me

The 1080p 60 guy was off by 80%. People love to lie about RTX.

Attached: 6_metro-exodus-bench-rtx-1080p-ultra.png (759x545, 24K)

Attached: 1559891305943.png (1200x1841, 1.74M)

Because it is. The people buying this shit are the worst mindless consumer whores we could possibly have. They fed into the hype and not only paid the crypto-inflation tax nvidia has kept on their cards, they also paid for tech that does almost nothing in exchange for massive performance drops. They forget nvidia did the same shit with tessellation a few years ago. Reasonable people couldn't give a shit about some reflection in a puddle or some glowing fog being added to ancient games that looks like shit. It's a tired cycle of bullshit and I'm sick of the retards that gulp it up.

Still can't reach 144fps at 1080p with your $700 card lmao.

>No. You're confusing (because you're dumb as fuck) the PhysX physics engine, which is a major building block, with the hardware-accelerated eye candy.
I'm talking about computationally heavy usage. Right now it's limited by the CPU, which isn't particularly good at it. Acceleration could've solved this problem, but alas.
You can do the cool physics-related stuff with a modern engine; it's just that performance falls like a brick and compromises the game loop, which effectively restricts it to niche projects.
So computationally heavy physics-related tasks are effectively restricted to eye candy which can be turned off if necessary.