Why doesn't anyone here care about real time ray tracing? It seems like a big deal.

Attached: 1*JLaJ1LILYAq4uXwiumKL3Q.png (1600x1202, 2.3M)

That's where you're wrong kiddo.

oh :(

Because the ""RTX"" card can't really do it?

Why would people care? It tanks your framerate and doesn't look good enough to justify leaving it enabled. That's why raytracing has always been for pre-rendered movies, not real-time video games.

there is nothing you can't sell to gamerfags, is there?

isn't the whole point of these ray tracing cores that they can now do it in real time without being costly?

Because it was marketing BS that backfired horribly

It's a feature limited to one visual style of gaming, ultra realism, and only a very small percentage of games go for that look.

but nvidia can't into raytracing

Because

There's still a hefty performance cost, just not slideshow tier. The main benefit of the Turing cores is the deep learning potential.

Is AMD working on an answer to this?

I don't think they have the money to do so.

Why would they? It's a gimmick just like hairworks. It's almost the first thing people turn off because it looks weird or destroys your framerate.

If this premature shit hadn't been forced down our throats just so Nvidia could sell overpriced and defective turds, RT wouldn't actually be so bad; within a few years it will replace the current rasterization approach anyway.

Though I'm sure buyers don't care about their missing frames, because they're dumb enough to run DLSS; might as well smear vaseline on your screen.

RT is the transition to OCL.

Attached: nvidia-geforce-rtx-issue1.jpg (700x398, 97K)

Long time ago.
amd.com/en/technologies/radeon-rays

I really want to buy the 2080 but I have no reason to.

because it's a marketing term?

In its current form, it somehow manages to make shit look worse AND tank the framerates.
I mean, if you have a grand to blow on an absolute fucking gimmick, then knock yourself out, but we peons would rather wait until the tech matures enough to be actually viable without having to sell your kidneys to make shit look marginally better in the best of cases.

Attached: 1514915488007.jpg (639x477, 89K)

>amd.com/en/technologies/radeon-rays
because rtx is a sad gimmick. even amd does it better with their vega cards.
youtu.be/yxrbyG4Xtfs

No, but Wendell might try to patch a workaround that might work with amd hardware.

sounds like AA on the geforce2
or T&L

here's the thing about ray tracing

you need THOUSANDS of cores specifically dedicated to it to make it actually notably different from high-quality rasterization

and the rtx cards have fewer than 40 cores dedicated to ray-tracing each, right?

Do you see the problem with this fledgling technology? And how far shaders, rasterization and anti-aliasing have come in recent years?

It was a gamble on nvidia's part to take off down a road that no one was actively asking for, and they failed to deliver the promised product at a palatable framerate performing noticeably better than conventional rasterization.

They fucked up, and hopefully this will be developed into workable hardware on the side, but not this generation. Probably not for a couple of generations. We're talking leaps in computing power we've never seen before to make this shit viable.
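Rough napkin math on the gap, if you want it. The samples-per-pixel figures below are illustrative assumptions, not measured numbers; the only vendor figure is Nvidia's ~10 Gigarays/s marketing claim for the big Turing cards:

```python
# Napkin math: ray budget for "film quality" vs real time at 1080p.
# SPP figures are assumptions for illustration, not vendor specs.
WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT              # 2,073,600 pixels

TARGET_FPS = 60
SPP_OFFLINE = 1000                   # offline renderers: hundreds+ samples/pixel
SPP_REALTIME = 1                     # RTX-style: ~1 sample, then denoise

rays_offline = PIXELS * SPP_OFFLINE * TARGET_FPS
rays_realtime = PIXELS * SPP_REALTIME * TARGET_FPS

print(f"film quality @ 60fps: {rays_offline / 1e9:.1f} Gigarays/s")   # ~124.4
print(f"1 spp @ 60fps:        {rays_realtime / 1e9:.2f} Gigarays/s")  # ~0.12
# Nvidia's marketing figure for top Turing cards is ~10 Gigarays/s:
# enough for a couple of noisy samples per pixel, orders of magnitude
# short of offline quality. Hence "leaps in computing power".
```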

The only card with Real time Ray Tracing is the GV100 Quadro.

RTX cards don't have "ray tracing cores". It's literally a branding lie by Nvidia.

The GV100 gets like 2 frames per day in real time ray tracing...

What software utilises this card? 3ds max with vray?

I just want one for that sweet GDDR6 bros

>It seems like a big deal.
It isn't

>2 frames per day
>in real time

Attached: 1486915223795.png (499x338, 38K)

Because it's technology related, and this is Jow Forums, a board dedicated to shilling apple and fussing over how many kb of ram your neofetch shows for the new desktop thread.

>part to take off down a road that no one was actively asking for
I fucking god damn hate when any fucking motherfucker makes this absolutely SHIT argument that NOBODY was actively asking for it, or that nobody was asking for something that has since been delivered. That's not how innovation works, you god damn communist. It has been proven time and god damn time again that what consumers are asking for has nothing to do with innovation. Consumers were asking for faster horses, not a fucking car. Consumers were asking for faster ships, not a fucking plane. No consumer was asking for a personal computer.

The worst part is that your pea sized brain understands that real time ray tracing is an innovative technology and the way of the future, but that doesn't stop you from making this shit argument. Surely you god damn realize that there's only so much we can do with rasterization. Stop making this fucking argument that people are not asking for something, you motherfucker. People asking for something is the complete opposite of innovation.

Problem with RTX cards is that they're prone to breaking; buying one is not a good idea.

Yes, that was the "point", but the reality is it still only delivers 45fps in the best scenarios and even lower on average, often tanking below 30. Sure it looks marginally better, but we've cheated lighting so well with less intensive methods that the difference is only really noticeable on reflective objects, or in game development where the developers would otherwise be too incompetent to properly place lighting.

The main reason Nvidia is pushing it so hard is because they know that low-mid range cards will soon be powerful enough to not warrant any more upgrades for 1080-1440p gaming (eg. Navi). Enter the Raytracing meme which tanks performance for slight visual improvement. Nvidia can keep selling RTX cards for a lot of shekels and gamers will be happy to buy them because "it's the future durr". If you buy an RTX card, you're the goodest of goyim.

>It's a gimmick
It allows for things like global reflections that you really can't have otherwise (not without exponentially more horsepower using conventional rendering, anyway).

I guess this is the first time a lot of people have seen the Nvidia Cycle.

>Hype up this new cutting edge feature
>New cutting edge feature makes every game run like shit
>Nerds declare new feature worthless, make endless posts about how bad it is and how they never turn it on
>2nd gen cards with said feature come out, and run it way better
>Feature becomes a "benchmark", insecure builders don't think their PCs are worth shit unless they can run the feature well.
>High end cards that do it well start flying off the shelves as gamers eagerly upgrade for the "Nvidia experience"
>Cut to a few years later
>Feature is considered ubiquitous. Nobody bothers to turn it off. Nvidia announces some new bullshit and the process starts all over again.

Physx, Tessellation, HairWorks, HBAO, they always do this. It's part of their marketing cycle. RTX is cool, but I won't bother getting hyped for it until it trickles down to midrange cards.

Attached: 1364438839686.jpg (456x297, 24K)

And yet the 2000 series is a complete failure and Nvidia stock is not even half of what it was worth at its release.

Yes, one day ray tracing will come, but today is certainly not that day, and it's not going to come from a card with a huge chunk of die space dedicated to nothing but ray tracing that takes so long to process that it becomes the sole bottleneck on frame times while the card waits for it to complete.

>without exponentially more horsepower using conventional rendering anyway
Ahh yes, RTX sure fixes that problem by going from 200fps to barely hitting 60.

As long as game developers don't optimise their games, new and better hardware will be needed.
I swear they're in cahoots.

>PhysX

Not sure about that one, it never really took off. It was a cool gimmick in games that had it, but it simply wasn't worth most developers' time to implement.

Tessellation is a great technique though. I wish more games utilised it.

You realize that even at 1 frame per day, it's real time. This is why Jow Forums and Jow Forums are full of fucking idiots.

The demo Nvidia set up for real-time hardware-based ray tracing used four GV100 cards and achieved 24 frames per second. 6fps per card, in a tech demo that was optimized for the situation.

They stated outside of that, 1 GV100 would pull about .1 fps, and if it were in a game, about 1 frame per hour.

Actually it doesn't. That's simulated ray tracing. Probably 1/100 of that for hardware based.

nvidia is using a specialised SIP block for RT. I imagine AMD would brute force it using moar ALUs.

It looks like shit because the reflections are low res and the frame rates are completely unacceptable.

Oh boy, I guess that meant we had "real time" ray tracing years ago when it was half a frame a day!

Or maybe you are too autistic to realize that when people think or say "real time" they mean processed at 30-60 fps and 0.0000116 fps is not "real time" to anyone.
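The conversion, for anyone who wants to check the number:

```python
# 1 frame per day, expressed in frames per second
SECONDS_PER_DAY = 24 * 60 * 60            # 86,400 s
print(f"{1 / SECONDS_PER_DAY:.7f} fps")   # 0.0000116 fps, as quoted
```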

That's a chicken and egg problem. When do you release this shit? Just keep dumping money and hoping? Yeah, the 2000 series are absolutely retarded. I'm not defending this. I'm attacking this retarded notion that people should ask for something before it's delivered. AMD almost went bankrupt because it tried to lead the market with multi core CPUs. Maybe developers could've handled that number of cores more efficiently, giving an edge to AMD. But that's not what happened. Had AMD kept improving on the Phenom type of architecture we wouldn't have Ryzen now, and the Epyc 7601 wouldn't be able to, on an engineering sample, beat two of the best CPUs Intel has to offer combined, meaning: the pace of innovation would likely be lower.

Mind you, it's not beating your competition that I'm focusing on. I'm focusing on the speed with which AMD could release a product that Intel has absolutely no answer to, simply because years ago they decided on a path that "nobody was asking for", when actually everybody was asking for the contrary.

We will care when more games support it and the graphics cards don't halve their performance when it is turned on.

>You realize that even at 1 frame per day, it's real time
real time means a reasonable framerate to watch/play, you utter imbecile

I think the problem, though, is more how bad the release was.
Remember, when the RTX cards came out you couldn't even enable the RTX features because there needed to be a Windows update first.
So the features in the APIs were behind the hardware. Ideally you would want the features to be added to the APIs first, then the hardware and then the patches to existing games and new games supporting the new features.
Instead we got the hardware, then the API features, then patches for a few released games to support it, and I think we are still waiting on a new game built with these features in mind.
And all the while they were hyping the shit out of real-time raytracing, going as far as that TomsHardware article to push it.

Yes, real-time raytracing is likely going to be the future basically from now onwards, but the sales pitch and delivery were terrible.

We are in agreement. I just hate any notion that the market should "ask for something". And talking about incompetence, the threadripper, especially the 2990wx, suffers from a bug in the windows kernel where the fucked up scheduler makes only 16 fucking threads work instead of the 64 it has. It doesn't kill performance on everything, but it's downright impossible that AMD didn't know about it.

Why didn't AMD and Microsoft just work together to sort that shit out?
Or does Microsoft just not want to improve their scheduler in consumer based Windows distributions?

Because we are two-three orders of magnitude short on computing power to get it done correctly.

Because the effect is far too minimal to justify halving framerates. It's like the hairworks shit. Who cares about nice-looking hair on a big tittied bimbo if the game is gonna run like ass with it

You forgot the most important common point for all these technologies: they look absolutely retarded and forced. Physx looked nothing like real world physics and more like what happens if you spawn 1000 watermelons in the same spot in TES games, and hairworks is especially awful, look at this: youtu.be/tB2RcvWRh40?t=81
It doesn't even look better, let alone being worth the 20 FPS loss.
RTX is the same shit, right now it's barely noticeable at best and outright buggy at worst.

Don't get me wrong, the few RTX games that exist look way better with it on. This is actually a bigger deal than Hairworks and Physx.

Which ones? Battlefield and Rise of the Tomb Raider barely look any different, let alone better, and Atomic Heart has wonky reflections and buggy lighting.

DF had a good video on it. I wouldn't say it's worth the tradeoff, but it does look really good.

youtu.be/nYLLvOFSHCU?t=531

Screen space reflections are pretty terrible once you understand their limitations. I'll be glad when raytraced reflections become cheap, both performance- and money-wise.
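If you've never dug into why SSR breaks: the "ray" is just a march through the depth buffer, so anything off-screen or hidden behind foreground geometry has no data to reflect. A minimal sketch of the idea (numpy; the scene, coordinates and step counts are made up for illustration):

```python
import numpy as np

def ssr_trace(depth_buffer, origin, direction, steps=64, step_size=0.02):
    """March a reflection ray in screen space against a depth buffer.

    origin/direction are (x, y, depth) with x and y in [0, 1]. Returning
    None is the whole weakness: off-screen or occluded geometry has no
    pixels in the buffer, so there's simply nothing to reflect.
    """
    h, w = depth_buffer.shape
    pos = np.asarray(origin, dtype=float)
    step = np.asarray(direction, dtype=float) * step_size
    for _ in range(steps):
        pos += step
        x, y = int(pos[0] * w), int(pos[1] * h)
        if not (0 <= x < w and 0 <= y < h):
            return None              # ray left the screen: reflection cuts off
        if pos[2] >= depth_buffer[y, x]:
            return (x, y)            # ray passed behind a stored surface: "hit"
    return None                      # ran out of steps: distant stuff fades out

# Toy scene: far plane at depth 1.0, a closer "wall" on the right half.
depth = np.ones((64, 64))
depth[:, 32:] = 0.5
print(ssr_trace(depth, (0.1, 0.5, 0.4), (1.0, 0.0, 1.0)))    # hits the wall
print(ssr_trace(depth, (0.1, 0.5, 0.4), (-1.0, 0.0, 1.0)))   # exits screen: None
```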

I'm hyped for DXR.
It has the potential to look impressive. But the retards at DICE thought that adding Raytraced Reflections to a game with a negligible amount of reflective surfaces would be a good way to showcase RTX. That's why people say it doesn't look good.
DXR Reflections could look very impressive in Mirror's Edge for example, where almost every surface is reflective, but not in BFV lol. They chose the wrong game to showcase raytraced reflections.
The upcoming Resident Evil 2 will utilize DXR Shadows and Metro Exodus will utilize DXR GI; that's where the beauty of raytracing will be most pronounced, because shadows and GI affect the whole scene and every single object.
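The per-point test behind RT shadows is tiny; the cost is running it for every shaded point against the whole scene, which is exactly why it touches everything at once. A toy sketch with sphere occluders (plain Python, all names made up for illustration):

```python
import math

def sphere_hit(origin, direction, center, radius):
    """Distance along a normalized ray to a sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def in_shadow(point, light_pos, occluders):
    """One shadow ray from a shaded point toward the light.

    Any occluder hit closer than the light means the point is shadowed.
    This runs for EVERY shaded point against the scene, which is why
    RT shadows/GI affect everything instead of a few hand-tuned spots.
    """
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    return any(
        (t := sphere_hit(point, direction, c, r)) is not None and t < dist
        for c, r in occluders
    )

# A point behind a unit sphere, light on the +z axis: shadowed.
print(in_shadow((0, 0, -2), (0, 0, 5), [((0, 0, 0), 1.0)]))   # True
```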

you don't need specialized hardware to do raytracing, people have been doing it for years on all kinds of cards
youtube.com/watch?v=0wcEpMNUK8A
Even in games, I seem to remember the original crysis trying to hype up its use of ray tracing. I guess this new nvidia thing makes it a little faster so it can be used in more places, but in the very few circumstances where raytracing is actually useful, it is already being used.
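For the unconvinced: the core of a ray tracer really is just intersection math that runs anywhere, no RT cores required. A toy single-sphere tracer (plain Python, so it's dog slow; that's rather the point):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t along a normalized ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c             # a == 1 since direction is normalized
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# One primary ray per "pixel" of an 8x8 image, aimed at a unit sphere.
for y in range(8):
    row = ""
    for x in range(8):
        d = (x / 4 - 1, y / 4 - 1, 1)
        n = math.sqrt(sum(v * v for v in d))
        d = tuple(v / n for v in d)
        hit = intersect_sphere((0, 0, -3), d, (0, 0, 0), 1.0)
        row += "#" if hit else "."
    print(row)                       # prints a blob of '#' where rays hit
```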

Reminds me of the DX11/DX12 thing

On older titles there's barely a difference, and in some cases the newer one might produce minor visual glitches. Rise of the Tomb Raider comes to mind: enable DX12 and the grass looks weird from a distance.

It just works

>DXR Reflections could look very impressive in Mirror's Edge for example, where almost every surface is reflective
This, it's a fucking travesty that they didn't add RTX support to Catalyst

this
no one knows it, but AMD has been working on it for far longer than Nvidia and is smart enough not to make GPUs that include it yet, because they know the technology isn't ready, and the people aren't ready

It's not even true ray tracing (that's still very, very computationally expensive); it's just casting relatively few rays and then letting the AI guess how to complete the image.
The tech is not ready yet, and the current graphics tricks are good enough for now.
We'll probably talk about it again in 5-10 yrs.
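That pipeline (trace a few rays, get a noisy image, let a denoiser fill in the rest) is easy to mimic. A crude stand-in that uses a box blur where Nvidia uses a trained network (numpy; all numbers are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def render_noisy(h, w, spp=1):
    """Stand-in for a 1-spp trace: a smooth gradient plus heavy sample noise."""
    signal = np.linspace(0.2, 0.8, w)[None, :].repeat(h, axis=0)
    noise = rng.normal(0, 0.3, (h, w)) / np.sqrt(spp)
    return np.clip(signal + noise, 0, 1)

def denoise(img, k=5):
    """k x k box blur -- the slot where Nvidia plugs in a trained network."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

noisy = render_noisy(64, 64)     # what 1 sample per pixel actually looks like
clean = denoise(noisy)           # what gets shown on screen
print(noisy.std(), clean.std())  # noise drops roughly k-fold after the blur
```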

>the current graphic tricks are good enough for now
The sooner we move away from SSR the better

I don't care about RTX, but task/mesh shaders and variable rate shading are cool.

why doesn't anyone care about NVIDIA dying? It seems like a big deal.

Attached: 1533450457029.png (536x614, 362K)

>t. totally not novidia marketing

RTX was a means to sell Turing to consumers since they knew it would not be much faster than Pascal in rasterization and Nvidia prefers to use the same chips in both pro and consumer products.

I just want 3DFX back man

ray tracing is nice but nvidia's hardware implementation and their scummy "BUT MOOOOOOOOOOM I NEED AN RTX CARD" tier marketing sux

It's a travesty they aren't throwing that game around more when it comes to literally anything involving visual improvements.

Yep, honestly one of the best looking games ever. DICE are always at the bleeding (or mirror's) edge of visuals; hell, even the original ME still looks good today. One or two complaints I have with Catalyst though are the pretty mediocre AA they use and the disappointing lack of really "rough" textures like on the walls of pic related.
Also not a visual complaint, but I feel the original ME was a bit more varied in its locations, while in Catalyst you pretty much never go down to ground level bar one area and one part of one mission. All in all though I still fucking love the game, and the Shard is one of the coolest buildings/missions of any game ever.
/rant

Attached: vn2t6gx.jpg (1920x1080, 616K)

fuck off back to Jow Forums wagecuck

RTX is for plebs

>Why doesn't anyone here care about real time ray tracing? It seems like a big deal.
it's far too early. The tech shows promise for screenshotting, but the hardware doesn't improve on the framerate or resolution capability of the previous generation enough to justify the pricing

No, it's not. Real time is not prerendered. The point is there IS NO REAL TIME RAYTRACING AT A REASONABLE FRAMERATE. The "reasonable framerate" solutions aren't actually doing raytracing, they're doing AI guesswork. Actual raytracing would look a lot different, think actual reflections.

Too early to bother investing into

The original ME looks good because everything is prebaked.

True, but there's nothing wrong with that when it's a linear story like ME. Not every game needs to have open world dynamic lighting

>Why doesn't anyone here care about real time ray tracing? It seems like a big deal.

It will be, some time in the future.
Currently, 1080p at 40fps is NO DEAL

and you fall for jewish tricks you fucking yellow nigger gook bug

> Nvidia
Remember G-SYNC? When NV offered a $150 module and AMD offered to do it on the GPU with software?
There were some assumptions that the RT ASIC is just a bunch of FP16 cores, not gimped like in consumer cards. I tell you, NV wouldn't develop an all-new chip for a market that tight and put it in GAMING cards when there's money to be made in the prof market. Instead, they'd rather repurpose some existing tech, call it a fancy name and charge 10x the manufacturing price for it. As far as I know, they don't have RT products for enterprise, meaning that either:
1) Current prof cards do RT just fine without RTX cores
2) Gaming RT is a quick and dirty tech, therefore it does not offer the real experience

What a stupid argument, you fucking retard. People were asking for faster and more reliable transport in general; the car immediately took off, and Zeppelins were a thing long before planes, people loved them before the Hindenburg.
Fuck off with your shit analogies that have no bearing on anything.

>misses the point entirely
Using your own logic you're agreeing with me. People are asking for better rasterization; instead they're getting ray tracing. Think for one god damn second in your god forsaken life, you stupid fuck.

someone had to take the shot at some point. I'm glad to see nvidia shooting itself first for once; it's usually AMD that experiments with new tech

RTX was rushed for consumers when it was clearly aimed for professional users like people into deep learning or supercomputing.

Which card should I get?

Attached: 67565657865.jpg (1242x1228, 213K)

Ray-tracing is shit, path-tracing is the real deal.

Titan V does it fine without RTX cores

first gen freesync is dogshit, tons of freesync 2 monitors are also dogshit
amd releases half baked technologies and expects OEMs to pick up the slack. gsync, as pointless a technology as it is, does what it promises and does so regardless of which OEM you buy from

Didn't someone show that Vega does it even better?

That's the problem. Framerate aside, frame times are not nearly as consistent with RTX enabled. As such it isn't actually real-time, because it's missing deadlines and its timing is irregular. It also is wildly expensive, so it doesn't meet the affordability mark either. Raytracing is cool and it's a valid, long-standing technology, but RTX is nothing but snake oil.
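You can see this in any frame-time log; average fps hides exactly the stutter that breaks the real-time illusion. A sketch of the usual percentile analysis (the numbers are synthetic, just to show the method):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic frame-time log in ms: steady ~16.7 ms with occasional RT spikes.
frame_times = rng.normal(16.7, 1.0, 1000)
frame_times[rng.random(1000) < 0.05] += 25    # 5% of frames blow the deadline

avg_fps = 1000 / frame_times.mean()
p99 = np.percentile(frame_times, 99)          # the stutter you actually feel

print(f"average: {avg_fps:.0f} fps")                          # looks fine...
print(f"99th percentile: {p99:.1f} ms ({1000/p99:.0f} fps)")  # ...this doesn't
```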

This is a drag. Hopefully AMD shows something cool at CES. I haven't paid attention to RTX at all until this thread.

They marketed it to the wrong people.
Gamers want high framerates and RTX just isn't capable of that yet.
It should've been a feature marketed to speed up offline rendering (Arnold GPU, Octane, Redshift, etc.)