About this whole Nvidia "Ray Tracing" thing

Posting on Jow Forums, more relevant than /v/.

Is Ray Tracing good? Of course.

Is it necessary at the moment in video games? Absolutely not. What you need is a GOOD graphics engine and SMART visual optimization.

The FOX ENGINE is the perfect example of that, where the engine is built around physically based lighting, which is why everything looks great and photorealistic, without having to use Ray Tracing.

youtube.com/watch?v=_18nXt_WMF4

What we need is not cutting-edge tech such as Ray-Tracing; we need SMART engineers and visual artists.

Attached: choose_pic.jpg (711x690, 336K)

all that you see on the Fox Engine is BAKED, along with cube maps and direct lights.

is it hard to understand that ray tracing means NOT baked?

Perhaps, but what NVIDIA needs is a reason to charge more money for new products

This engine is stupidly good in that regard.
youtube.com/watch?v=JKO6s17pQHU

youtube.com/watch?v=m7i0_a2wHr8

Attached: 1528898491051.gif (320x240, 2.62M)

the global illumination (indirect light) in the room from the pic is precomputed, meaning it's static for the most part.

" When you die and your whole life flashes before your eyes, how much of it do you want to not have ray tracing?"

Attached: risitasww3.gif (136x102, 516K)

for anyone interested:

adriancourreges.com/blog/2017/12/15/mgs-v-graphics-study/

global illumination (what makes everything look photorealistic) is approximated with spherical irradiance maps. You can also see the many tricks they use to achieve other effects.

In short, graphics engines are incredibly complex pieces of software because EVERYTHING is an approximation; it's tricks upon tricks trying to approximate real-life lighting.

Raytracing is a very elegant solution to all of this: you no longer have a huge bag of tricks to approximate effects, you just use the physics formulas and call it a day.
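
To make "just use the physics formulas" concrete, here's a minimal sketch of what a tracer computes per pixel: one intersection test plus Lambert's cosine law. It's a toy with made-up names, not any engine's actual code:

```cpp
// Toy sketch of "just use the physics": one ray, one sphere, Lambert's law.
// No shadow maps, no cube maps, no baking.
#include <cmath>
#include <cstdio>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { return v * (1.0 / std::sqrt(dot(v, v))); }

// Distance along the ray to the sphere, or -1 on a miss (dir must be normalized).
double hitSphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
    Vec3 oc = origin - center;
    double b = dot(oc, dir);
    double disc = b * b - (dot(oc, oc) - radius * radius);
    if (disc < 0) return -1.0;
    double t = -b - std::sqrt(disc);
    return t > 0 ? t : -1.0;
}

int main() {
    Vec3 eye{0, 0, 0}, dir = normalize({0, 0, -1});
    Vec3 center{0, 0, -5};
    Vec3 lightDir = normalize({1, 1, 1});          // direction toward the light

    double t = hitSphere(eye, dir, center, 1.0);
    if (t > 0) {
        Vec3 p = eye + dir * t;                    // hit point
        Vec3 n = normalize(p - center);            // surface normal
        double lambert = std::fmax(0.0, dot(n, lightDir)); // the physics formula
        std::printf("hit at t=%.2f, brightness=%.2f\n", t, lambert);
    }
}
```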

This link is very impressive.

Attached: fox-engine-1.jpg (638x350, 71K)

it's not gonna be worth a damn to consumers until at least 5 years from now

fpbp

But you have to buy it now, goyim. Buy more, save more!

hello nvidia kike shill

Too bad they never released this SDK as it was meant to be, and the game was downgraded for the final release...
youtube.com/watch?v=qCs0xxUS5as

> I'm a shill because I understand how important raytracing is

>create the best game engine in the industry
>use it mainly for soccer games

Fuck Konami. They have so many good IPs and they don't do anything with them.

The man on the left is the one who created the Fox Engine: Julien Merceron!
youtube.com/watch?v=6J1GMrsq6l0

youtube.com/watch?v=yzqzZStTh5I

>The FOX ENGINE is the perfect example of that, where the engine is built around physically based lighting, which is why everything looks great and photorealistic, without having to use Ray Tracing.
>"without having to use Ray Tracing."
Learn to read.

And, yes, the Fox Engine is an actually good engine. Sadly, Konami only uses it for their own IPs.

who said the Fox Engine uses raytracing?

also

> comparing fake tricks (fox engine) to the real deal (raytracing)

>>"without having to use Ray Tracing."
It's either 100% real time or ray traced. Baked? Why did you make up a word? What?

That's you.

whatever you say shill

It may be a more elegant solution, but the performance loss isn't worth it in the long run.

Yeah, 100% full-Ray-tracing/full-speed is probably 10 years away without kidding. And seeing the direction of the video games industry, I will have stopped playing games permanently before that (is the verb tense correct? My english isn't native)

If anyone wants to see full uncompressed 5K pictures of Ground Zeroes, a madman took hundreds of pictures. I've just found them again.

steamcommunity.com/app/311340/discussions/0/619574421426382995/

Games these days have GI prebaked using raytracing methods.

Literally who cares about realtime raytracing, unless you want your cinematic experiences to be open world.

Apparently, the pictures were taken with FXAA on (very hard to deactivate in GZ/TPP), so it could look even better.

image.noelshack.com/fichiers/2014/51/1419095653-mgsgroundzeroes-2014-12-20-12-45-59-95.png

image.noelshack.com/fichiers/2014/51/1419087261-mgsgroundzeroes-2014-12-19-23-52-11-11.png

image.noelshack.com/fichiers/2014/51/1419082923-mgsgroundzeroes-2014-12-19-23-31-58-51.png

image.noelshack.com/fichiers/2014/51/1419075135-mgsgroundzeroes-2014-12-19-22-49-16-30.png

image.noelshack.com/fichiers/2014/51/1419070660-mgsgroundzeroes-2014-12-19-22-18-27-65.png

image.noelshack.com/fichiers/2014/51/1419071914-mgsgroundzeroes-2014-12-19-22-26-42-21.png

image.noelshack.com/fichiers/2014/51/1419027619-mgsgroundzeroes-2014-12-19-21-46-34-83.png

image.noelshack.com/fichiers/2014/51/1419008808-mgsgroundzeroes-2014-12-19-17-40-23-49.png

image.noelshack.com/fichiers/2014/51/1419008480-mgsgroundzeroes-2014-12-19-17-35-50-23.png

image.noelshack.com/fichiers/2014/51/1419015380-mgsgroundzeroes-2014-12-19-17-20-12-35.png

image.noelshack.com/fichiers/2014/51/1419013271-mgsgroundzeroes-2014-12-19-17-04-43-75.png

image.noelshack.com/fichiers/2014/51/1419012353-mgsgroundzeroes-2014-12-19-16-54-49-57.png

image.noelshack.com/fichiers/2014/51/1419015987-mgsgroundzeroes-2014-12-19-17-22-09-26.png

Too bad it seems the engine was downgraded for TPP (because of the open world?) and the three maps (Mother Base, Afghanistan, Africa) were no match for the Omega Base.

Attached: cjIQ855.jpg (586x473, 89K)

And bear in mind that it was a PS3/Xbox 360/PS4/Xbox One/PC game and that it was rushed. It could have looked much better, but it was pretty good already.

Now, I'm dreaming of a Metal Gear Remake collection with the Fox Engine but it's a dream...

the exact same luddite-ass posts were spammed when PBR was gaining traction a couple of years ago. You brainlets will never learn.

Attached: 1514995011689.jpg (225x225, 6K)

The long run is exactly when it's worth it. It's only in the short term that it isn't really viable without Nvidia's jew tricks.

prebaked = static

the world cannot change with prebaked methods.

>What we need is not cutting-edge tech such as Ray-Tracing

Unlike what many people here seem to believe, ray tracing is not a new cutting-edge technology. It's been around since the 60s and has been standard in CGI animated movies for the past decade. What *is* new is that hardware is finally fast enough to bring ray tracing to real-time applications.

Attached: main-qimg-857c2306dd327a53aba4edea5f5ca51c-c.jpg (243x401, 29K)

No, what we need is ray-assisted rendering: something that bakes on the fly and is real time up close.

let's put it this way: there are a few effects that demand ray tracing, such as global illumination. fucking every single game that has a flashlight is borderline broken in terms of lighting; in real life I can hold a flashlight up to the ceiling and, guess what, the room is illuminated. maybe not well illuminated, but I can walk around just fine.

another effect is shadows. holy fuck do shadow maps suck every single cock that gets put in front of them and refuse to let go. you either need a shadow map that is fuck huge, a GPU effect that blurs the shadow to hell and back, or retarded amounts of processing power. why not have rays do the shadows and blur based on best guesses?
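
for anyone who hasn't written one, the difference shows up directly in code. a hedged toy sketch (C++17, all names made up) contrasting a PCF shadow-map lookup with a single shadow ray:

```cpp
// Toy contrast. Shadow map: a finite grid of depths seen from the light, so
// edges alias and you blur them (PCF). Shadow ray: one exact occlusion query,
// no resolution problem.
#include <algorithm>
#include <cstdio>

constexpr int N = 64;                  // deliberately tiny shadow map
float shadowMap[N][N];                 // closest depth from the light's view

// Percentage-closer filtering: average 9 depth comparisons to soften the
// blocky edges a single lookup would give ("blurs the shadow to hell and back").
float pcfShadow(int x, int y, float fragDepth) {
    float lit = 0.0f;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int sx = std::clamp(x + dx, 0, N - 1);
            int sy = std::clamp(y + dy, 0, N - 1);
            lit += (fragDepth <= shadowMap[sy][sx]) ? 1.0f : 0.0f;
        }
    return lit / 9.0f;                 // 0 = shadowed, 1 = lit
}

// The ray alternative: one ray toward the light. The stub stands in for a
// BVH traversal -- the part RT hardware accelerates.
bool traceOcclusion() { return false; }                       // stub: empty scene
float rayShadow() { return traceOcclusion() ? 0.0f : 1.0f; }  // exact answer

int main() {
    for (auto& row : shadowMap)
        for (float& d : row) d = 10.0f;                       // nothing casts shadow
    std::printf("pcf=%.2f ray=%.2f\n", pcfShadow(32, 32, 5.0f), rayShadow());
}
```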

we've hit a point where faking ray tracing isn't taking us any further forward, so in this time of stagnation, why not introduce real rays?

granted nvidia done fucked up hard with their implementation, but still, baby steps.

it's not a made-up word. when you pre-render something at high quality and apply the result to a lower-quality real-time version, it's called baking in 3D.

most elements in most games are going to be static, though, so why not bake the elements that will never move? this was a failing of Far Cry 3 or 4: you'd go into a cave, and with AO low or off everything looked like shit, while with it on it looked good, but the shadows never moved, and it still took a solid 120 fps and tanked it to 30 for that scene.
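
that split in sketch form (hypothetical names, just the shape of the idea): pay the heavy lighting cost once for what never moves, keep the per-frame loop cheap.

```cpp
// Bake-what-never-moves. The expensive GI solve runs once (offline or on
// load); the per-frame loop only touches dynamic objects.
#include <cstdio>
#include <vector>

struct Surface { bool isStatic; float bakedLight; float dynamicLight; };

// Stand-in for an hours-long offline GI solve; affordable because it's paid once.
float expensiveGISolve(const Surface&) { return 1.0f; }

void bakeStaticLighting(std::vector<Surface>& scene) {
    for (auto& s : scene)
        if (s.isStatic) s.bakedLight = expensiveGISolve(s);  // paid once, reused forever
}

void renderFrame(std::vector<Surface>& scene) {
    for (auto& s : scene) {
        if (!s.isStatic) s.dynamicLight = 0.5f;   // cheap realtime approximation
        float light = s.isStatic ? s.bakedLight : s.dynamicLight;
        (void)light;                              // ...shade with it
    }
}

int main() {
    std::vector<Surface> cave = {{true, 0, 0}, {false, 0, 0}};  // wall + NPC
    bakeStaticLighting(cave);   // the cave's shadows never move: bake them
    renderFrame(cave);          // only the NPC needs per-frame lighting work
    std::printf("wall light: %.1f\n", cave[0].bakedLight);
}
```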

I honestly like baking over real time
I like faking graphic effects, but let's be real
game developers are abject garbage 99% of the time, and can't even write standard code for GPUs to use, so AMD and Nvidia have to go in and fix their shit. Taking decision-making out of these retards' hands would be a good thing; fewer ways they can fuck up.

they get people to use ray tracing
now they have a good 4-5 generations of improvements they can implement without needing to resort to gimping old GPUs.
If Nvidia focuses on rays and tensor cores, that means their shaders are going to be untouched, which is a good thing for us: as long as games keep non-ray-traced lighting paths, we will be able to maintain the same performance levels without needing to buy into retardedly expensive cards.

honestly, a guy put ray tracing into Quake... I'm 100% OK with the grainy look if it makes games look like that with very little effort.
youtube.com/watch?v=x19sIltR0qU
youtube.com/watch?v=kqFWiyYZzig
youtube.com/watch?v=bpNZt3yDXno
youtube.com/watch?v=rl-mn97X33k

raytracing is simulating photons
sure we can do it, and we know the diffraction numbers for most things... hell, physics books let you apply real-world rays easily

the problem is the time it takes to compute

it's cutting edge because we are getting very close to it being something that can actually be done.

granted, I think there are more practical applications for voxel cone tracing in the meantime than for raytracing.

>is the verb tense correct?
Yes. Your writing is perfectly natural, with the possible exception of "without kidding," which I've not heard anyone say before.

Ah, I translated the French "sans rire" as "without kidding". I just checked and it's more like "no kidding" in English.

Thank you anyway.

>I honestly like baking over real time
that's like saying I honestly prefer pictures over movies

raytracing just makes everything easier when it comes to achieving any sort of lighting effect: you get accurate reflections, shadows, and global illumination for free, without resorting to tricks.
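
what "for free" means in code: once you have a trace() function, a mirror reflection is one recursive call with the bounced direction, instead of a screen-space hack. a minimal sketch with a stubbed scene and invented names:

```cpp
// "For free": once trace() exists, a perfect mirror is one recursive call.
#include <cstdio>

struct Vec3 { float x, y, z; };
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 reflect(Vec3 d, Vec3 n) { return d - n * (2.0f * dot(d, n)); }

struct Hit { bool hit; Vec3 point, normal; bool mirror; };
Hit intersectScene(Vec3 origin, Vec3 dir);        // stub scene, defined below

Vec3 trace(Vec3 origin, Vec3 dir, int depth) {
    Hit h = intersectScene(origin, dir);
    if (!h.hit) return {0.1f, 0.1f, 0.2f};        // sky color
    if (h.mirror && depth > 0)                    // the entire "reflection system":
        return trace(h.point, reflect(dir, h.normal), depth - 1);
    return {0.8f, 0.8f, 0.8f};                    // diffuse shading would go here
}

Hit intersectScene(Vec3, Vec3) { return {false, {}, {}, false}; }  // empty stub

int main() {
    Vec3 c = trace({0, 0, 0}, {0, 0, -1}, /*depth=*/2);
    std::printf("sky: %.1f %.1f %.1f\n", c.x, c.y, c.z);
}
```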

>fucking every single game that has a flashlight is borderline broken in terms of lighting; in real life I can hold a flashlight up to the ceiling and, guess what, the room is illuminated. maybe not well illuminated, but I can walk around just fine.

this. you understand what it means to have realtime global illumination, shadows, and reflections.

I believe the majority of people don't understand what it means to have real-time raytracing, since it's a technical subject and most gamers are too stupid to get it.

>I believe the majority of people don't understand what it means to have real-time raytracing, since it's a technical subject and most gamers are too stupid to get it.
They don't care, and why should they?

If you bake the effect correctly, you reduce the necessary processing power by an order of magnitude

Nvidia showed ray tracing in real time, then they walked it back to reflections and shadows, and now it's been walked back even further: sure, it does reflections, but at the cost of crippling performance.

If you can ray trace a full scene, use whatever black magic you need to make it not grainy (though I like the grain, a pseudo-film look), and it's over 60fps in game, give it to me

you give me reflections only, and the shit is fucking going below 30? get the fuck out of here

oh they will care the moment they start seeing amazing lighting effects.

I'd kill to be able to have this sort of lighting in a 2D game. this is currently unachievable without heavy use of fake techniques.

youtube.com/watch?v=CPBdiV0JvRo

Attached: gi.jpg (870x662, 138K)

The more you buy, the more you save...

What he meant was, if he didn't have to buy back so many GPUs, you couldn't have saved on the 20 series.

Yes, it's fundamentally better tech for a lot of reasons.

1. Many graphical effects that currently require significant trickery become easy and natural with ray tracing, for example shadows and reflections.

2. No more separation of static and dynamic geometry.

3. The tech is scalable. You can just add more rays and more bounces to get a crisper image (see the sketch below).
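
a sketch of point 3 (toy code, the scene intersection is a made-up stub): "more rays and more bounces" are literally two loop bounds in a path tracer's pixel loop.

```cpp
// Quality in a path tracer is controlled by two integers. A faster GPU just
// raises them.
#include <cstdio>
#include <random>

std::mt19937 rng{42};
std::uniform_real_distribution<float> uni(0.0f, 1.0f);

// Stand-in for one random light path; a real tracer would intersect scene
// geometry at every bounce.
float traceOnePath(int maxBounces) {
    float radiance = 0.0f, throughput = 1.0f;
    for (int b = 0; b < maxBounces; ++b) {     // deeper = more indirect light
        radiance += throughput * uni(rng);     // pretend we hit something lit
        throughput *= 0.5f;                    // energy lost per bounce
    }
    return radiance;
}

float shadePixel(int samplesPerPixel, int maxBounces) {
    float sum = 0.0f;
    for (int s = 0; s < samplesPerPixel; ++s)  // more samples = less grain
        sum += traceOnePath(maxBounces);
    return sum / samplesPerPixel;              // Monte Carlo average
}

int main() {
    // Same pixel at three quality levels; watch the estimate settle down.
    for (int spp : {1, 16, 256})
        std::printf("%3d spp -> %.3f\n", spp, shadePixel(spp, 3));
}
```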

Oh, I more than understand it for certain effects. we could easily have a burst of rays happen, not a large amount, maybe 100, have them go straight out of the flashlight, single-bounce off the wall and hit something else, and calculate the lighting from that. we could also give the GI a lazy, lingering update. again with my example: you are in low light, dealing with bounce light from a flashlight. when you are in darkness and you move, you don't have instant visibility anyway, so the ghosting effect the GI would have here could be used for the benefit of the effect.
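
the lazy/lingering GI described here is basically temporal accumulation, which is a couple of lines of math. the sketch below tracks one value; a real engine keeps a history per pixel:

```cpp
// Blend each frame's noisy bounce light into a running average, so the
// result trails the flashlight slightly instead of flickering.
#include <cstdio>

float accumulatedGI = 0.0f;           // per-pixel history in a real engine

// alpha controls the lag: small = smooth but ghosty, large = responsive but noisy
float temporalBlend(float noisyNewGI, float alpha = 0.1f) {
    accumulatedGI = (1.0f - alpha) * accumulatedGI + alpha * noisyNewGI;
    return accumulatedGI;
}

int main() {
    // Flashlight hits the wall at frame 0; the bounce light fades in over frames.
    for (int frame = 0; frame < 5; ++frame)
        std::printf("frame %d: %.3f\n", frame, temporalBlend(1.0f));
}
```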

look, I fully understand the implications of realtime ray tracing, but we are being fed fucking reflections that tank fps to sub-30 at 1080p from what used to be 144; the shit is not ready

what I want to see is a bake + real time approach, where things near you are real time, and things far away are baked till you get close, and when you get away from them they rebake.

we hit a wall with most effects in games. poly count... Resistance 1 and 2 on the PS3 are a good example of this: the poly count doubled from 1 to 2, but no one noticed at all. sure, we can add more polys to certain things like a round barrel and notice it, but on most objects we don't. now tessellate the hell out of something at wall-licking distance and it just does not matter.

however, going from PS1 to PS2 was night and day, and PS2 to PS3 was again night and day. PS3 to PS4, while a big jump, was more about resolution than the actual effects that make a game stand out, and with 4K we honestly hit a point where more resolution won't affect how it looks; the textures are already there, and you only see pixels when you wall-lick. lighting is the area that everything lives and dies on, and the only way forward is to ray trace, as we already bake it; the only thing left is to have real time interact with baked, or possibly go real time too.

the problem with number 3: the tech isn't scalable at all right now, as it requires so much fucking processing power that we are nowhere near it yet.

Honestly this
youtube.com/watch?v=dQSzmngTbtw
is likely going to be a better solution than ray tracing, at least at a distance. up close you may still go with rays for precision's sake, but this will likely be what the rest of the world at a distance is subjected to.

gamasutra.com/view/news/286023/Graphics_Deep_Dive_Cascaded_voxel_cone_tracing_in_The_Tomorrow_Children.php

>what I want to see is a bake + real time approach, where things near you are real time, and things far away are baked till you get close, and when you get away from them they rebake.

baking is a time-consuming process, meaning it can usually only be done offline (not while you are playing the game).

the whole point of baking something is that you know it won't move; it's static.

Unity, the most popular game engine out there, offers:

-realtime lighting (shit, no global illumination)

-baked lighting (looks great but static)

-precomputed lighting (looks great, allows day/night cycles, but again only for static objects; lighting for moving objects is approximated, meaning faked)

check this
unity3d.com/learn/tutorials/topics/graphics/choosing-lighting-technique

VXGI has been around for a while, and it's what CryEngine uses for realtime global illumination.

the problem is that it's not as good as raytracing while taking almost as much processing power; not quite accurate but very intensive, meaning it's only good for small scenes.
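
for reference, why cone tracing is cheaper at all: instead of thousands of rays you march a few cones through pre-filtered (mipmapped) voxel data. a rough sketch of one cone, with a made-up stand-in for the 3D-texture lookup:

```cpp
// Rough shape of a voxel cone trace: march along a cone, sampling coarser
// mips of a voxelized scene as the cone widens, compositing front to back.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Sample { float occlusion; float light; };

// Stub: a real implementation reads a 3D mip level chosen from `radius`.
Sample sampleVoxelMip(Vec3 /*pos*/, float /*radius*/) { return {0.05f, 0.02f}; }

float coneTraceGI(Vec3 origin, Vec3 dir, float coneAngleTan) {
    float light = 0.0f, transmittance = 1.0f, t = 0.1f;
    while (t < 50.0f && transmittance > 0.01f) {
        float radius = t * coneAngleTan;               // cone widens with distance
        Vec3 p{origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t};
        Sample s = sampleVoxelMip(p, radius);          // coarser data farther out
        light += transmittance * s.light;              // front-to-back compositing
        transmittance *= (1.0f - s.occlusion);
        t += radius;                                   // step size grows with the cone
    }
    return light;
}

int main() {
    std::printf("GI along one cone: %.3f\n", coneTraceGI({0, 0, 0}, {0, 1, 0}, 0.5f));
}
```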

and they use it to make shitty cash grabs *claps*

yes and no.
baking is just freezing the result. if you are generating a good enough result, as in ray + denoise, then walking away from something shouldn't require rebaking a high-quality version offline; the one you have is good enough, because the moment you get close enough, it goes back to real time.

largely I believe a real-time bake, along with a pre-bake when entering a level, would be good enough, and if you make it scalable, it should allow games to grow with new GPUs instead of the 'this is as good as the game will ever look' we get with most effects.

nothing is going to be as demanding as ray tracing. and looking into it a bit, it seems like versions of this have been implemented in a few places, and the load is nothing close to what ray tracing demands while giving a fairly large bump in image quality.

well, they left the gaming market almost entirely shortly after it was made.

Still in a dream... Snake Eater.

>game developers are abject garbage 99% of the time, and cant even write standard code for gpus to use, so amd and nvidia have to go in and fix their shit.
As opposed to? Microsoft does the same for popular desktop software.

I'm just imagining right now
>ray traced graphics in 4K vr headsets
It's gonna be real freaky in a few years, anons.

can't argue that Microsoft doesn't shit the bed, but it seems everyone who failed to be a good programmer works as a game dev and finds new ways to break shit

no, that's never going to happen. you need higher than 4K per eye before it stops looking like pixels; granted, by that point AR is also going to be a major thing.

also, there are a few interesting techs that seem like they may take over the VR market. I think current lens-based VR is going to give way to more of an arcade-style VR where you are looking at a mirror/reflection instead of a screen. easier to focus.

Where are the good programmers at?

>incredibly complex pieces of software

Attached: 1ewdzs.jpg (399x385, 37K)

>no, that's never going to happen

you are talking out of your ass and I bet you don't even have a VR headset. You are in for a surprise in the coming years, as deep learning is doing miracles in all fields, including VR with foveated rendering.

> also, there are a few interesting techs that seem like they may take over the VR market. I think current lens-based VR is going to give way to more of an arcade-style VR where you are looking at a mirror/reflection instead of a screen. easier to focus.

what the hell are you talking about? the whole industry is moving toward high-res displays

Attached: foveatedRendering.png (681x392, 550K)

> writing a photorealistic high-performance graphics engine is easy

t. deluded web dev

You are too fucking obvious, shill

for games, you'd say id: a fucking 1060 6GB at 4K gets playable frame rates with frame timings that feel like I'm on a FreeSync/G-Sync monitor... shit's fucking magic

you can also add the architects of Unreal, CryEngine, and Unity; as much as people hate Unity, it's a good engine until it's put in the hands of someone shit at driving it.

as for good programmers in general, ever use a program and it worked better than expected? that's where they are.

> 80% of 'meme-tracing' on current Nvidia cards is meme-learning-based filtering, denoising, and approximate copy/pasta overlays atop a super-noisy, low ray/sample count image.

/thread.

Matte surfaces and global illumination are barely possible if there are a number of dynamic elements in the scene. They introduce transparency/reflections, and the number of ray bounces kills performance. Not to mention, what brainlets don't understand, which you noted, is that the majority of what's going on is meme-learning approximation/filtering. They're doing something like 1-4 samples per pixel and about 2-3 bounces. The output is grainy as fuck, and this is nowhere near what ray tracing is. This is like a developer preview card.

> it's not ray tracing if they don't do infinite bounced rays per second

ok

when I say that's not going to happen, I mean the effect you are thinking of, where you put the headset on and it's hard to tell real from fake.

2 reasons for this. one: you need higher than 8K per eye; the very center of our vision requires a stupidly high-resolution screen to mimic, and since we can move our eyes around, that stupidly high resolution needs to move with them, so the whole screen field needs to be high resolution.

Then you mix in the problem of focal points, the main area all headsets fail at today; none of them are able to remain in focus, or in perfect focus, for long. I can't remember the company, but there was one shopping around a different way to embed screens that doesn't require optics because it sits so close to your face. especially for low-end VR, that is where my money is, as almost all of the expensive shit that's hard to engineer and, even worse, hard to mass-produce at a reasonable cost is eliminated.

you also have the added issue that VR with graphics nearly indistinguishable from real life would never take off, as killing someone in it would be so fucking off-putting the software won't sell. granted, if you are talking about waifu sims, I'm fairly sure most of them don't want realism in their games either.

you may see games like Robo Recall go ray traced, but it would be a ray-and-bake process, something like what Nvidia currently does where work is shared and shifted from one eye to the other, so instead of rendering one scene twice, you do it once. for the most part this will work, but certain effects that would work with a single camera angle will break under how it would need to be implemented.

> I detail how it works
(You) reply with a reframed:
> Appeal to extremes

Nice try. Doing 2 samples per pixel and 4 ray bounces is not ray tracing. It's a prime candidate for a deep learning approximation algorithm. Like I said, this is a 'dev preview' card. They implemented everything in a HYBRID ray trace pipeline and wedged it into the area where double-precision float processing was done in Volta. Nvidia themselves call this a hybrid pipeline because it still relies heavily on shaders/etc. The footprint they've dedicated to ray tracing is tiny af compared to the overall GPU. This shit belongs on a dedicated ASIC piped into the GPU pipeline so they can give it more processing power and reduce the insane die size and subsequent cost. Upcoming generations will do this via NVLink. The future architecture will look nothing like this beta cash grab.
> ok
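
for the anons following along, the hybrid pipeline being described, reduced to a frame loop. every name below is invented; only the division of labor is the point (raster for primary visibility, a few rays for the hard effects, a DL denoiser to hide the 1-4 spp grain):

```cpp
// Sketch of the concept, not Nvidia's API.
struct GBuffer {};
struct Image {};

GBuffer rasterize() { return {}; }                  // classic pipeline, unchanged
Image traceSecondary(const GBuffer&, int spp) {     // RT cores: reflections/shadows/GI
    (void)spp; return {};                           // at very low sample counts
}
Image denoise(const Image& noisy) { return noisy; } // tensor cores: DL denoiser
Image composite(const GBuffer&, const Image& fx) { return fx; }

Image renderFrame() {
    GBuffer g = rasterize();
    Image noisyFx = traceSecondary(g, /*spp=*/2);   // grainy as fuck on its own
    Image cleanFx = denoise(noisyFx);               // this makes it look "done"
    return composite(g, cleanFx);
}

int main() { (void)renderFrame(); }
```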

global illumination doesn't need much to work, at least in a way that wouldn't piss me off.

Have the ray tracer shoot maybe 1,000 rays, possibly 10,000, in all directions, do two bounces, and approximate how much GI needs to be there in engine. you don't need the entirety of the lighting to be ray traced, just ray assisted.

This would fix my issues with muzzle flashes, flashlights and the like overnight. hell, there are a lot of ways I can think of to use ray tracing to a limited degree that would assist real-time rendering. thankfully the push to real-time rays is going to stagnate the GPU market, as they won't be focusing on pushing 4K or any new effects, so a good GPU should last a while, at least till companies stop doing 2 sets of lighting for a scene.
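
that ray-assisted idea in toy form (the "scene" below is completely made up; only the fixed-budget + coarse-grid shape of it matters):

```cpp
// Fire a fixed budget of rays out of the flashlight, bounce them twice, and
// splat the deposited energy into a coarse grid the rasterizer reads back.
#include <cstdio>
#include <random>

constexpr int GRID = 8;                       // coarse GI grid over the room
float giGrid[GRID][GRID] = {};

std::mt19937 rng{1};
std::uniform_real_distribution<float> uni(0.0f, 1.0f);

void traceFlashlightRays(int budget, int bounces) {
    for (int i = 0; i < budget; ++i) {
        float x = uni(rng), y = uni(rng);     // toy "hit point" in the room
        float energy = 1.0f / budget;         // fixed cost, no matter the scene
        for (int b = 0; b < bounces; ++b) {
            giGrid[int(y * GRID) % GRID][int(x * GRID) % GRID] += energy;
            x = uni(rng); y = uni(rng);       // toy bounce to a new point
            energy *= 0.5f;                   // lose energy per bounce
        }
    }
}

int main() {
    traceFlashlightRays(/*budget=*/1000, /*bounces=*/2);
    std::printf("GI in center cell: %.4f\n", giGrid[GRID / 2][GRID / 2]);
}
```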

I'm happy that Nvidia pushed this card out. However, as an intelligent consumer, I recognize it for what it is. It's a consumer version of a Quadro meant to get the ball rolling on dev buy-in. Professionals who do a lot of rendering will profit greatly from these cards. Gamers are sort of being teased into joining the ride. Get some of these into gamers' hands and entice the studios to start working with and developing your tech. It's going to take some time before this is fleshed out in hardware and software. Good first attempt, but I'm not buying at these prices. Ray tracing belongs on a separate chip. Whoever does that first will have solidified this as a real technology beyond the alpha/beta stage.

Again, this is what's making the incredibly noisy ray tracing output look sensible:
> research.nvidia.com/sites/default/files/publications/dnn_denoise_author.pdf

So, just like rasterizing, there is still a shit ton of hacks going into a rendered image. Nvidia calls this a hybrid pipeline, not a ray tracing pipeline. Most of what you see is due to the slew of filtering/denoising/sampling/fill/etc. that occurs in the tensor cores.

Calling this:
> ray tracing
is a joke.

Good first attempt, though, and I look forward to much more fleshed-out pipelines in the future that rely less on meme learning

> The spatiotemporal filter approach looks amazing - like truly amazing and amazingly usable but it has a problem with lag between the GI and the geometry itself which makes it kind of unusable for games... seeing especially shadows to lag behind after objects so much would be painful to endure and gamers would laugh their asses off. The new Nvidia AI filter looks amazing too - no lags but the result is softer than that of the spatiotemporal filter! But costs 50ms on a Pascal Titan which is too much..

raytracey.blogspot.com/2017/07/towards-real-time-path-tracing.html

Inb4 game devs don't use raytracing because they'd lose their jobs. It just werks.

>all that you see on the Fox Engine is BAKED

So?
It works and that's all that really counts.

The only problem is dynamic effects, but if they can make those faster by baking parts of them in, then all the more credit to them.
Doing heavy computations in game engines just to brag about how much hardware you need is really stupid.

>Most of what you see is due to the slew of filtering/denoising/sampling/fill/etc

Which is fine, and in fact should probably be inserted in every ray-tracing pipeline to improve output quality at similar render times.

> So? It works and that's all that really counts.
This is even the case for the current gay-trace coming out in the GeForce 20 series. It's not ray-tracing. It's a hybrid pipeline that relies on rasterizing and meme-learning to make it work. The machine learning post-processing pipeline that does the final render contains a statistical map of pre-baked rendering results. No GPU pipeline is capable of doing legitimate full rendering in real time. So, they are all generally an arrangement of hacks and shortcuts. Brainlets don't know this and thus think Nvidia's gay-trace is the 2nd coming. No, it's just a new set of better hacks to do the same thing a traditional rasterized pipeline does.

>Literally who cares about realtime raytracing, unless you want your cinematic experiences to be open world.

>who needs this new tech that's becoming usable for consumers when we can just stick to our old technology.
t. boomer

I was talking about the Fox Engine.

Yeah, obviously, but raytracing is getting closer to being usable in real-time gaming. It's like saying computers were not a new technology to consumers when the first home computers came out because the government had been using them for years.

>6 years in development
>Rushed

Suck a shotgun, Kojimadrone.

Ray Tracing is the new tessellation. Nvidia ruined that tech for everyone, and now they're gonna ruin ray tracing for us. The adoption rate will be maybe 3-4 huge AAA games per year that no one really likes (COD, Battlefield, Overwatch, Destiny, PUBG and that shit).

The bigger picture is, we are all just lab rats in the war AMD and NVIDIA are waging over who will be the supplier of next-gen console SoCs.

MARK MY WORDS!!
ALL of this (among other things) is just so devs can train to work with RT for next-gen consoles.
PC gamers will get ray tracing in GAMEWERKS!!

Attached: Torment Numenera.jpg (1366x768, 298K)

Yes, the game was rushed as hell; it needed 3 more years at least.

To be fair, Kojima fucked up pretty bad, but there were always big, big tensions in the studio and with Konami.

the way that Nvidia showed this off and sliced up the die, ray tracing and AI denoise are pretty much on separate chips.

Nvidia won't be in consoles; consoles aren't profitable enough for Nvidia to justify it, and AMD is willing to do the work for very little beyond having their R&D costs covered.

Nvidia is only in one console right now because of the perfect storm of AMD not being ready with a mobile chip while Nvidia was.

They're not. Both are fused into each SM:
- ray trace cores
- tensor cores

See pic... It's a snapshot of the SMs in Volta.
Turing is essentially this SM, except they likely replaced a block of tensor cores with ray trace cores and put supporting logic in place of the FP64 units.

In order to allow for a 'HYBRID PIPELINE', they also split out the pipeline in the SM:
> cdn.videocardz.com/1/2018/08/NVIDIA-Turing-L1-L2-Cache.jpg

When meme-tracing is enabled, the resources are shared between the traditional pipeline and the ray tracing logic. When it is disabled, it is likely that the traditional GPU pipeline has access to increased L2 and other resources. This is where the speedups over Pascal will occur, and this is also why performance drops when you enable gay-trace.

The ray trace logic is simple intersection calculations done on vectors. It's a tiny calculator built into each SM.
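
"simple intersection calculations done on vectors" concretely: the inner loop of BVH traversal is the ray/box slab test below, exactly the kind of small fixed-function math such a unit evaluates. standard textbook sketch:

```cpp
// Slab test: does a ray cross an axis-aligned box? The core of BVH traversal.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

bool rayHitsBox(Vec3 origin, Vec3 invDir, Vec3 boxMin, Vec3 boxMax) {
    // Entry/exit distances against each pair of axis-aligned planes ("slabs").
    float tx1 = (boxMin.x - origin.x) * invDir.x, tx2 = (boxMax.x - origin.x) * invDir.x;
    float ty1 = (boxMin.y - origin.y) * invDir.y, ty2 = (boxMax.y - origin.y) * invDir.y;
    float tz1 = (boxMin.z - origin.z) * invDir.z, tz2 = (boxMax.z - origin.z) * invDir.z;
    float tmin = std::max({std::min(tx1, tx2), std::min(ty1, ty2), std::min(tz1, tz2)});
    float tmax = std::min({std::max(tx1, tx2), std::max(ty1, ty2), std::max(tz1, tz2)});
    return tmax >= std::max(tmin, 0.0f);      // slabs overlap ahead of the ray = hit
}

int main() {
    Vec3 origin{0, 0, 0};
    Vec3 invDir{1e9f, 1e9f, 1.0f};            // ray along +z (huge value stands in for 1/0)
    bool hit = rayHitsBox(origin, invDir, {-1, -1, 2}, {1, 1, 3});
    std::printf("hit = %d\n", hit);
}
```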

The graphic they showed was more of a marketing gimmick.

Attached: turing_sm_equals_volta_redux.png (2517x3333, 355K)

we will see when Turing is shown in more detail. I thought it was all tensor cores doing the ray tracing in the first place, but with die shots telling me the ray and tensor areas are separate, I shrugged, said I was wrong on that, and got interested, as it would mean the ray section is off of the normal die, which allows for a more interesting approach. if it's all like this, then it's more disappointing due to the competition for resources.

> we will see when Turing is shown in more detail
I don't need to see it. It's already been detailed in the manner I described. They split out the load/store unit in the SM... That means they're accommodating a new pipeline. They expanded the L2 cache, which is where they keep the BVH. So, ray trace functionality is in the SM and spread throughout the chip.

When they detail Turing publicly, it will be confirmed. A leaked slide already confirms this, though...
> cdn.videocardz.com/1/2018/08/NVIDIA-Turing-L1-L2-Cache.jpg

They're not going to move this out until they test it over a generation or two. This allows for them to extract max performance with little impact such as a complete re-architecture. As the platform matures, and they've milked a couple of plebs, they'll transition it out.