THANK YOU BASED NVIDIA

youtube.com/watch?v=tjf-1BxpR9c

THANK YOU BASED NVIDIA

Attached: maxresdefault.jpg (1280x720, 108K)


Why did they use those actors for the demo?

AMD BTFO?

Attached: 1448862398635.jpg (263x383, 41K)

Goyforce

from the yt comments

>graphics are getting better
>games are getting worse

>Mfw buying a 9900K just to spite AMD fans
You people are the reason I don't buy AMD.

I fucking hate this new form of "ray tracing". Pic related was made a decade ago and we still can't put this in vidya. Pathetic.

Attached: Glasses_800_edit.png (2048x1536, 2.9M)

>>Mfw buying a 9900K just to spite AMD fans
Nothing wrong with 9900K though, since it's soldered.

Because consoles

Because full-on raytracing takes a fuckload of time.
This new shit is mostly rasterization, but it employs RT specifically where it matters most.

How the fuck do we not have enough CPU horsepower to do it yet? Better yet, where the fuck are the accelerators?

It's still a fraction of what real ray tracing looks like, and even then it's not FP64-precision ray tracing, is it?

Pic related, for instance, took me like 15 minutes on a 2700X, and that's one frame.

The new shit will give the shadows just one pass, then try to denoise and interpolate the grainy, noisy mess.

Attached: Lux_Sphere2.png (1920x1080, 3.58M)

Well that sucks but glad it's now minutes instead of hours. Maybe a decade more before it's milliseconds.

Raytracing is a complete rendering process that calculates every bounce of every photon of a given light source. With raytracing everything you see is rendered as if your eye was taking in the photons/photon reflections. It is extremely computationally expensive and consumer hardware will probably never be able to raytrace a video game at 30FPS with acceptable image quality.
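To make the idea concrete, here is a minimal sketch of the basic operation a tracer repeats billions of times, intersecting a single ray with a sphere; the scene (camera at the origin, unit sphere at z = 3) is made up for illustration:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a ray to its nearest sphere intersection, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t, assuming
    `direction` is normalized (so the quadratic's leading coefficient is 1).
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# One primary ray: camera at the origin looking down +z at a unit sphere.
t = ray_sphere_hit((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 3.0), 1.0)
# A full tracer would now spawn shadow/reflection rays from the hit point
# and recurse, which is where the cost explodes.
```

Each bounce multiplies the number of rays, which is why full path tracing is so expensive.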

What Nvidia is selling as "raytracing" is more like a step up from their "global illumination" software technique from the 900 series. Sure, some photons from a light source will be calculated. How many? How many reflections are processed? How accurate are the colors (raytracing produces colors by calculating the energy of a photon, energy of a photon = color)? How is this tied into traditional rasterized rendering without causing contention, errors, and wasted processing?

How much GPU time does the "AI denoising" use? How blurry does the denoising make the final product? Do the 2060 and under cards even have tensor cores?
Just how many photons (rays) can a single 2080 process in a scene while providing 60FPS?
Nvidia claims "10 billion rays a second", likely for their top end card, and likely when ONLY running a raytracing scene. That's 166 million photon reflections per frame.

tl;dr
>How many photons would hit a square foot of space per second?
>"One watt of light on a square foot of surface would be very bright. The average visible light photon has about 3.3 x 10^-19 joules of energy. A watt is a joule per second. That would suggest by dividing power by energy per photon that there would be about 3 x 10^18 photons per second. For more or less power multiply by the number of watts. For photons more toward blue (I assumed yellow) there will be fewer per watt. For photons toward red there will be more.

If you prefer names to exponential notation, it is 3 quintillion photons per second or 3 billion billion."
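The arithmetic quoted above can be checked directly; the input numbers (3.3e-19 J per photon, Nvidia's "10 billion rays a second") come straight from the posts:

```python
# Back-of-envelope check of the figures quoted above.
photon_energy_j = 3.3e-19                 # joules per average visible photon
photons_per_watt = 1.0 / photon_energy_j  # ~3e18, i.e. ~3 quintillion per second per watt

rays_per_second = 10e9                    # Nvidia's claimed "10 billion rays a second"
rays_per_frame = rays_per_second / 60     # ~166 million rays per frame at 60 FPS
```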

It isn't true raytracing and you'll never have true raytracing.

But it is true raytracing, just with 1 photon per pixel.
The real magic is that you can use this information to get the graphics of the next decade.
Sure, it's not "physically correct", but that doesn't matter for a video game card.

youtu.be/HSmm_vEVs10
It's amazing how fast they put this into hardware

How many games will actually use this? DX12 is still a death sentence for sales because of Windows 10

No.
True raytracing is when the scene is rendered solely by tracing photons.
Only a retard or an Nvidia shill would try to change the definition to fit their needs. You sound like both.

>they
This stuff's been around since the 80s

holy shit

Yes, deep neural network denoising of single-pass raytraced scenes has been in hardware since the 80s

You do understand that the research paper/video you posted has nothing to do with Nvidia?
Right?

Nothing to do with their hardware, their techniques, their "AI denoiser", at all, right?
You did watch the video, right?
You do know that the research did not use a neural network, AI, or any other meme variation of the "deep learning" approach to achieve its filtering?

Are you stupid?
You sound stupid.

Nvidia does 1 ray per pixel
Nvidia does denoise it with DNNs
Both steps are hardware implemented in the Turing arch

What's your point again?

They haven't said the denoising runs on dedicated hardware.
You're jumping to conclusions.

Further:
developer.nvidia.com/optix-denoiser
Look at how long it takes the program to build proper reflections into the glasses on the table.
It's fucking trash. Literally no better than the current global voxel illumination standard, which is basically raycasting on the same level as 5 photons per pixel making 5-7 reflections.

Yes, you are stupid. Get used to it.

>""95W"" TDP

What the fuck is ray tracing?

Hurr durr... who cares if it is not pure ray tracing? Do you believe human eyes can process all of the photons fired from an object? As long as it tricks us into believing it looks real, it is fine. Humans can't hear outside of the 20Hz-20kHz range; who cares if the generated sound covers nothing outside this spectrum (except audiofags)?

Wait wait wait, so why do you need like 1 million samples per pixel to make a clear photorealistic raytraced image?

Essentially it's creating shadows/lighting by literally shooting photons at pixels in a 3D environment and recording every path they take. Very computationally expensive and requires thousands of samples or more per pixel in a single frame to render a photo-realistic image.

Novidya's solution is to just record 1 sample, apply a noise filter, and tell anyone who complains about nice things like subsurface scattering and radiance to go fuck themselves.
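The noise problem the posts above describe can be seen with a toy Monte Carlo estimator; `pixel_estimate` and the 0.3 hit probability are invented for illustration, not taken from any renderer:

```python
import random

def pixel_estimate(n_samples, rng, p_hit=0.3):
    """Monte Carlo estimate of a pixel's brightness.

    Each sample stands in for one traced light path, scoring 1.0 if the
    path reaches the light. The true hit probability (0.3) is a made-up
    stand-in for the real scene integral.
    """
    hits = sum(1 for _ in range(n_samples) if rng.random() < p_hit)
    return hits / n_samples

rng = random.Random(42)
one_spp = [pixel_estimate(1, rng) for _ in range(10)]     # each pixel is 0.0 or 1.0: pure noise
many_spp = [pixel_estimate(4096, rng) for _ in range(10)] # clusters tightly around 0.3
```

The error shrinks only as 1/sqrt(samples), which is why a 1-sample image is grainy and why the denoiser has so much work to do.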

>the human eye can't see more than 30 photons

Well yeah, doing any more would be absurdly computationally expensive. At least this solution works in real time and looks better than current rendering techniques

I think video games, like most things, peaked at the hobbyist level (not niche) and just get worse with large-scale adoption. I can't think of much that doesn't get worse when it's being developed for the lowest common denominator. That isn't to say that certain video games and other products don't exist for the hobbyist; it's just that market forces push lower-quality products to the masses.

That I can agree with. Gotta learn to crawl before we can walk before we can run before we can fly before we can transcend the physical realm.

Cool, so graphics are slightly better, like they have been getting for the past 25 years.

Meanwhile they keep building games that look like that, and you toss a nade into the room and nothing is affected physics-wise.

Fuck graphics, work on more important things to make games more realistic

Ray tracing is producing an image by calculating the light from photons and all of the bounces/reflections a photon makes from the photon's source (sun, window, lightbulb, flame) until the photons hit your eyes.
>Do you believe human eyes process all of the photons
That's how sight works, yes, the eye processes all photons that enter it.
>as long as it tricks us
It's going to be a blurred out mess unless that denoiser is something special (read: computationally expensive as well)
>let me compare vision to hearing and tell you what's what
Okay, I can do that too. Sounds above or below a person's hearing range often have harmonic noises and noises from echoes/reverberations that we DO hear, noises that do affect ultimately what a thing sounds like.

Roughly 3 quadrillion photons hit a square meter of area per second.

Some realistic looking lighting crap

Remember physx? Look how that turned out

>quad
quintillion*

To put that into perspective,
If a square meter of sidewalk on a sunny day was rendered onto a 1080p screen that would be 24 billion photons per pixel per frame at 60fps to be physically accurate.
Nvidia is talking about 5 pixels per photon and haven't mentioned if the reflections are accurate or have some low res limit as well.

Nvidia RT is -not- special

5 photons per pixel*

I need my morning coffee
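A quick check of the sidewalk figure, using the 3-quintillion-photons-per-second estimate quoted earlier:

```python
photons_per_m2_per_s = 3e18     # sunlit square metre, from the estimate above
pixels = 1920 * 1080            # 1080p
fps = 60
photons_per_pixel_per_frame = photons_per_m2_per_s / (pixels * fps)
# ~2.4e10, i.e. roughly 24 billion photons per pixel per frame
```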

>t. AyyyMDrone

>actual real life nigger

#GamingEvolved

I can still tell this is a render. When will it get realistic enough to fool people?

Wouldn't be so bad if it weren't for the terrible depth of field shit that isn't even consistent across the image.

It's not just that, it's my opinion that outside of a few select companies, video games don't attract top talent when it comes to programming or design. Back when Carmack worked on Quake or when Newell left Microsoft to found Valve, (PC) games were cutting edge tech, not just entertainment. I think that's no longer the case as it's very rare for a game to push technological boundaries these days.
But still, there are other interesting things that come out of the industry, like the Unreal Engine or Virtual Reality.

#betterred
#radeonrebellion
#poorvolta

That image is from 2006.
This is what more modern raytraced images look like.

Attached: 1509757863598.jpg (1400x1050, 925K)

Blame the Scandinavians.

>Did you hear, my son finally found a job
>Oh Nancy it's about time, we were all worried.
>I know right!
>So what does user do?
>He posts on some internet website
>He...what?
>Yeah the company pays him to convince people to buy their product
>*scoff* Nancy, come on, that's not a real job
>user says if he makes 20 posts a day it breaks even for the cost of living in my basement.
>Nancy he lives with you for free

that was epic, epic for the win! XD

EggsDee I know right!

Was he talking about Nvidia or AMD shills or was he talking about soiboiz that do it for free?
Who knows!

Is it clear whether or not Nvidia will gimp the ray tracing features outside of gaming? They've done this before with various hardware in GeForce cards...

My concern is that they're going to limit NVLink and the ray tracing cores outside of gaming by restricting access via their SDK/drivers. They already do so with the capture interface on all current Nvidia cards, and with multi-GPU outside of a physical chassis.

You seem to know what you're talking about...
May I inquire if you are familiar with :
> extremetech.com/gaming/146781-imagination-technologies-unveils-first-hardware-ray-tracing-graphics-cards
> arstechnica.com/gadgets/2013/01/shedding-some-realistic-light-on-imaginations-real-time-ray-tracing-card/
> imgtec.com/news/press-release/otoy-imagination-unveil-breakthrough-powervr-ray-tracing-hardware-platform-cinematic-real-time-rendering/
> imgtec.com/blog/powervr-hardware-accelerated-ray-tracing-api/

Essentially these guys already did it years ago in hardware and detailed it. They did so through an acquisition of Caustic:
> anandtech.com/show/2752
So, 10 years ago... Someone put ray tracing in hardware. What's old is new again?

looks like shit, like if you used a 4k texture pack in skyrim

That vapourware NEVER hit the market

First my ass

Nvidia's product is REAL, launching on Monday and available either this week or next week

What? The Witcher 3 was the best RPG in centuries.

youtube.com/watch?v=yVflNdzmKjg

THANK YOU BASED NVIDIA

That was slavs not scandis

Oh, they are from Poland, nvm. Who are you talking about?

Why do you people suck at insults?
Why not AMDtard?

Attached: ibftMoNbX7wYpV.jpg (1920x1080, 1.78M)

impressive
you can still tell if you look closely though

Oh OK, fucking OK. When I make a thread about this it only gets one reply and then gets archived, and that one reply tells me to fuck off to /v/, but when you make a thread about this it gets a shit ton of replies and a big discussion.

fuck you Jow Forums fuck.......you

yeah the windows on the left got me
they should really have made more than 4 different ones

shlomovidia

what even ticks the "somethin ain't right" box in your brain?

thanks Clover
don't even bother posting the image

Attached: arnaudimoberstegsceneindustriellef92k1600x2400.jpgx12477.jpg (1600x2400, 847K)

You seem like someone making slogans for the failed state of Cuba or Venezuela

Attached: 1499313540105.jpg (3840x2160, 1.78M)

I was speaking from a technological stand point in regard to understanding how the pipeline and possible cores work... Not from the perspective of some fanboi dick rider like you who only cares about what becomes popular.

The console is all over the place
> custom .dll files
> embree.github.io/ Embree ray tracing
> OpenCL
> CUDA
Then there's Microsoft DX12 Real time ray tracing...
How exactly is nvidia doing ray tracing and what is the broader support for it as in SDK/API access?

top pleb

Gotta hand it to Nvidia for this nice tech demo. I can only imagine how ray traced shadows and reflections would work in a colorful, scenic background area, one that recreates an "epic" feeling like in games such as the FF7 Remake. If Square is competent enough to implement it.

there's something about that picture that makes it obvious that it's fake, like they used a super shitty sharpening filter on the textures

the textures are fucked, the carpet and fabrics look weird

no u

>at the end part
Impressive. It looks so real.

FFXV is getting a Vulkan update to add a raytracing feature.
Nomura is a fucking graphics whore; imagine, "hey, we need to swap UE4 for the Luminous engine."

>>the textures are fucked, the carpet and fabrics look weird
yeah, and that plant also gives it away.

|
|>
|
|3
|

Is this another gimmick like PhysX?

You misspelled Divinity 2

>That's how sight works, yes, the eye processes all photons that enter it.
It doesn't. It does all kinds of tricks to make the image look clear, and even then you can never be sure if what you're seeing is the same as what I'm seeing, let alone reality.

the people who say games are getting worse are the same people who will never know a game exists unless they're blasted by a 100-million-dollar ad campaign.

Riddle me this, Jow Forums
>single pass raytrace
>AI denoise pass
>continue raytracing until convergence
Why not?
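The loop being proposed could be sketched like this; `trace_pass`, `denoise`, and `converged` are hypothetical callbacks standing in for a real renderer:

```python
import itertools

def progressive_render(trace_pass, denoise, converged, max_passes=64):
    """Accumulate one-sample passes, denoising the running average each
    iteration, and stop once the denoised image stops changing much.
    All three callbacks are placeholders for real renderer stages."""
    accum, image = None, None
    for n in range(1, max_passes + 1):
        sample = trace_pass()
        accum = sample if accum is None else [a + s for a, s in zip(accum, sample)]
        image = denoise([a / n for a in accum])
        if converged(image):
            break
    return image

# Toy run: two alternating 1-sample passes average out after two iterations.
samples = itertools.cycle([[1.0, 0.0], [0.0, 1.0]])
result = progressive_render(
    trace_pass=lambda: next(samples),
    denoise=lambda img: img,                         # identity "denoiser"
    converged=lambda img: max(img) - min(img) < 0.01,
)
```

The catch is that every extra pass still has to fit inside the same per-frame time budget.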

PhysX was a middleware SDK with CPU and hardware acceleration support, all of it proprietary.

The real-time ray tracing here is hardware-level acceleration compatible with OptiX, Microsoft DXR, and Vulkan.

The vast majority of games released are getting worse. I'm sorry you're upset that no one knows about your literal who indie dev.

First of all, AI denoise is just a marketing gimmick.
It looks fine on static marketing slides, but looks like garbage in real-life dynamic scenes because of artifacts.
Literally temporal aliasing 2.0.

But to answer your question: even if that shit hypothetically worked, you still have a finite amount of time to render the scene.

Make it somewhat open and allow AMD and Intel to license it. Otherwise it will die like all of their meme tech.

TIL selling a million copies with support from a major publisher still qualifies you as a literal who.

It is, although not Intel, since this has nothing to do with CPUs and their GPUs aren't set to launch for another 2 years.