Crytek just BTFO Nvidia into oblivion:

Real-time ray tracing on CRYENGINE.
"This demo was rendered on an AMD Vega 56 GPU."

youtube.com/watch?v=1nqhkDm2_Tw


Other urls found in this thread:

en.wikipedia.org/wiki/Emission_theory_(vision)
forum.beyond3d.com/threads/next-gen-lighting-technologies-voxelised-traced-and-everything-else-spawn.60873/page-70#post-2062044

holy shit


oof, that's gotta burn

alright that's pretty funny

>rendered
so not realtime then?

I hope you're pretending user


I think you are thinking of pre-rendering.

not that guy but technically the video is pre-rendered even if it was real-time at the time it was recorded.


didn't crytek go bankrupt at some point or am I thinking of some other ded company

I don't often say this, but this warrants it: based

you sicken me

Only the Warface devs, because they left Crytek.

Wait wut, I thought Crytek died two years ago?

Real time ray tracing is the most retarded thing ever

Nice. The demo looks like the next Deus Ex.

LMAO.

Now all we need is more Crytek games, or games that utilize Crytek's voxel-based ray tracing. Then Radeon VII will outshine Nvidia's proprietary RTX bullshit.

This is a legit question: the video says "rendered" but not real-time rendered. It would be silly to have a non-real-time demo of a game engine, but still.

>games spend all their time on realtime raytracing, pbr, reflections, particles, normal mapping and detail mapping

>they all still use that weird lambert diffusion model that makes the entire scene look matte plastic anyway
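for the newfags: the "matte plastic" look happens because Lambertian diffuse makes brightness depend only on the angle to the light, identically for every material. minimal sketch (made-up names, not any engine's actual shader code):

/* minimal sketch of the Lambert diffuse term -- made-up names,
   not any engine's actual shader code */
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* brightness = albedo * cos(angle between surface normal n and light
   direction l); the response is identical for every material, hence
   the uniform matte look being complained about above */
float lambert(vec3 n, vec3 l, float albedo)
{
    float ndotl = dot3(n, l);           /* n and l assumed normalized */
    return albedo * fmaxf(ndotl, 0.0f); /* clamp: no light from behind */
}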

did you watch the same video as me? it even says it in the title.

...

Can someone tell me why everyone is jumping on ray tracing's dick?

What's so special about it? How does this improve gameplay

>gameplay
This is not a gameplay board, this is a technology board

It simulates light bounces like in real life rather than the pathetic approximations real-time rendering has always relied on. It's basically the endgame of lighting.
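conceptually it's tiny: follow a ray, bounce at each hit, add up the light. a very rough sketch against one hardcoded mirror sphere (nothing to do with CRYENGINE's actual implementation, just the general shape of the algorithm):

/* rough sketch of recursive ray tracing against a single mirror
   sphere -- illustrative only, every name here is made up */
#include <math.h>

typedef struct { float x, y, z; } vec3;

static vec3  v3(float x, float y, float z) { vec3 v = {x, y, z}; return v; }
static vec3  add(vec3 a, vec3 b)    { return v3(a.x+b.x, a.y+b.y, a.z+b.z); }
static vec3  sub(vec3 a, vec3 b)    { return v3(a.x-b.x, a.y-b.y, a.z-b.z); }
static vec3  scale(vec3 a, float k) { return v3(a.x*k, a.y*k, a.z*k); }
static float dot3(vec3 a, vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

static const vec3  CENTER = {0, 0, -3};   /* one hardcoded sphere */
static const float RADIUS = 1.0f;

/* distance along the ray to the sphere, or -1 on a miss
   (dir assumed normalized) */
static float hit_sphere(vec3 origin, vec3 dir)
{
    vec3  oc   = sub(origin, CENTER);
    float b    = dot3(oc, dir);
    float c    = dot3(oc, oc) - RADIUS * RADIUS;
    float disc = b * b - c;
    if (disc < 0) return -1;
    float t = -b - sqrtf(disc);
    return t > 1e-3f ? t : -1;
}

/* follow the ray; on a hit, bounce off the mirror and recurse */
static vec3 shade(vec3 origin, vec3 dir, int depth)
{
    float t = (depth > 0) ? hit_sphere(origin, dir) : -1;
    if (t < 0)                       /* miss, or out of bounces: sky */
        return v3(0.5f, 0.7f, 1.0f);
    vec3 p = add(origin, scale(dir, t));
    vec3 n = scale(sub(p, CENTER), 1.0f / RADIUS);  /* surface normal */
    vec3 r = sub(dir, scale(n, 2 * dot3(dir, n)));  /* mirror reflection */
    return scale(shade(p, r, depth - 1), 0.8f);     /* lose 20% per bounce */
}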

It's merely bringing some "physics", like proper lighting/reflections, into the game. Now a game would be interesting if there was global "gravity" rather than the stupid ragdoll physics we use in many games today. It's a step toward the future.


>endgame
Not even close, my retarded friend. Their light model is the reverse of real life. It's as if you had laser eyes and shot photons out of your eyes to scan the surroundings, instead of photons being scattered and entering your pupil.

>"runs on mainstream hardware"
>doesn't state at what framerate

Probably rendered at 0.2fps and then sped the video up desu

>Now a game would be interesting if there was global "gravity"
Uh how do you think things fall down in games?
Gravity is the easiest shit to simulate and games in the '70s already had it: literally just add some downward velocity (or velocity toward a point) every physics step. Movement and gravity are solved problems; stuff like physics interactions between objects is a lot trickier.
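this. per object, per tick, it really is just explicit Euler (sketch, made-up struct):

/* gravity as done in basically every game: one line of explicit
   Euler integration per object per physics tick */
typedef struct { float px, py, vx, vy; } body;

void physics_step(body *b, float dt)
{
    const float GRAVITY = -9.8f; /* m/s^2, downwards */
    b->vy += GRAVITY * dt;       /* accumulate downward velocity */
    b->px += b->vx * dt;         /* then move by the current velocity */
    b->py += b->vy * dt;
}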

How does 3D improve gameplay? Video games were more fun back on the NES than anything released in decades.

Functionally identical my friend. en.wikipedia.org/wiki/Emission_theory_(vision)

The NES came out in the 80s, user. These zoomer millennials were born in 2000 or later and have never known a life before 3D graphics. They have never heard of assembly and think 8-bit refers to graphics.

No it literally is not. Not even close. Not even remotely close.

Huh? There are a lot of great games now, and I believe the lighting effects would help with the atmosphere in something like Subnautica or whatever.

Are you just pretending to be retarded? Why would the engine simulate photons that never reach the camera?
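this. the whole point of tracing "backwards" from the eye is that every ray you shoot is guaranteed to land on a pixel, while photons fired from the lights mostly never reach the lens; by reciprocity the light path is valid in either direction anyway. rough sketch of per-pixel ray generation (made-up toy camera, not any engine's code):

/* backward tracing step 1: one ray per pixel, from the eye through
   the image plane -- made-up toy camera, not any engine's code */
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

int main(void)
{
    const int   W = 4, H = 3;  /* tiny "image" for the example */
    const float fov = 1.0f;    /* half-width of the image plane at z = -1 */
    for (int py = 0; py < H; py++) {
        for (int px = 0; px < W; px++) {
            /* map the pixel center onto the image plane in front of the eye */
            float u = (2.0f * (px + 0.5f) / W - 1.0f) * fov;
            float v = (1.0f - 2.0f * (py + 0.5f) / H) * fov * H / W;
            vec3  dir = { u, v, -1.0f };           /* camera looks down -z */
            float len = sqrtf(dir.x*dir.x + dir.y*dir.y + dir.z*dir.z);
            dir.x /= len; dir.y /= len; dir.z /= len;
            /* a real tracer would now call shade(eye, dir, max_bounces) */
            printf("pixel (%d,%d) -> ray (%.2f, %.2f, %.2f)\n",
                   px, py, dir.x, dir.y, dir.z);
        }
    }
    return 0;
}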

We have hacks right now. They do the job on a one-to-one or small-numbers basis, but we don't have proper global gravity. Practical gravity limited to object X and object Y is fairly easy, and so is the practical lighting between object X and object Y. Proper raytracing fixes the lighting hacks we use, but proper gravity is not here yet. It would take either a novel approach in software or a hardware upgrade to get global gravity.

It's close enough that it took us a long-arse time to disprove emission theory. If you travelled backwards in time, shit would look the same.

I never said they had to; god knows you can precompute visibility by calculating the probability of a photon reaching the eye from some initial position and momentum. Stop raising strawmen, you retards. Eyes do not work by shooting lasers; ray tracing is not the endgame. A photon collector, a camera, is the endgame.

This is not raytracing

It's garbage from a dead company; no games use CryEngine these days

Only Nvidia has real-time raytracing. That kills AYYMD, which is at least 6 years behind; AYYMD didn't even have a compute GPU until GCN in 2012, while Nvidia had one in 2006

No, no it would not, god damn the retards here. Time is not reversible like that; photons are not deterministic. Fucking CS people never learned anything beyond Newton.

>But proper gravity is not here yet. It would take either a novel approach in software or hardware upgrade for global gravity.
That makes zero sense at all. Do you even understand how a physics engine or gravity works?

Simulating surface gravity is as simple as velocity.y += gravity * delta, with gravity being something like -9.8.
Point gravity for celestial mechanics is a bit trickier but still easily done with age-old equations, what problem are you trying to solve exactly?
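for reference, point gravity really is just Newton's law aimed at a body instead of straight down. sketch with made-up units (a real game would use a scaled constant, not SI G):

/* point gravity toward an attractor: F = G*M*m/r^2, so the
   acceleration on b is G*M/r^2 along the direction to the attractor */
#include <math.h>

typedef struct { float px, py, vx, vy, mass; } body;

void point_gravity_step(body *b, const body *attractor, float dt)
{
    const float G = 6.674e-11f;       /* gravitational constant (SI) */
    float dx = attractor->px - b->px;
    float dy = attractor->py - b->py;
    float r2 = dx*dx + dy*dy + 1e-6f; /* softening: avoid div by zero */
    float r  = sqrtf(r2);
    float a  = G * attractor->mass / r2;
    b->vx += a * (dx / r) * dt;       /* accelerate toward the attractor */
    b->vy += a * (dy / r) * dt;
    b->px += b->vx * dt;
    b->py += b->vy * dt;
}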

> photons are not deterministic
Not decided yet, AFAIK.

What is the practical difference between inverted photons and "proper" photon capturing? It's literally just an optimization with no visible difference.

It makes literally no fucking difference.
Optics works just as well in "reverse". Ever seen that fancy rainbow shit on the Pink Floyd album? That works both ways, as shown experimentally by Newton literally 350 fucking years ago.

t. Physicist.

>hurr durr NASA physics uses Newtonian gravity with forward explicit Euler integration
Let me guess you are a gamedev

Simulation of luminance as opposed to just sampling the predetermined luminance of an object.

There's no functional difference between NG and GR until you get to cosmic scales, but there are toolkits for simulating GR.
You're still not making any sense; what does "global gravity" even entail?

it says real-time in the title, and in the video at 8 seconds

>photons are deterministic
Have you ever used a polarizing filter?

So you're of the belief that raytracing is useless?

>the controller only has 2 buttons + d-pad for gameplay
rather limited console desu

Looks like shit desu.
Nowhere near the level of what RTX has to offer.

It makes lighting as simple as placing a camera.

It looks better than RTX, where everything is a mirror

Crytek didn't btfo anything, SVOGI is not raytracing

Rasterization can fake things but it will never be as good as raytracing, the industry standard in movie rendering

I'm talking about physics, not lighting.
Ray tracing is great because it makes a huge difference in visual quality compared to traditional real-time rasterization; that has nothing to do with the problem of game physics and gravity.
Games do gravity and movement fine. What they suck at is the physical properties of objects, like breakability, flammability, and other complex physical interactions that are not related to gravity and can't be easily solved in a "global" way without literally simulating reality down to the quantum level.

>inb4 /sci/fags going NUH UH GRAVITY AFFECTS LIGHTING as if that had any practical bearing in a videogame

>mfw don't really care all that much for RT'd refractive surfaces

gib real time RT'd GI at decent sample counts so we can finally stop having so many games with bolted down scenery.

Why is everything so shiny?
Same shit was in Crysis2+

Because reflections are the most obvious way to showcase accurate light bouncing.

REEEEEEEEE
DELET DIS
STOP THIS ANTI SEMITIC PROPAGANDA RIGHT NOW
STOP KVETCHING AND JUST BUY IT
REEEEEEEEEEEE


>never played a space game where light bends around the sun and planets
Ok buddy

But it looks very "cheap" to me.
Most of the surfaces in the video aren't supposed to be so "plasticky".

>it has a huge difference in visual quality
Not really. We have a lot of ways to fake it with static calculations. The only thing raytracing did was allow real-time calculation of global illumination.

Because the purpose of the demo is to showcase reflective surfaces, not how much grime they can put on a texture.

Reminder, nVidia switched from Crytek to Unreal to co-develop the engine. Crytek's raytracing is now out there to advertise to AMD that they should work together.

>brainlet (((scientists))) of the 21st century can't determine x, therefore x is not deterministic.

>tfw both our shitposts have been set in motion at the big bang and there was nothing that could have been done to prevent them from happening today.

cope harder nvidia shill

/thread

Nope, sorry. This thread will continue whether your masters at Nvidia want it to or not.

Nigga what? The demo looked extremely good considering it's running on what's a mid-range card by today's standards.

Why does that user want hyperreal gravity on games of all things? Do you want DF to evolve even further beyond?

>kek just imagine how much better it would look on nvidia gaytracing

i wanted to make some game with Unreal Engine 4, but i just downloaded CryEngine to test it, and there isn't any fee required anymore with Crytek, amazing

Unreal Engine 4 = nVidia engine. Crytek engine = hardware agnostic engine.

forum.beyond3d.com/threads/next-gen-lighting-technologies-voxelised-traced-and-everything-else-spawn.60873/page-70#post-2062044

You can fake things with rasterization but it will never be as good as real raytracing

Attached: edgescrjrv.png (1033x668, 718K)

Because games were made for fun+profit rather than maximum profit possible.

Unreal Engine looks better than this garbage

Found the chink shill

KRAUT SPACE MAGIC


>this level of damage control

I don't think the claim was ever that only Nvidia could possibly do raytracing. The claim is that, at the current time, only Nvidia can offer accurate raytracing instead of merely approximate raytracing while maintaining even remotely decent performance, by utilizing their dedicated RT and tensor cores.

>You can fake things with rasterization
Well, yeah. That's pretty much what rasterization does: produce a convincing lie that doesn't take fucking ages to compute.

But nvidia can't even do that.

No!!! I need validation for wasting money on an RTX 2070...


>real raytracing
RTX isn't "real raytracing" my man

Quake 2 is entirely path-traced

Raytracing has been a thing since the 80s, but only now are GPUs powerful enough to render it in real time. With raytracing, implementing things like mirrors becomes a lot easier because the behaviour of lighting is closer to real life.

>but only now are GPUs powerful enough to render it in real time.
Maybe something like the HGX-2, but RTX is fake AI-assisted ray tracing

people were not saying that when they first introduced tessellation xD

>1spp
>real raytracing
that pic still looks better than any reflection novideo has implemented in recent games (BF V's RTX reflections are so bad that they're mixed 50% with SSR in the final framebuffer)
lmaoing at your pathetic nvicuck lyfe

>think 8-bit refers to graphics.
To be fair, so did everyone else back in that era.

It absolutely is.

>but only now are GPUs powerful enough to render it in real time

Still years away. No single GPU can do raytracing in real time; right now the 2080 Ti only uses raytracing with ultra-low sample counts and at a reduced resolution. The horrible results go through a denoising pass and can be used for reflections and shadows. Just don't look too close at them...

So pretty much everything in games with RTX is rasterized except for reflections. As new GPUs come out, the sample counts and resolution will increase and raytracing will start to take over the rest of the raster engine's job, but that's 3 years away minimum.

PS: except Quake. There's a fully path-traced Quake 2, but AFAIK it still has noise and it's using some performance tricks that can't be ported to complex models/shapes.

It's the same dirty trick as DLSS: just render a shitty version at 1spp and fill in the gaps with magic DL.
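for anyone wondering, "spp" is just how many random rays you average per pixel, and the noise only shrinks like 1/sqrt(n), which is why 1spp looks like garbage without a denoiser. sketch with a stand-in sample function:

/* why 1spp is noisy: each sample is a random estimate of the pixel's
   true color, and averaging n of them only shrinks the error like
   1/sqrt(n) -- stand-in sample function, made-up names */
#include <stdlib.h>

typedef struct { float r, g, b; } color;

/* stand-in for one random path-traced sample of a pixel */
static color sample_pixel(int px, int py)
{
    color c = { rand() / (float)RAND_MAX, 0.5f, 0.5f };
    return c;
}

static color render_pixel(int px, int py, int spp)
{
    color sum = {0, 0, 0};
    for (int i = 0; i < spp; i++) {     /* RTX titles: spp ~ 1-2, then denoise */
        color s = sample_pixel(px, py); /* offline film renders: hundreds+ */
        sum.r += s.r; sum.g += s.g; sum.b += s.b;
    }
    sum.r /= spp; sum.g /= spp; sum.b /= spp; /* Monte Carlo average */
    return sum;
}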

One-sample-per-pixel implementations do _not_ qualify as real ray-tracing.
Better luck on the next iteration, Jensen.

Wait, why the fuck do the cards have RT cores if you can do RTX without even needing RT cores?


nVidia moved from Crytek to Epic. Times changed. Maybe Crytek found out why nVidia wanted the high tessellation, thought it was "evil", and nVidia stopped funding them. Who knows.

hopefully the implementation isn't optimised around the vertex buffer, so they can modify visibility by reflection depth and actually do gameplay mechanics with this, not needlessly decimate performance for very little gain over environment mapping.

A) The shittiest quality ray-tracing is still ray-tracing.
B) You can take as many samples as you want; there are no limitations in the DXR or RTX implementations. Whether the hardware can support your specific implementation is another discussion.

According to A and B, then, we have been able to do real-time raytracing for years now.

>The shittiest quality ray-tracing is still ray-tracing.
Nice cope, but no, ray-casting or path-tracing is not real ray-tracing.