Ray Tracing in 20-30 years, when it can be used fully (no hybrid solution) while maintaining 4K60fps? Why not.
But as of right now, it's 100% useless and an Nvidia scam. What we need isn't muh Ray Tracing and muh RTX. We need smart visual artists, smart engineers and a great graphical engine.
Ground Zeroes PC: Screen Space Reflections IN REAL TIME (no fake cubemap) and it looks great, without a significant performance drop. Of course ray tracing is technologically superior, but it's FAAAAAAAAAAAR away.
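For anyone who doesn't know what screen space reflections actually do: here's a toy sketch of the idea (my own simplified illustration, not Fox Engine code), reduced to a 1-D depth buffer. You march the reflected ray through screen space and return the first pixel whose stored depth is in front of the ray.

```python
# Minimal screen-space reflection sketch (illustrative only; not Fox Engine code).
# Works on a 1-D "depth buffer": march a reflected ray in screen space and
# return the first pixel whose stored depth occludes the ray.

def trace_ssr(depth_buffer, start_px, start_depth, step_px, depth_slope, max_steps=64):
    """March from start_px; each step moves step_px pixels and deepens the ray
    by depth_slope. A hit occurs where scene geometry is in front of the ray."""
    px, d = start_px, start_depth
    for _ in range(max_steps):
        px += step_px
        d += depth_slope
        if not (0 <= px < len(depth_buffer)):
            return None          # ray left the screen: no data (SSR's big limitation)
        if depth_buffer[px] <= d:
            return px            # geometry occludes the ray: reflect this pixel's colour
    return None

# A flat floor at depth 10 with a "wall" at pixels 7-9 sitting at depth 5.
depth = [10, 10, 10, 10, 10, 10, 10, 5, 5, 5]
hit = trace_ssr(depth, start_px=2, start_depth=0, step_px=1, depth_slope=1)  # hits the wall
```

The `return None` branch is exactly why SSR can only ever be an approximation: anything off-screen simply doesn't exist to it, which is where a cubemap fallback or real ray tracing comes in.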
Ray Tracing is 100% Overhyped tech
kys amd poofag shitposter
More Comparison pic.
international.download.nvidia.com
international.download.nvidia.com
>4K60fps
Am I the only one who thinks 60 fps looks like dogshit?
I hate using monitors under 180hz
The Nvidia guide for the game: geforce.com
As you can see, it's 100% real time and extremely impressive. It was toned down a lot in TPP on PC.
I think the most impressive part is the cave area where you have to rescue the girl prisoner in the Ground Zeroes mission, with the lights bouncing off the walls. Extremely impressive, and it doesn't feel exaggerated, it just feels... right. The Fox Engine is a god engine with the right devs. The game's assets themselves look pretty meh, but once you put the correct physically based lighting on top, everything blends together and everything looks great.
We need smart devs, not lazy devs who wait on "muh hardware, muh ray tracing" to do all the work for them.
Just look between 14:50-19:00
youtube.com
The lighting is VERY good, without a major performance hit and without muh raytracing. SMART DEVS and a GREAT ENGINE.
Why does the water with this shit enabled look like a blurry fucking mess?
People here don't realize that you can do very convincing, physically based lighting (like Ground Zeroes PC and the Fox Engine's physically based rendering) without having to use an extremely avant-garde technology that won't be seriously and fully exploitable in real time for another 20 or 30 years.
It's what Nvidia is trying to do with their RTX cards: supporting raytracing directly in the hardware. But guess what? That doesn't make miracles.
Just look at how their "hardware based" PhysX features such as HairWorks and other Nvidia-exclusive graphical options run on Nvidia cards that are supposed to have hardware support for all of it: they still run like shit. Then guess how it's going to be with a far more demanding tech like Ray Tracing.
Like I said, 20-30 years before 4k@60fps with full authentic Ray Tracing.
60 is about where the limit is for what you can go back to.
Granted, you are able to go lower as long as frame timing is great (as in, say, Doom, or on an adaptive sync monitor), but 60 is where games start to feel good enough that it's not an issue. Somewhere around 90 is where I personally stop caring; while I can go up to 144, I'm not turning settings down to get that outside of fast paced FPS games.
Driving games I would also say I would turn settings down for, but honestly VR racing is so much better than flat racing at a high frame rate.
Go outside more.
>amd shitposter
> Ground Zeroes PC being a "The way it's meant to be played" Nvidia sponsored game
Fuck off back to /v/.
Honestly, the Battlefield V 60fps-at-1080p thing is happening because they had zero Turing optimizations; they were still brute forcing it like they were with Volta for testing. How big of a hit you take is still up in the air, and at the time DICE showed it off, the rays were still 1:1 resolution with the display.
As much as I think this shit is too early, I still want to see final implementations in games before I really judge it.
It's RTX and Ray Tracing related so it's Jow Forums relevant.
Just cut out the manchild toys, then. They are not to be discussed here.
Metal Gear may be a TWIMTBP game, but it's not the usual "game is fucking broken now" implementation.
Autism
Yes, of course. But see how bad their Nvidia PhysX implementation is for things like HairWorks, particle physics and other Nvidia-exclusive features? Even though Nvidia's cards have hardware support for them, they still run like shit on Nvidia cards and are still a massive performance hit. So I don't have any trust in Nvidia to handle an extremely performance-heavy tech such as Ray Tracing if they can't even master "simple" particle physics or hair physics with hardware support on their GTX GPUs. I expect 20-30 years before a proper, full-speed real-time implementation.
>we need a great graphical engine
Vulkan looks promising, and ironically it seems to be the one that handles RTX the best as well.
> Video Game's Graphical Engines and Graphical rendering options aren't technology
> Nvidia GTX and RTX cards, which primarily target video gamers, aren't technology
Imagine being 1-digit IQ.
>X is technology, therefore it belongs on Jow Forums
You know how I know you've only been here for a few weeks?
>in 20-30 years
>4k60fps
That would be just sad.
thats pretty good, some images fooled me
The Ubisoft CEO says that streaming is the future for games. So fuck upgrading to an RTX. Get yourself 1Gb/s Korean internet.
Why do people invest so much time in convincing others they don't need something?
People are going to buy the RTX because it's the latest and greatest.
Left looks better. Which one is raytraced?
crap, it's from 2012 lmao
Well, the one of the Left is the "Real Life" picture, which is "naturally" ray traced.
The one on the right is made inside the FoxEngine, without ray tracing.
If only we had the map editor in TPP...
youtube.com
Because it's not needed right now, the price jump is ludicrous and you won't really benefit from the ray traced aspect.
Well, Ray Tracing is extremely heavy (and what was shown in BFV and Tomb Raider are extremely bastardized, hybrid versions of Ray Tracing, not the real thing), so I'd be surprised if we go past this framerate and resolution with full raytracing.
Just chill, AMD will eventually do raytracing too, as soon as games use it to the point of being unplayable without it.
>makes a GPU thread in /v/
>thread gets deleted in 10 minutes
BTFO for not being related to /v/
Imagine a market of videogames where "You must have a Nvidia video card to run this game."
Of course they will, and Intel will too if they are serious about their gaming GPU. Then Sony and Microsoft will jump in as well (but not in the next generation; more like the generation after, so in a very long time). It's going to be a rush toward ray tracing. They won't push past 4K60fps for now; that will become the standard on PC and consoles for the future. Next GPUs won't be developed with the purpose of running traditional rasterized games (non ray tracing games) better; they will be developed with better raytracing performance in mind.
What does that mean? It means that Nvidia/AMD/Intel will allocate more and more of their GPUs to raytracing and less and less to traditional rasterization. It wouldn't surprise me if, after Turing, we never see another significant performance increase for traditional rasterized games and everything goes toward raytracing improvements. In other words, you can forget your 8K@240fps gaming; that's not the objective now, it's raytracing, and they will aim for 4K@60fps. Moreover, it wouldn't surprise me if, with future raytracing gaming GPUs, we can't play traditional rasterized games at all anymore, since there won't be any hardware for that left on the GPUs, only dedicated raytracing elements.
The kids would be so upset.
Physics got the shit end of the stick because they handled it with CUDA cores instead of dedicated silicon, which means it has to share with the rest of the game's shaders.
HairWorks in particular did the retarded double whammy of being tessellated to fuck and back, along with each hair being its own object rather than handling hair in volumes.
PhysX actually works really well when they are not abusing it to fuck with old-gen cards and AMD.
But that's that and this is this. Ray tracing shouldn't be using any of the normal game's shaders to run; however, it seems like they are coupling ray tracing speed to the card. Take a look at Battlefield V: that game was 1080p with what they had enabled, on Volta optimizations; as in, if they opened it up, it would run about this well even on AMD hardware, if I remember the numbers correctly. However, Turing in the demos shown was a 5-10x performance jump over Volta specifically for ray tracing. Honestly speaking, if they allow themselves to go a bit grainy with the details, we could see near-full ray tracing soon, at least in terms of reflections, refractions and shadows; full environments may be a bit further off. But DICE saying they could have the rays render at 720p instead of 1080p would basically mean it could act like shadow maps do, except a more dynamic version: the closer you get, the better it looks, rather than the closer you get, the bigger the jagged edges get. And with a bit of blending on the AI/tensor core side, it could be a smooth edge... you know, I wonder how that would work. Would it be possible to AA the edge of a ray traced shadow, or would it cost less to just increase the ray resolution overall?
I imagine 90% of people will never care, as that's about how big a market force Nvidia is.
AMD already does ray tracing, but they were showing off baking rather than real time. My understanding is all the demos were shown without optimizations, as Nvidia only got Turing out about a week before they showed it off, so it was using Volta-level optimizations, which I believe AMD is already as good as or faster than Nvidia at. They don't have shit on Turing if Octane is to be believed, but there's no real way to tell until we get this stuff in our hands.
Ray Tracing is HairWorks 2.0. It's just here to keep people buying the full range of cards, including the 10 series, until they switch to 7nm.
>turn on 4k in Andromeda
>literally can't see a difference
So why is this case of snake oil being pushed so hard exactly?
>Of course, there are a number of other interesting tricks and optimisations which are implemented to ensure consistent 60fps performance at 1080p resolution. Instead of letting rays bounce around continuously, the second reflection ray from a reflection - the reflection of a reflection, if you will - is not cast back into the BVH. Instead, the ray is shot back into the pre-rendered cube maps already scattered around the game level for the standard presentation. This means that reflections within reflections - such as the reflection on a character helmet as seen in a mirror - are more accurate versions of standard cube map reflections.
Not too excited for those further optimizations just yet.
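For what it's worth, the fallback described in that quote is simple to sketch. Here's a toy version (scene representation and names are made up for illustration): the first bounce gets a real trace, the reflection-of-a-reflection samples a pre-baked cube map instead of going back into the BVH.

```python
# Sketch of the quoted optimization: the first reflection bounce is properly
# ray traced; the second bounce samples a pre-rendered cube map instead of the BVH.
# The dict-based "scene" and "BVH lookup" are stand-ins, not a real renderer.

CUBE_MAP = {"sky": (120, 180, 255)}     # stand-in for a pre-rendered cube map sample

def trace(scene, ray, bounce):
    hit = scene.get(ray)                # stand-in for a BVH intersection test
    if hit is None:
        return CUBE_MAP["sky"]          # ray escaped the scene
    if not hit["reflective"]:
        return hit["colour"]
    if bounce >= 1:                     # reflection of a reflection: cube map fallback
        return CUBE_MAP["sky"]
    return trace(scene, hit["reflected_ray"], bounce + 1)

scene = {
    "r0": {"reflective": True, "colour": None, "reflected_ray": "r1"},
    "r1": {"reflective": True, "colour": None, "reflected_ray": "r2"},  # r2 never traced
}
colour = trace(scene, "r0", bounce=0)   # second bounce reads the cube map, not the BVH
```

The point being: the expensive part (BVH traversal) is capped at one bounce, and everything past that is the same old cube map tech, which is why nested reflections in BFV are only "more accurate cube maps" rather than true recursion.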
Buy glasses.
>I imagine 90% of people will never care, as that's about how big a market force Nvidia is.
>consoles use AMD
>Buy glasses.
20/20 vision. I must be doing something wrong. But the picture actually looks somewhat worse now that I can see shit.
I think it's because it's a pirated repack. I bet the 4K textures were tossed out.
Definitely something wrong with the game, or with your graphics card and its settings.
What about Nintendo Switch? It uses Nvidia's Tegra.
Because Tegra was the only chip on the market at that point.
Yes... those weak consoles that eat shit and enjoy it... yeah, if someone wants to make an Nvidia-exclusive game, it's going to be on PC, and no one will care because everyone playing games uses Nvidia. Shit sucks and I'm most likely getting a Navi, but I have come to realize people don't want AMD to exist in gaming.
That was a consequence of Volta being what they were able to test on. Granted, stamping out the hall-of-mirrors effect won back some performance, but remember, the code path was not using the ray tracing segment of the GPU at all; Volta is, at least according to Octane, about 1/10th as powerful when it comes to rays. The Battlefield demo also wasn't using the on-GPU denoiser, and instead used a custom solution, because the denoiser wasn't available to them before they were given Turing.
I'm not saying it's ready, but what I am saying is the demos we have right now are mostly bullshit; we need these GPUs in the hands of third parties who aren't licking Nvidia's sack.
kys poojet
back to Jow Forumsamd
>Muh amd
>SEETHING
If AMD came with HW raytracing instead of Nvidia, you'd be shilling all over the board about how it's the technology of the future.
Personally I don't give a shit, I'm not buying a Turing GPU because the price for performance increase over Pascal is pathetic.
Cringe
Based
I'm a Nvidia user idiot, I've never owned AMD hardware. That doesn't mean I'll follow Nvidia blindly and that I won't criticize when they are acting like kikes.
This is just cherry picked as fuck OP
It's not cherry-picked, it's the case of an extremely talented studio, Kojima Productions, and an engine built from scratch to support physically based lighting. It's 100% possible if you have the will to do it.
Moreover, if you had played Ground Zeroes on PC at ultra settings, you would see that the whole level benefits from it; some areas more than others, but it improves the visuals everywhere.
neither
The problem isn't the hardware/driver implementation, but how they're used by software.
It's not just
>this here is an 'apple'
>I'm going to put it at x y z and want you to take care of all of its physics interactions with everything else
>I'll just deal with the preset animations, model and texture updates, and tell you when to stop because I don't need it anymore
>good luck
You have to KNOW how to use the physics engine in order to use it properly. You have to figure out the right numbers for what you're trying to do, like mass, density, rigidity, elasticity and so on. And for everything to work together, you have to find the right numbers for everything in the scene.
No wonder ragdolls start seizing and stretching like crazy, since the walls and floor around them are completely static and only have preset values for physics interactions.
Having said that, there really is one big problem with current GPU physics implementations: they're unable to run proper, full-fledged fluid dynamics computations in real time. That means, for example, that hair and "light" particles (like snow) only interact with emulated environments, not with the air or other fluids around them. To see why this is a real problem, just look at big projects that deal with aerodynamics: they only run simulations (and not in real time) in the initial phases, to come up with good design contenders that then get longer simulations and are finally tested in real wind tunnels. Even for small simulations, they don't use just one or two 1080 Tis; I'd guess not even just one Tesla array. And they're still not real time, because current real-time simulations won't come even close to the real thing; most often they won't even be able to show you if something is wrong.
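To make the cost concrete, here's a toy explicit diffusion step, which is roughly the innermost kernel shape of many fluid solvers (my own minimal example, nothing from any real engine):

```python
# Toy explicit diffusion step on a 2-D grid: a 5-point stencil, the innermost
# kernel shape of many fluid solvers. Even this trivial pass is O(cells) per
# substep, and real CFD needs 3-D grids, far finer resolution, and many
# substeps per frame -- which is why real-time full CFD is out of reach.

def diffuse_step(grid, k=0.1):
    h, w = len(grid), len(grid[0])
    out = [row[:] for row in grid]          # boundary cells stay fixed
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (grid[y-1][x] + grid[y+1][x] +
                   grid[y][x-1] + grid[y][x+1] - 4 * grid[y][x])
            out[y][x] = grid[y][x] + k * lap
    return out

# A single hot cell spreads to its 4 neighbours after one step.
g = [[0.0] * 5 for _ in range(5)]
g[2][2] = 1.0
g = diffuse_step(g)
```

One pass over a 5x5 grid is nothing; scale that to a 512^3 grid with multiple velocity/pressure passes per frame and the arithmetic alone explains why games fake it with particles instead.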
If your game relies on raytracing to be good, it's probably not going to be good.
288hz is the sweet spot
With that paragraph alone you've proven you know fuck all what you're talking about.
indeed.
The puddle looks like shit to me.
Buy better glasses.
>early 2000s fake reflections
Looks bad, man.
(((people)))
It's in real time; you'd know if you had played the game.
Pirate it if you wish, Ground Zeroes doesn't have Denuvo.
Ubisoft are also a bunch of cunts.
nVidya plays the rope-a-dope with meme trace and outrageous pricing so that reasonable people go buy AMD stuff instead.
Once upon a time, Ford and others conspired to take down an emerging auto company because they felt "people weren't ready for innovation"
Market manipulation happens.
If you don't understand the importance of raytracing, it means you are another gaymer who plays shitty games.
If it were up to you simpletons, we would still be playing Counter-Strike tier games. Fuck you.
Implying this wasn't already the case. So many games shipped with Nvidia GameWorks and other such garbage that ran poorly on AMD cards.
Of course Ray Tracing is technologically far superior and the end goal, but it's way too far off to shill it the way Nvidia does, especially at this price tag.
This isn't even real ray tracing; they literally use machine learning as a substitute to fill in the gaps, so it's not physically based at all. I've used OptiX before (pre-acceleration) and it's a recursive, stack-based architecture. Nvidia is doing some real marketing bullshit here; the benefits of their hardware acceleration are next to nil.
> Counter-Strike tier games
You mean one of the most popular, entertaining and engaging games of all time? Gee whizz, that sounds bad.
>this isn't even real ray-tracing
Do you know what machine learning is or how it works?
Does that mean a self-driving car isn't real driving because machine learning is helping it get there faster?
Have you looked at Nvidia's research videos on YouTube and seen the comparison between raytracing and machine learning raytracing? It's almost indistinguishable to the untrained eye, and it will only get better and better.
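To be clear about what "filling in the gaps" means: the rays give you a sparse, speckled signal and a filter smooths it out. Here's a crude box-filter stand-in (the real Nvidia denoiser is a trained network plus temporal accumulation, not this; this only illustrates the concept):

```python
# Crude stand-in for denoising a sparse ray-traced signal: average each sample
# with its neighbours. Nvidia's actual denoiser is a trained network with
# temporal accumulation; this box filter only illustrates the "fill the gaps" idea.

def box_denoise(samples, radius=1):
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - radius), min(len(samples), i + radius + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))   # mean of the local window
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0]   # 1-sample-per-pixel style speckle
smooth = box_denoise(noisy)          # speckle flattens toward the true average
```

The argument in the thread is really about whether this reconstruction step counts as "real" ray tracing; either way, the rays are real and the filter only decides how few of them you can get away with.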
>they're unable to run proper full-fledged fluid dynamics computations in real time
You are retarded and don't know what you are talking about. GPU fluid dynamics is very possible nowadays.
Consider:
Fox Engine is Black Magic
I'm not in complete agreement with this, but I do find it hard to accept that digital artists and game designers have hit a wall on what can be done with pre-baked lighting and SSAO.
Yes, we've all seen this video a ton of times now, but youtube.com
The more you buy the more you save.
>if it was for you simpletons we would still be playing counter strike tier games. Fuck you.
?
What are you getting at? How is Counter-Strike a bad game?
>you'd be shilling all over the board about how it's the technology of the future.
I would too, because I know they wouldn't have the audacity to charge $400 fucking dollars for it.
no... at this point its not ((())) its everyone
>"Why cant amd put out a high end gpu so nvidia has to lower the prices"
This isn't the ((())) anymore, this is something the ((()))'s wish they could instill.
fpbp
neck yourself jelly poorshit
wow what a fucking disaster, it actually decreases the immersion imo. Just buy goyim
If a lot of people buy overpriced shit, it will send the message to nvidia that they can overprice their shit
If people are wary and reject overpriced shit, it will send the message to nvidia that they must price their shit competitively
What is so hard to understand about this concept?
>latest and greatest
Pull your consumerist head out of your ass. You may be able to afford these cards now, but if the trend continues, chances are you won't be able to afford them at some point.
And was doing goy tracing in their tech demos 10 y ago lol