Enlisted running on Vulkan with RTX global illumination (4K 90FPS)

Where were you when Nvidia btfo the competition.

Attached: btfo.jpg (2686x1300, 751K)

Other urls found in this thread:

youtube.com/watch?time_continue=11&v=AvXra6yqEZs
cgicoffee.com/blog/2018/03/what-is-nvidia-rtx-directx-dxr
codercorner.com/blog/?p=2013
twitter.com/SFWRedditVideos

youtube.com/watch?time_continue=11&v=AvXra6yqEZs

...

SIR PLS DELET

>game looks like shit
>lets run it at high DPI so we can see the shit clearly
Novidya just seem like that cunt on Gumtree/Craigslist that's selling an old graphics card for way too much money because it can play CSGO at over 100 FPS.

does it run league of legends? literally the only reason i keep a windows dual boot nowadays

>game looks like shit
does pic related look like shit ?

Attached: enlisted.jpg (1200x675, 199K)

It really does, those illuminated surfaces/shadows have absolutely no radiance, sub surface scattering, or any basic kind of AA.

typical autist. No wonder faggots like you don't care about picture quality, only "muh 0.000,000,6 second fps spikes". At least post the gameplay footage.

> no radiance, sub surface scattering
stop pretending to be an idiot, you can literally see sub surface scattering on the guy's face.

Attached: enlisted2.jpg (1920x1080, 269K)

Looks good here
Looks like shit here

First one looks like a render, second one looks like in-game footage. Both have shitty textures.

Are you fucking blind, his face is brighter. That's it.

Anyway, what a huge fucking disappointment: even novidya, with billions of dollars to burn on R&D, can't make an actual ray tracing accelerator worth a shit. It's not even real ray tracing like pic related. Pathetic.

Attached: TRC2014-Laura_Marie_Lediaev.jpg (1600x950, 329K)

I'd rather play arma desu.

nvidiots don't give a fuck about picture quality. Reminder that these placebo-driven fangays are so autistic that nvidia once got away with reducing picture quality to boost fps. Because their consumerbase literally cannot see any difference. All they want is to push everything to ultra and get 105 fps with 6gorillion fps spikes every once in a while.

> nvdia cannot even do 4k 60fps on a single gpu, its shit
> well yeah it can
> b but it cannot even do raytracing at 4k 60fps
> well yeah it can
> b but but thats not even real raytracing
> t. amdrone

:-*

believing nvidia's pr gimmicks
guys are in line with the light source yet the shadows are non-existent

EXACTLY like we saw on tomb raider

You're dumb as shit. Wow.

>"So as far as I see it, RTX (in case of real-time ray tracing) it's a sophisticated temporal point sample denoiser coupled with a low (really low, like 1 SPP low) ray traced data designed to cut frame conversion time to values acceptable for real-time applications."

cgicoffee.com/blog/2018/03/what-is-nvidia-rtx-directx-dxr

Also, source on it doing 4K 60 FPS raytracing? Demos all showed "ray tracing" at 1080p 20-30 fps
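To see why the "1 SPP" number in that quoted blog post matters, here's a toy Monte Carlo sketch (pure Python, a made-up 50%-lit pixel, not any real renderer): at 1 sample per pixel every pixel comes out either fully lit or fully dark, which is exactly the salt-and-pepper noise the denoiser has to clean up.

```python
import random

def shade_pixel(spp, rng):
    # Monte Carlo estimate of a pixel's brightness in a made-up scene
    # whose true value is 0.5: each ray either hits the light (1) or
    # misses it (0), and we average over spp rays.
    hits = sum(1 for _ in range(spp) if rng.random() < 0.5)
    return hits / spp

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

rng = random.Random(1)
one_spp  = [shade_pixel(1, rng)   for _ in range(2000)]   # pure noise
many_spp = [shade_pixel(256, rng) for _ in range(2000)]   # smooth

print(variance(one_spp))   # ~0.25  -> every pixel is 0 or 1
print(variance(many_spp))  # ~0.001 -> clean, but 256x the ray cost
```

Cutting the sample count and recovering the image with a denoiser instead is the whole trade the RTX demos are making.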

They're not in line with it. The light is at an angle relative to the entrance, and you can still see a shadow coming off the guy in the back.

yeah that's the shadow of a column, not the guy

>didn't even watch the video
brainlet.jpg

I know you're too lazy to watch the video, but you can clearly see the lack of shadow on the opposite side of the guy, as well as the shadow cast by his helmet and the strap hanging off his leg.

ahahahahahahahaha
looks like shit

Attached: 1520133869148.jpg (2847x1400, 181K)

>that disgustingly unrealistic halo SSAO-looking ambient occlusion

NOOOOOOOOOOOOOOOOOO THIS CANT BE HAPPENING MAMA LISA IS IN CHARGE HERE

Attached: 1534624655080.jpg (1841x1227, 669K)

How the FUCK did we go from crysis 3 graphics to this?

Attached: crysis3_2013_02_22_01_22_03_704.jpg (2560x1440, 3.56M)

The colors look nice but there's really no depth to the textures and a basic bloom effect instead of proper subsurface scattering.

I'm an nvidia fan but even I'm disappointed. Crysis 3 came out like 5 years ago and nothing has improved graphics wise. I think RTX makes things look WORSE.

If I'm dumb enough to buy a 2080ti I'm definitely turning that diarrhea off.

Attached: n7arIvH.jpg (1920x1101, 458K)

except real life lighting literally looks like this

AHAHAHAHAHAHAHAHAHAHAHAHA

Attached: Glasses_800.png (2048x1536, 2.9M)

Your pic looks like a Call of Duty 2 HD texture mod screenshot. That's how unimpressive it is.

>posts direct sunlight
>the screenshot is scattered indirect light full of secondary and tertiary shadows

you are literally retarded

Is this a joke?

Attached: 0skfltvdcrh11.jpg (780x379, 54K)

point is it's nothing like actual ray tracing. Do you know how long that shit takes?

do you drones understand the difference between realtime and baked? crysis 3 is all fucking baked plus really high resolution textures, there's nothing revolutionary about that.

But nothing has improved you tard. RTX actually looks worse than the suicide crisis hotline baked pizzas.

> I'm such a retard that I don't know what realtime raytracing vs baked global illumination means.

This is awful.

Why are we still using flat textures since WW1?

Attached: Screenshot_2018-08-24-15-42-41(1).jpg (249x272, 39K)

looks like shit unironically
but this is what you should expect from ruski sukas, mainly known as the creators of bydlo boomer games like world of tanks

So are you admitting this isn't ray tracing? Because yeah, people shouldn't get the wrong idea that you can do the work of a 10-rack EPYC system with a $1,200 nintendo adapter.

Now imagine if they had any creative drive at all and used that technology to create fantastic worlds instead of another boring war shooter.

I'll explain it to you in simple terms.

Baked global illumination (what every game out there has) looks perfect, but the downside is that it's not dynamic: it doesn't change, you cannot destroy entire levels while keeping the lighting looking good, and you cannot generate things procedurally without shit looking fake.

RTX is realtime raytracing, meaning you can do procedural generation, day/night cycles, destroy every fucking object in the game world, and every shadow, every light, every reflection will still look perfect.

it might not be perfect right now, but RTX is a taste of what's coming. in the future all games will be raytraced (what pixar uses for their movies)

So it's baked global illumination with some faux ray tracing then, right? Because as far as I know 1 spp ≠ ray tracing.

Well no shit, there is nothing in existence that can fully ray trace a frame in real time. It's the holy grail of graphics and you'd have to be a retard to believe Nvidia or anyone else was doing that on a card that's $1k. We are at least a few decades away from full real time RT at 1080p, let alone 4K. Retarded baby gamers need to fuck off.

Why couldn't novidya just call it "enhanced" global illumination then? People are going to get the wrong idea and think this RTX bullshit can replace software V-Ray.

>Now imagine if they had any creative drive
Even if they had any, they are not allowed to do anything.
I watched russian local presentations from them (it's a stretch to call it a presentation though, more like friendly talk between devs) and they are completely self-aware.
They know that they are creating shitty sub-par games just to make money.
That is what their bosses tell them to do.

Competition?

How does the kremlin feel about mulana (whatever the fuck her name is) being married to trump who constantly has to pay whores to keep quiet about his impotence in bed?

How about you fuck off, you dumb Jow Forums nigger?

it's raytracing, it's simply that they are using machine learning to do realtime denoising.

When you generate a raytraced scene you decide how much noise and how many bounces you want to deal with; the more bounces the more realistic, but the more time it takes to generate.

With machine learning, you can use a very smart algorithm to denoise the image and basically fill in the holes.

It's brute-force raytracing vs smart raytracing.

Attached: raytracing.jpg (1280x720, 203K)
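The bounces-vs-time tradeoff that post describes can be sketched with a toy path integrator (hypothetical numbers: a 50% chance per bounce of reaching the light and a 0.6 surface albedo; this is an illustration of the general idea, not Nvidia's actual algorithm):

```python
import random

def trace(rng, max_bounces, albedo=0.6, light=1.0):
    # Toy path integrator: at each bounce the ray either escapes to the
    # light (50% chance) or keeps bouncing, losing energy to the surface.
    throughput = 1.0
    for _ in range(max_bounces):
        if rng.random() < 0.5:
            return throughput * light   # reached a light source
        throughput *= albedo            # bounce: attenuate and continue
    return 0.0  # ran out of bounces before finding light -> black sample

def render(spp, max_bounces, seed=0):
    rng = random.Random(seed)
    return sum(trace(rng, max_bounces) for _ in range(spp)) / spp

# More bounces = more indirect light recovered = more realistic,
# but every extra bounce is another ray cast per sample.
print(render(10_000, 1))  # ~0.5: direct lighting only
print(render(10_000, 8))  # ~0.71: indirect bounces add energy
```

The "smart" approach in the post is to spend very few samples and bounces per pixel and let a learned denoiser fill in the missing energy, instead of brute-forcing thousands of rays.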

Look man, just curious about russians desu. They know more about us than we do, so it's only fair that I ask. It's not like this is a legitimate tech thread anyway. It's one of the thousands that will be posted here to shill people into buying the next turd nvidia has pushed out.

Attached: DefiantMemorableKakapo.webm (1280x720, 842K)

Well this smart ray tracing looks like ass.

kek it looks so much worse


ray tracing in gaming right now is fucking retarded for the hardware costs

It's essentially the hardware video encoding of the gaymen world except the result is much worse than the very very ultra mega lightning fast SW preset.

did they finally open-source their firmware and/or drivers? if no, then they are still not as good as amd

its static (baked lighting) vs dynamic (raytracing)

I rather have dynamic worlds than prebaked static bullshit.

Ever wonder why most games have static worlds, and the ones that let you change the world look fake or cartoony? It's because nobody could do global illumination / raytracing in real time.

I agree.

That shit looks like CS:GO with bloom.

>gaijin
oh how i envy you, those that didn't play the games these slavniggers made, how i hate having experienced them
And WT could have been such a great game

The truth is that Nvidia is in the same position as intel. They are hitting a wall with performance due to node size. They need solutions or ways to branch off from just brute-forcing performance through cramming in more transistors. Nvidia most likely has to wait for foundries to catch up on die size, and this is a stopgap measure to stave off competition.

It's like Metal Gear Solid V Ground Zeroes and The Phantom Pain: meh models, meh textures, meh world, but godlike lighting, which makes everything very pleasing to the eye even though, technically, none of it is really that great.

Attached: mgsvtpp 2018-06-13 20-22-52-44.png (1702x1718, 3.17M)

Woah sick lighting! /s

I'd much rather have higher fps and higher res textures than ray tracing. I mean shit, Unreal has been doing just fine without it.

literally the first 3 posts you imbecile

>pre-alpha gameplay

>1200x675
>.jpg
>does pic related look like shit ?
Yeah

I don't care how it's done, it looks like blurry shit.
When it looks better and doesn't rape performance i'll care

Wow, it's almost like one demo is not completely indicative of a new technology and shitposting trolls on Jow Forums will latch on to the first thing they can at launch and use it for bait ad nauseam.

Strange...

codercorner.com/blog/?p=2013
>amd shills btfo

IT
JUST
WERKS

Attached: 839.gif (680x435, 135K)

>But the raytracing API itself comes from Microsoft, so it is easy to imagine that it will eventually appear in some future Xbox consoles. And this time they will really deserve their “next-gen” moniker.
this is really a glimpse of hope, consoles have too much influence rn

>and the ones that allow you to change it look fake or cartoony?
I literally can't think of a single example

all major examples of dynamic shadows come from games with realistic art styles like half-life 2, fear, crysis, and metro

>"global illumination"
>exhibits SSAO defects

What is Enlisted and why should I care?

never

I don't think you understand.

Do half-life 2, fear, crysis or metro have realtime destruction? No, because indirect lighting is baked, completely.

indirect lighting / global illumination is for the most part baked into textures and applied to every object in the scene. Sure you have some dynamic lights in rooms, that's direct lighting, not indirect.

pic related is indirect lighting baked into a texture, and how it's applied as a coat to everything; the downside is that nothing can change, otherwise you break the illusion and need to rebake the scene for it to look correct.

with rtx none of that matters as things are realtime.

Attached: lightmap.jpg (800x336, 38K)
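A minimal sketch of that baked-vs-dynamic point (a toy 1D scene with one light and one occluder casting a hard shadow; the shadow model is made up and nothing like a real lightmapper): the baked array stays correct only as long as the geometry it was baked against doesn't move.

```python
def compute_lighting(light_pos, occluder_pos, n=8):
    # Hypothetical toy model: a surface point x is dark if the occluder
    # sits between it and the light; otherwise brightness falls off
    # with distance from the light.
    out = []
    for x in range(n):
        shadowed = min(light_pos, x) < occluder_pos < max(light_pos, x)
        out.append(0.0 if shadowed else 1.0 / (1 + abs(x - light_pos)))
    return out

light, occluder = 0, 3
lightmap = compute_lighting(light, occluder)   # "baked" once, offline

occluder = 5                                   # player destroys the wall
realtime = compute_lighting(light, occluder)   # recomputed every frame

print(lightmap)   # still has the shadow behind x=3: stale until a rebake
print(realtime)   # shadow has moved behind x=5, matches the new scene
```

That staleness is exactly why destructible or procedural worlds and baked GI don't mix: either the lighting lies, or you re-bake, which is far too slow to do per frame.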

All this has managed to prove so far is that Vulkan basically outclasses DX12 in every conceivable way when it comes to accessing the compute capability of the GPU.

Given that the game is basically in a beta-ish phase, achieving 120fps+ at 4K would only be viable with an API that gives you near-metal access to the card. We know for a fact that DX12, as implemented thus far, has been a dumpster fire when it comes to performance on Windows; but we also know that Vulkan, when implemented well, can do some really amazing stuff.

That's what this is. Vulkan being fucking amazing. I remember when I played Doom on my 780 and I struggled to get 30-40fps on average at 1080p (windowed) on high everything minus compute shaders with OGL4.3. Then the Vulkan renderer dropped in the post-launch update, and I started getting 50-52fps average on the same fucking card, same settings. A 69% increase in performance on the same hardware.

So yeah, I don't believe for a second that RTX tech had anything to do with the performance, as much as the API in question being primarily responsible for the difference.

OK so it's fine for everything to look like dogshit as long as it's running in real time, but if anything's prebaked and looks good, it's bad.

Reminder that Vega 20 will have 50% better raw compute than the 2080 ti (20 TFLOPS vs 13.4 TFLOPS) and twice the VRAM.

>AMD fanboys choosing to believe WCCFTech lies

>100+FPS
>looks like shit
>dem ugly AO
I gave up gaming because of reasons like these. It looks like shit and WTF I love crysis 1 now.

Right well when Vega 20 is permanently sold out because it's the best mining card ever made by a margin of 50% or more, don't say I didn't tell you.

the rock and the handle on the bolt action say it all

Vega 20 is expensive: 7nm, four HBM2 dies, plus FP64 compute. Over 5000 dollars.

Vega 20 for HPC

yet you can destroy many objects and they all have a shadow, and the shadow disappears with them lol

>btfo
thats what you think

Attached: 1534857980383.jpg (1280x720, 91K)

Basically Gimpworks 2.0

>over 5000 dollars
It will be under 2000, easily. It's a mining card through-and-through. It doesn't have the ray tracing or any other gayming bells and whistles of the 2080ti, it doesn't have an equivalent to tensor cores, all it has is raw compute and RAM - which are both needed for mining.
AMD can't market any HPC cards because they don't have a competitor to NVLINK, tensor cores, or CUDA Toolkit.

>/s
sup Jow Forumstechnology or Jow Forumsgaming

has someone fucked with the image in photoshop or something? looks so fucking bad hahaha

Attached: UWWDc77.gif (700x298, 3.67M)

You know that DX12 is as close to the metal as Vulkan, right? It's totally different from DX11.
Just pointing that out, I'd still prefer Vulkan for the portability.

Is anon confusing the fact that the game map is 4 by 4 kilometres in size with the resolution being 4K?

Why does Ivan have a slab of concrete on his head?

>FPS drop of 6fps with GI on/off
I call bullshit

No, Crysis 3 does everything real time it even has real time refraction and reflection

Every one of their shirts has the same creases on the back.

Fucking 2011 Skyrim with mods and an ENB looks better than this.

Obviously that's just the hyper-realism of what happens when a lopsided private irons the shirts of his whole platoon.

Personally I don't care if someone claims they implement (((raytracing))) if it doesn't look even remotely close to raytracing.