RTX on

FPS 1/2
It
JUST
works
Imagine releasing a $1500+ flagship GPU that can't even run its own exclusive features at 4K/60, let alone 30fps.
Gamers, moron miners and nvidiots BTFO.
Why are GPUs in such a shit state? Polaris and Pascal were pretty good, but now both sides have gone full retard.

Attached: Screenshot_2019-02-15-23-26-29-69.png (1920x1080, 608K)

I don't see the issue. Would you rather developers downgraded the game visually so it can run on toasters?
I personally prefer devs pushing the limit.

Also, how is Nvidia at fault? They released the best GPU possible.

RTX is worthless until they release a piece of hardware that can actually utilize it well. For now it's like PhysX in its early stages, not really meant to be used.

>best GPU possible
>only has 11GB of VRAM
Nvidia ain't pushing shit. I'm glad their stock is in the crypto trash. Fuck them, and fuck you for defending this trash.

Attached: Screenshot_2019-02-15-23-35-41-53.png (1920x1080, 564K)

>would you rather developers downgraded the game visually so it can run on toasters?
Yes

It's literally PhysX back when they made standalone cards for it.
They literally just tacked separate RT cores onto Turing and couldn't be fucked to even die-shrink to 7nm, instead using the ancient 12nm node from two years ago.

He meant the 2080 Ti is the toaster. Why doesn't everyone have a three-thousand-dollar RTX Titan with 24GB of VRAM that's barely 10% faster for 3x the cost, goy?

It is Nvidia's gambit to keep discrete GPUs relevant to mainstream customers. They are deathly scared that iGPUs are catching up and starting to make lower-end and mid-range SKUs practically obsolete (Protip: those make up the lion's share of discrete GPU revenue).
They are making the tools accessible to developers in the hope that they will start to make RTX mode mandatory for future titles. Pixel/vertex shading was in the same boat in the early 2000s; it became a requirement by the end of the 2000s.

>have 144hz screen
>literally drops below 30 on max settings
>mfw

Attached: 1502384030425.jpg (646x720, 23K)

>4k
this meme needs to stop

This. We're getting to the point where you can't run something at 30fps with a GPU that costs as much as a decent gaming build. That is remarkably stupid. They should focus more on optimization instead of this image-quality bullshit that is becoming subjective, to be honest. Look at Metro Exodus: in some cases the game doesn't look any different with RTX than another title would look without it. In fact, there are better-looking titles without this RTX thing that I'm sure run better. In some cases, instead of looking better or worse, it just looks different.

PhysX was something that more people could at least perceive as an improvement. This isn't that, and the performance impact is bigger.

Nice, that's pretty good for the 2060.

Based. I want gaming to be split into ray-traced high-end gaming and mobile trash without it.

>muh vram meme
Even 8GB is more than any gaming load could max out

Yeah, I remember hardware T&L, and GPUs becoming mandatory before that as well (software rendering modes died out by the early 00s).
DXR is already standard, should AMD release a new GPU arch in the future that could run it (Arcturus?); see the sketch at the end of this post for what "DXR support" means at the API level.
The game runs at 80+ fps average in 4K though.
If you watched the previous video he did, there is literally no difference in the lighting at all when the scene is indoors and dark / lit by a single light source.
Also DXAO and many other AO types do the same thing.
Plus this particular DXR-powered AO seems broken on trees and just flat out doesn't apply to some objects,
AND it costs 50% of your performance.
youtu.be/-mEP5k_-zso
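Rough sketch of the check I mean, since "DXR is standard" really just means "it's a feature you query through plain D3D12"; this is an illustration, not code from Metro or BFV, and the function name is mine:

```cpp
// Minimal sketch: ask D3D12 whether the installed GPU/driver exposes DXR.
// Link against d3d12.lib; needs a Windows 10 SDK recent enough for OPTIONS5.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool SupportsDXR()
{
    ComPtr<ID3D12Device5> device;
    // A basic feature level is enough to create the device; DXR support
    // is reported separately through the OPTIONS5 feature struct.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;

    // Tier 1.0+ means the runtime accepts DXR calls at all; how fast they
    // run (dedicated RT cores vs. a compute path) is another matter.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    std::printf("DXR exposed by driver: %s\n", SupportsDXR() ? "yes" : "no");
    return 0;
}
```

So any future AMD arch that reports the tier gets the same code path; nothing about it is Nvidia-exclusive at the API level.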

Attached: Screenshot_2019-02-16-00-01-14-16.png (1920x1080, 2.56M)

I'm fine with a few titles giving the richfags something to play with, e.g. Crysis. Not every new game should be 60+ fps on extreme settings.

Programmable shading actually offered a world of advantages.
RTRT isn't that.
>DXR is already standard, should AMD release a new GPU arch in the future that could run it (Arcturus?)
AMD's entire ideology is to avoid fixed-function (FF) hardware wherever possible.
Don't expect it any time soon (like 5-years soon).

Fuck you poorfag

DX12/Vulkan (Mantle) compatible hardware from 2011 isn't forward-facing?
Fuck off desu, and I say that as an ex 290/390X/Vega 56 owner.
This.
At least Crysis actually broke new ground, and even with fancy new raytraced shit new games still don't look as good.
BFV's lighting is shit, and so is Metro's, despite being what, fucking 12 years newer?
We really have regressed desu, not even 10%+ a year, especially per dollar.
Fuck Intel for killing AMD, and fuck AMD themselves for making garbage for 8+ years until Zen.

Attached: images-5.jpg (541x567, 58K)

You're bound to run into problems and performance issues with the early adoption of any new graphics technology, and many other technologies in general, especially those relating to computing. We had similar issues when global illumination and more advanced shadow rendering techniques were introduced, and that was years ago. With how long this market has been around, you shouldn't be questioning why performance suffers at such an early stage of any new technology's development; the fact you don't know the answer to your own question shows you are either incredibly naive and inexperienced with technology or too blatantly stupid to understand the answer after all of these years.

>DX12/Vulkan (Mantle) compatible hardware from 2011 isn't forward-facing?
And the feature level for that 2011 h/w would be?

The thing is, you're not making something that looks as good. Crysis had a point: it never sacrificed visuals for performance, and even today it's still better looking than many titles. BFV and Metro Exodus, on the other hand, aren't exactly ahead of their time; they just look like what you would expect to get now. They're not the new Crysis and there's no justification for this kind of crap. "I spent more, therefore games should run like shit for 1% better visuals" is ridiculous.

Because Fermi, Maxwell and Kepler aged so well? Oh wait...
This. They are extremely average.
It's like when bloom, HDR, ambient occlusion and normal maps/parallax occlusion first came out and were shoved onto visually uninteresting, bland-looking games to tart them up.
At least from the mid-to-late 00s games visually improved a lot.
Now it's literally the same shit. I doubt the next gen of consoles coming out in two years' time will be much better than current midrange PCs, so expect visuals to stagnate even further.
I swear, if they pull some 8K/30fps shit and no 4K/60Hz+ or 1440p/120fps+, they will fucking flop.
Hell, the Xbox One S and PS4 Slim run most games at 720p with dynamic res, at 25fps with drops.
Anyway, I'm getting off topic. If AMD manages to shove DXR-capable GPUs into Arcturus or the gen 11 consoles in 5-10 years then it might catch on, but ultimately it's pointless.
It's a shitty, visually noisy implementation by Nvidia that kills half your fps or more.
Isn't the whole idea of a graphics accelerator that it accelerates? Not slamming on the brakes and gimping the entire card for some meme tech most devs, games or people will never use.
Why they decided to release this stopgap shit instead of 7nm with double the RT cores and higher clocks is beyond me.
Vega 7 ain't got nothing on RTX, but at least it has lots of RAM and no useless DLSS/RTX gimp tech that kills image quality and/or fps.

>Because Fermi, Maxwell and Kepler aged so well? Oh wait...
Fermi and Maxwell aged very well; Kepler was just a bit of a misstep on nV's part.

They release stopgap gimmick shit to pad out hardware releases. If they strove to release the best technology available, they'd risk getting too close to the upper limit of their hardware's capabilities, meaning less difference between each generation and fewer consumers buying new cards. In short: capitalism.

>Fermi and Maxwell aged very well
nice bait

Attached: 1536108982452.jpg (499x595, 47K)

Remember when new GPU gens offered more than piddly 5-20% gains? I just think they're bumping their heads on Moore's law's ceiling desu; dies are still growing even with node shrinks and power isn't really going down either.
GCN/CUDA is fucking ancient.
No it bloody didn't, I had a 460 and it was garbage.
The 770 was literally a rebadged 680.
9xx owners got fucked by shit VRAM and gimped DX12/Vulkan performance, and now Pascal is getting shit on in newer async FP32/INT-heavy games.
t. 1080 owner. It was fast three years ago, not anymore; it's getting shit on by an overclocked 2060 with 6GB of VRAM.

>not every game should be 60+ fps on extreme settings

unironically kys

Say what you want, but ray tracing is the future.

Lower your settings, poorfag. High end hardware is high end for a reason.

How do we use rtx to mine coins?

>lowering settings on literally the latest videocard designed for gaming and being ok with it

good goy

>Best card Nvidia has to offer
>less than 60 fps at 4K with RTX
>and that's with gimped DLSS, not actual 4K
nvidiafags confirmed blind and BTFO

Attached: BTFO.gif (400x436, 162K)

You don't, mining is dead.

The idea of running games on ultra on a 1060 and scaling all the way down to some shitty iGPUs sounds incredibly dreary. No thanks man, I want fast progress.

The future being half a dozen titles and deep learning anti-aliasing?
Oh wow, I can't wait for smeared, upscaled 1440p shit to claw back 25% fps when RTX still kills 25% fps.
RT cores in gaming cards are a meme.
Even if they went to 7nm and tripled the RT cores they'd still have an arch bottleneck, because it absolutely kills render times (probably memory speed as well, so they'd need to go 16GB+ HBM2 in 4x modules or GDDR6+, since DXR absolutely eats VRAM, especially at 2-4K).
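Quick back-of-the-envelope on those percentages, using only the rough figures thrown around in this thread (~80 fps 4K average without DXR, ~50% hit from the effect, ~25% clawed back by upscaling); these are the thread's claims, not benchmarks, but they show why the recovery never cancels the hit, since the factors multiply:

```cpp
// Back-of-the-envelope only; percentages are this thread's claims, not data.
#include <cstdio>

// Effective frame rate after an RT effect that costs `rt_cost` of your fps
// and an upscaler that claws back `upscale_gain` on top of what's left.
double effective_fps(double base_fps, double rt_cost, double upscale_gain)
{
    double with_rt = base_fps * (1.0 - rt_cost);   // hit from the DXR effect
    return with_rt * (1.0 + upscale_gain);         // partial recovery from upscaling
}

int main()
{
    const double base = 80.0;                      // claimed 4K average without DXR
    const double fps  = effective_fps(base, 0.50, 0.25);
    std::printf("%.0f fps -> %.0f fps (net %.0f%% loss)\n",
                base, fps, 100.0 * (1.0 - fps / base));
    // Prints: 80 fps -> 50 fps (net 38% loss). The 25% recovery only
    // multiplies what's left after the 50% hit, so you're still way down.
    return 0;
}
```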

But that's exactly what the Xbox One X is (big Polaris, even more powerful than a 590).
Devs target that first and work backwards.
Hence why graphics have basically gone nowhere since 2016.
Screen-space and viewport reflections are improving as well; the latest Unreal Engine has some sort of reflection buffer that persists when the object is out of view.
And raytracing still has aggressive culling and is almost indistinguishable from current cutting-edge non-raytraced/non-DXR rendering.
Bubble's blown, the digital gold rush is over.

I'm talking about *RAY TRACING*...

Isn't RT outdated already?
There's newer tracing that's been going on a decade now.