$1200 for 30fps at 1080p, and it can't even do ray tracing in all of the games advertised

overclock3d.net/news/software/nvidia_clarifies_-_rtx_in_games_doesn_t_mean_ray_tracing/1
LITERALLY JUST MEME AA AO HAHAHAHAHAHAAH
NOT ALL OF THE GAMES ADVERTISED HAD RAY TRACING
>Nvidia Clarifies - RTX in games doesn't just mean Ray Tracing
At their Geforce Gaming Celebration before Gamescom 2018, Nvidia announced that a total of 21 games would support Nvidia's RTX technologies, a feature that was almost exclusively marketed as Ray Tracing support at the show, with examples like Battlefield V, Metro Exodus, Atomic Heart and Assetto Corsa: Competizione all featuring support for real-time Ray Tracing.

Now that the dust has settled a little, Nvidia has issued a press release to confirm that RTX technology refers to both AI and Ray Tracing, which means that almost half of the "RTX Games" will lack any support for Ray Tracing. In fact, more of the titles support Nvidia's AI-driven DLSS (Deep Learning Super Sampling) than real-time Ray Tracing, with 16 games supporting DLSS while 11 support Ray Traced elements.

Below is a list of all the games that will support Real-Time Ray Tracing:

- Assetto Corsa Competizione
- Atomic Heart
- Battlefield V
- Control
- Enlisted
- Justice
- JX3
- MechWarrior 5: Mercenaries
- Metro Exodus
- ProjectDH
- Shadow of the Tomb Raider

Hahahahahhahahahha

Attached: file_5-01-02-01.jpg (1065x560, 154K)

Other urls found in this thread:

youtube.com/watch?v=V0TyhQiY2pQ

God, I feel so smug about buying a 1080 Ti at launch. "JUST WAIT™!!!" people screamed, new cards were "just around the corner".

You fags actually waited 18 months for THIS. My sides, lads.

Attached: BUFF DADDY.jpg (701x527, 201K)

>Below is a list of all the games that will support Real-Time Ray Tracing

Were you laughing this smugly when they announced a GPU with programmable shaders?

I feel so fucking happy about waitfags getting told
>hurf durf 2080 will be 30% faster than 1080ti the leaks on the clickbait sites say so
The thing that gets at me is their arrogance. If they'd actually paid attention to the last five years of Nvidia history, they'd know exactly what to expect.

GeForce 3 was $1000+ and got stomped by ATI almost immediately.
18 years later, it will happen again, except AMD will just ignore them and let them fuck their own market into the ground.

Attached: raytracevojak.png (807x867, 390K)

Oh, and even if the game does run in ray-traced mode, it only gets 30fps at 1080p with settings turned down and flat lighting.
This is the funniest graphics flop I've ever seen. $1000 for that, looool.

Attached: Screenshot_2018082122141208.png (1920x1080, 2.02M)

Can't wait until 7nm GPUs arrive and BTFO this absolute abortion of a GPU generation

Funnier still, they're still using (optimised) 16nm, a process that's 3+ years old.

>Can't wait until 45nm GPUs arrive and BTFO this absolute abortion of a GPU generation
>Can't wait until 32nm GPUs arrive and BTFO this absolute abortion of a GPU generation
>Can't wait until 22nm GPUs arrive and BTFO this absolute abortion of a GPU generation
>Can't wait until 14nm GPUs arrive and BTFO this absolute abortion of a GPU generation
>Can't wait until 10nm GPUs arrive and BTFO this absolute abortion of a GPU generation
>Can't wait until 7nm GPUs arrive and BTFO this absolute abortion of a GPU generation
>Can't wait until 5nm GPUs arrive and BTFO this absolute abortion of a GPU generation

t. brainlet that counts too much on die shrink solving everything

It's more that this is the worst performance increase for the price in a long fucking time, and whatever comes after this will likely be a shit ton better in that regard. A $300 upcharge for what looks like a 20% increase at best in overall performance is a joke. But keep sperging out about nodes, I guess; I never said a shrink was going to fix the problems all by itself, but I don't think it's controversial to say it'll bring some decent performance gains.

I've never heard of ray tracing before. What is it, in short?

It's gaytracing, brainlet. Haven't you heard Jensen's new unit of measurement, GigaGays per second?

tracing of rays.

are you ironically acting butthurt?

>ai botnet in your gpu too

>ai botnet in your cpu too, cpus have got ai spus in them now

fuck this future

>didn't wait
>bought GTX 1080 in February
I really should have returned it for a Ti, but I'm still satisfied.

>in February
of last year
somebody told me you had a graphics card who looked like a card that I had in february of last year

Anything with a CPU and an internet connection is botnet, even the dumbest of old IoT shit.

t. Brainlet Luddite

Basically it replaces the need for hand-placed spot lighting with dynamic real-time lighting. It also adds reflections to surfaces without additional programming, and processes scenes as a whole rather than by a "camera can see this" method.
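
In code terms, a toy version of the per-pixel loop looks something like this (a minimal sketch with a made-up scene format, ignoring shading and bounces entirely):

```python
import math

def trace(origin, direction, spheres):
    """Return the colour of the nearest sphere a ray hits, else background.
    One ray is cast per pixel; a hit is found by solving the ray/sphere
    quadratic. Real ray tracers then bounce more rays for shadows and
    reflections, which is where the cost explodes."""
    nearest, colour = float("inf"), (0, 0, 0)   # background colour
    for centre, radius, sphere_colour in spheres:
        oc = [o - c for o, c in zip(origin, centre)]
        b = 2 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * c            # direction assumed normalised (a == 1)
        if disc >= 0:
            t = (-b - math.sqrt(disc)) / 2   # distance to nearest intersection
            if 0 < t < nearest:
                nearest, colour = t, sphere_colour
    return colour

# One ray straight down the z axis, one red unit sphere 5 units away.
print(trace((0, 0, 0), (0, 0, 1), [((0, 0, 5), 1.0, (255, 0, 0))]))  # -> (255, 0, 0)
```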

AMD shills in damage control.

>Geforce 3 was 1000usd+ and got stomped by ati almost immediately

And every GPU today has programmable shaders. See you in a few years kid when you are comparing NV and AMD ray tracing benchmarks.

>Because brand new technology that was only possible on super computers a few years ago should be able to run at 4k 60fps flawlessly on a consumer graphics card.
>The state of AMD shills.

I bet you would have talked shit about anti-aliasing when it first came out, just because it had a performance hit too.

I was 10 when the GF3 came out and was stuck on a Riva TNT2 Turbo 16MB.
I wasn't laughing. I saw the demo and thought "nah, we won't see this in games for years", and it turned out I was right, at least until Doom 3 dropped.

Nvidia released a 3D accelerator before ATI/AMD
Nvidia was the first to introduce programmable shaders
Nvidia was the first to introduce a unified shader architecture
Nvidia was the first to introduce GPGPU
Nvidia is the first to introduce dedicated ray tracing acceleration on the desktop

It's a fact. Nvidia innovates and AMD emulates.

Buying the first gen is still retarded

So what you're saying is it would be retarded to buy this and instead wait for it to become the standard rather than the no support meme?

Gotcha.

AMD shills want graphics to stagnate because they can't afford new technology.

That's ok I just want a good bang for my buck GPU. Brand loyalty is silly.

What's with you people thinking that the only people who don't jump on a $1200 GPU right away are the ones who can't afford it? Ray tracing will be virtually useless for the next few years; it's just a waste of money buying those cards right now, at least if you already have a 1080.

That's a lot of assumptions on your part.

feel free to get a 2080ti for those sweet 15% gains after turning ray tracing off, nobody's stopping you

That's right, shitpost the butthurt away

>want to know more about DLSS
>every fucking article is only talking about gay tracing

Attached: 1527230459406.jpg (200x325, 28K)

>mfw I have no plans to buy any of those games
>mfw the games I play will not support this technology within the next year
>mfw there is an extremely small chance they'll ever support it
>mfw I paid 100 USD for an RX560 that just works on GNU/Linux and plays everything I ever wanted
>mfw memelords are chucking a thousand bucks at a graphics card that barely plays games they don't give a fuck about at 30FPS using the flagship technology the card was sold on
Something something a fool and his money

Attached: smug cawfee and gween tea.jpg (1440x810, 210K)

>DLSS

Try searching for something less marketing-buzzword, like machine learning super resolution.

Then it's a useless gimmick: you buy a $1200 card, and if you use it as advertised your game plays like shit.

Do AMD chips have botnets?

What should I get if I want to play 1440p @144fps then?

Attached: 1530554423067.jpg (996x720, 98K)

What's the source on Turing not being much better than Pascal?

What games do you play? Everyone has been suggesting a 1080 or 1080Ti for that bracket for a while, but they're probably expecting you to play the latest AAA turds and suggesting accordingly. If you play Doom all day every day, you won't need a 1080Ti to push 1440p at 144Hz. Look at benchmarks for games you like playing, that should give you the first good idea of what to buy.

Attached: 1519672746168.jpg (596x600, 26K)

We can't know until non-gaytracing benchmarks roll in, but the lack of talk about performance from Nvidia's side doesn't make me optimistic.

HAHAHAHA

>Just gotta wait, you'll see

You sound like an AMD fan defending Vega at launch

Are there any games worth playing that need a card even half as expensive?

Ford was the first to mass-produce cars; that doesn't make them better at it.

The argument is weak, is all.

Throwing your money at a product based on a marketing presentation with no facts is a lot of assumptions.

But GPUs today already do ray tracing relatively fast; the only difference is that Nvidia made a better denoising algorithm with AI (a neural network?).
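
Roughly the idea, as a sketch (this is not Nvidia's actual denoiser, which is a trained neural network; a dumb 3x3 box blur stands in for the learned filter): render with very few samples per pixel, then filter the noisy result.

```python
import random

# Fake a 1-sample-per-pixel Monte Carlo render: true value 0.5 plus noise.
W, H = 8, 8
noisy = [[min(max(random.gauss(0.5, 0.2), 0.0), 1.0) for _ in range(W)]
         for _ in range(H)]

def box_denoise(img, w, h):
    """3x3 box blur: average each pixel with its neighbours. A real AI
    denoiser swaps this fixed kernel for a network trained to remove
    Monte Carlo noise while preserving edges."""
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [img[j][i]
                      for j in range(max(0, y - 1), min(h, y + 2))
                      for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(window) / len(window)
    return out

clean = box_denoise(noisy, W, H)
print(clean[4][4])  # much closer to 0.5 than the raw noisy pixel
```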

Attached: Slide1.png (1600x1348, 1.7M)

>First to introduce unified shaders
>Xbox 360 already had an ATI GPU with unified shaders before the GeForce 8000 series and HD 2000 series were out.

Paid digital distribution has completely ruined gaming.
I would be more than happy to go back to sixth generation graphics if it meant games were good again.

God, I can't wait till they release it so prices get slashed. I need to upgrade soon and I'm thinking of a 1080 Ti. Is it a good idea?

Nobody can tell you before benchmarks for the new cards come out.

AA adds input lag... that's a nope from me user

I don't want or need the new series though. I'm just wondering if the 1080 Ti is worth its price compared to the 1080, based on performance.

so it's just pure speculation based on novideo showing only ray tracing benchmarks?

Pretty much.

That's rather silly considering a 2070 is going to be in the same price range but might perform better. In any case, deciding on a graphics card means you first have to decide what you're going to do with it. If you're playing at 4K, at least a 1080Ti is a must; if you're playing at 1080p, even a regular 1080 might be overkill.

probably
the fact that they showed absolutely no benchmarks should tell you a lot
lol

He doesn't seem butthurt to me desu

>none of it applies to me so it applies to no one

Literally no AAA games can run at 144fps 1440p

More games support ray tracing than DX12... and DX12 is required for ANY gains from more than 2 CPU cores, and it doesn't even make a difference whether you have a 980/1060/2070 or even a 3060. For mid-range gaming, more than 2 cores literally won't make a difference until 2026. That's two decades of Intel selling gamers 4+ core CPUs that have zero impact on any game. You guys have probably been scammed out of $1500+ on average paying for extra cores in Intel CPUs over this 20-year time frame, and you're not complaining.

Ray tracing is fine; it will be the new SM3.0. In 2009, games like Black Ops and all the 360 ports 100% required SM3.0, and $600 cards from only 1-2 years before didn't have it. The same thing will happen with ray tracing: in 2020 there will be games that literally don't boot on a 1060/1070 Ti/1080 Ti/Titan V.

You're fucked, you wasted $800, fuck off.

Ray tracing and Global Illumination do not belong in video games

What games are coming out in 2020 that require Ray tracing? What games are even coming out that you care about?

so it intentionally renders a whole environment even if the player is not "seeing" that area? who the fuck thought this was a good idea?

You betcha. 99% of games coming out today are fucking garbage, and the rest run just fine on the 10*0 and RX 5*0 cards anyway. On the remote off-chance that a good game comes out that will only run with these raytracing cards, I won't lose sleep over not being able to play it for a generation or two of graphics cards, by which point the technology will have trickled down from enthusiast- to middle-tier cards and said game will have come down to sub-$20.

Thanks for beta testing!

Attached: 1518650839141.jpg (848x688, 107K)

>doesn't make them better at it.

If Ford was the first to introduce every single automotive technological advancement in the last 100 years then I guess it would make them better at it.

This

Titan V is only about 20% ahead of the 1080Ti in games which don't utilize async compute (so 99%+ of games).

2080Ti is weaker than the Titan V unless it has some massively better ROP performance.

>Below is a list of all the games that will support Real-Time Ray Tracing:
A number of these, if not all, won't support it at launch, only in a post-launch patch.

7nm is coming next year, m8. That's what people meant by waiting.
Everyone with a brain knew that Ampere/Turing would be a skip generation.

You're the brainlet if you don't realize there were massive performance increases on the same node in the past.
>5870 and 6970 were both 40nm.
Hmmm.
>7970 and 290X were both 28nm.
Get brains.

Only 50%?
/pcbg/ kept being spammed by fucktards saying the 2080 would be 50% more powerful than the 1080 for $500.
They kept saying dumb shit like
>hahaha screengrabbed for when you're BTFO amd shill
Now they are fucking gone. No screencaps have been posted.

there were people saying the 2080 would be slightly faster than the titan V
now they're just "It's 10% higher performance for 10% higher price, and that's a good thing"

literally everyone called me a faggot, a retard and a waitfag when I said the 2080 is transitional before 7nm
now look at it, nvidiots

KYS

OP IS A SHITTY SHILL TROLL

OP'S PIC IS OBVIOUSLY SHOOPED AND HE'S SPREADING FUD

REMEMBER THE JIGGARAYS
2070 IS 6 TIMES FASTER THAN THE TITAN XP IN EVERYTHING

here's another one of OP's shitty threads

>I never heard ray tracing before. What is it in short?
AMD showed this shit 10 years ago m8.
youtube.com/watch?v=V0TyhQiY2pQ

>If retards didn't fall for marketing and buy Nvidia GPUs, we'd have photorealistic games by now.
>Actually more than photorealistic. You'd literally be able to touch and feel your waifu in augmented reality jacked directly into your brain with BETTER visuals than real life if AMD were the ones with the monopoly.

People like you being this unknowledgeable is why Nvidia has this near monopoly.

>Nvidia was the first to introduce a unified shader architecture
Wrong. First was the Xbox 360 GPU, developed by ATI.
Get your facts straight.
>Nvidia is the first to introduce dedicated ray tracing acceleration on the desktop
What?
>Nvidia was the first to introduce GPGPU
Also wrong.
They're just the first who tried to push it in a way which gimps AMD GPUs, despite AMD GPUs being many times better at it.
They even had that proprietary garbage in Warframe, and it was only recently replaced with an open implementation which runs better on all cards.

Delusional.

Nothing can reliably run that.

It's more like 30-80% higher price for 10% better in the US.

Attached: vivaldi_2018-08-21_18-56-29.jpg (1998x1120, 220K)

when you have off-screen reflections/illuminations?

when you have billions of rays being reflected literally everywhere on the game scene?

Say that to someone in 2008 who bought a $600 top-end Radeon and 18 months later couldn't even boot Black Ops because of SM3.0.

Mein Vivaldi brother from another mother.

Who the fuck plays Black ops on PC lmao

>brainlet attempts to wrap his tiny head around a technical subject with metaphors

Don't worry about it. Just play your video games.

Well that fucking blows. This isn't even real ray tracing, is it?

I'm an industrial designer, and the raytracing stuff could actually be beneficial to me if I can offload my renders to the GPU.

Attached: 1522327150434.jpg (640x853, 63K)

Yeah, this is fake ray tracing. I'm pretty sure he said 10 GigaFauxRays/s. Fake rays are 6x easier to trace than real rays. That's where the performance numbers come from. It's all a ruse.

You'll be insanely disappointed.

what happens when we get

>1spp
Jesus christ, it's hideous.

Attached: hqdefault (3).jpg (480x360, 12K)

I've been wanting to upgrade for a while; this is my workstation/gaymen/all-rounder PC. Originally I was just going to pop in a 1070 or 1080, but the mining craze fucked everything. Now I'm considering a new build with the RTX and either the 12-core Threadripper or the i9 9900K, as I regularly hit 100% CPU and high RAM usage with a combination of Solidworks, Siemens NX, Keyshot, Photoshop, etc.

Attached: FB_IMG_1533473668526.jpg (908x671, 38K)

>6x
I had to do like 100,000 ray path samples to get an image of a glass cup with no visible noise
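
That tracks with Monte Carlo convergence: noise falls as 1/sqrt(N), so every halving of visible noise costs 4x the samples. A quick sketch of the scaling, assuming nothing about the renderer itself:

```python
import math

base_noise = 1.0  # relative noise at 1 sample per pixel (arbitrary units)
for spp in (1, 100, 10_000, 100_000):
    # Monte Carlo standard error shrinks as 1/sqrt(N)
    print(f"{spp:>7} spp -> noise ~ {base_noise / math.sqrt(spp):.4f}")
```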

You're better off with an Epyc setup unless you like pretend ray tracing. This RTX bullshit isn't even remotely close to real ray tracing, or even "real time".

>Brand new

There are existing games that were doing ray tracing, moron.
And were doing it back in 2006.
Star Citizen has dynamic lighting and still can hit 60 FPS lol

Then you'll never have to read another brainlet on Jow Forums posting about muh next die shrink because there won't be any.

Doom 3 and HL2 had dynamic lighting. Dynamic lighting isn't ray tracing.

Will this raytracing help games render reflections/mirrors?

10 gigarays per second should mean 10 billion rays per second.
There are ~0.0083 billion pixels at 4K resolution.
If that were some sort of legitimate number, you should be able to inverse-raycast, at 4K 60fps, 20 fucking rays per fucking pixel per fucking frame, but the hardware is CLEARLY not capable of that given the noise in that UE4 demo.
I don't get how the fuck they're calculating this. Even if each "ray" meant one ray per bounce, the numbers still don't add up.
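
The arithmetic from the post, spelled out (the 10 GigaRays/s input is Nvidia's marketing figure; everything else is plain division):

```python
# Sanity-check the claimed ray throughput against a 4K 60fps target.
rays_per_second = 10e9             # Nvidia's "10 GigaRays/s" marketing number
pixels_4k = 3840 * 2160            # = 8,294,400 pixels, i.e. ~0.0083 billion
fps = 60
print(rays_per_second / (pixels_4k * fps))   # ~20.1 rays per pixel per frame
```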

Eh it's a tech demo. The reflections look okay. Programmers are often not very good artists.

There are a lot of things called ray tracing in raster graphics which you wouldn't traditionally think of as ray tracing, because no visually unbiased path tracing is being done.
How GI is done is typically considered ray tracing, for example, just with voxelization and incredibly few rays.

At 30fps at 1080p for $1200, yeah.
Also, we've had mirrors in games for decades.

>Also, we've had mirrors in games for decades.
Yeah, one single mirror in a toilet in 99% of games. I'm talking more about reflections in a big environment. Recent games can't do it.

0 nm.

>Recent games can't do it
Recent games do, but yeah, they're generally screen-space reflections, which have artifacting.
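
A rough sketch of why SSR artifacts (hypothetical buffers and step count, not any particular engine's implementation): the reflected ray is marched through the depth buffer, so anything off-screen simply isn't there to reflect.

```python
def ssr_march(start, reflect_dir, depth, width, height, steps=64):
    """March a reflected ray through the depth buffer in screen space.
    Returns the pixel it hits, or None when the ray leaves the screen --
    the classic SSR failure: off-screen geometry cannot be reflected,
    which is what causes the artifacting."""
    x, y, z = start
    dx, dy, dz = reflect_dir
    for _ in range(steps):
        x, y, z = x + dx, y + dy, z + dz
        if not (0 <= x < width and 0 <= y < height):
            return None                    # left the screen: no data to reflect
        if z >= depth[int(y)][int(x)]:
            return (int(x), int(y))        # ray went behind stored depth: "hit"
    return None

# Flat wall at depth 10 on a 4x4 screen; a ray angled off to the right
# exits the screen before it can hit anything, so the reflection is lost.
depth = [[10.0] * 4 for _ in range(4)]
print(ssr_march((0, 0, 0), (1.0, 0.0, 0.5), depth, 4, 4))  # -> None
```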

The latest Gears of War game has planar reflections running on console without gimmickworks, soooo. There you go.

I'm sure 10 GigaRays/s is a best-case scenario. Do you really think these GPUs hit their advertised peak TFLOPS with actual workloads? To get those performance numbers you would basically have to perform the exact same multiply-add over and over again.

Also, 4K 60fps is probably an unrealistic expectation. You are talking about a paradigm shift here. You might have to take a step back and settle for HD resolution if you want real-time ray tracing. Nobody was playing Crysis at 4K 60fps back when it came out. You kids are spoiled today.

Attached: 11c[1].png (403x347, 72K)

Same with tessellation
Same with T&L


It will take years, so kudos for now, but I would only buy if I can get a substantial performance uplift, and all points indicate that won't be now.
To tell the truth, it's like Fermi 2: Electric Boogaloo, except with no ATI to compete.