What can AMD do to combat RTX?

Attached: 5b831ab4dbd67 (1).jpg (700x596, 59K)

If Navi is a 1080 Ti for ~$450-550, it will win

Nothing. It's game over for them.

They release the same technology with a better implementation at a lower price. After all, every "advantage" of hybrid raytracing was available before without tracing anything.
youtube.com/watch?v=tnOjY3w3mg4&t=1078s

>implying RTX is good in the first place

The 2080Ti can't reach 60 fps in 1080p

7nm + 200 watts * AMD = 2070?

Memes aside, what's the speculation surrounding 7nm performance?

Combat RTX? It's a flop: 2080 = 1080 Ti + $200. You have to be fucking retarded to actually see a benefit in buying RTX.

AMD has had raytracing capabilities for a while now. This video is from March this year.
youtube.com/watch?v=51PSjERAlTc
They just haven't incorporated it into graphics cards because the technology isn't good enough to do it in real time. Even Nvidia's implementation is really basic and not real raytracing by any means. We don't have the technology to do real-time, full-blown raytracing yet. It is commendable that Nvidia is trying to push it forward, but it's mostly a publicity stunt that will see little to no adoption from studios.
For now, raytracing will only be used in animated films and CGI, as it has been for the past couple of decades already.
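To give an idea of what "real" raytracing actually computes, here's a toy CPU sketch in Python. This is not Nvidia's RTX pipeline or anything from that video; the scene, camera, and all numbers are made up. Even this one-sphere scene is one intersection test plus one shading step per pixel, and a real game frame needs millions of rays against millions of triangles, plus bounces, which is exactly why real-time full raytracing isn't here yet.

import math

# One sphere and one directional light; all values are made-up example numbers.
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
LIGHT_DIR = (0.577, 0.577, -0.577)  # roughly unit-length direction toward the light

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def normalize(v):
    l = math.sqrt(dot(v, v))
    return (v[0]/l, v[1]/l, v[2]/l)

def intersect_sphere(origin, direction):
    """Return the nearest hit distance t along the ray, or None on a miss."""
    oc = sub(origin, SPHERE_CENTER)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS**2
    disc = b*b - 4.0*c  # quadratic with a == 1 since direction is normalized
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def shade(origin, direction):
    """One primary ray: intersect the scene, then simple Lambert diffuse."""
    t = intersect_sphere(origin, direction)
    if t is None:
        return 0.0  # background
    hit = (origin[0] + t*direction[0],
           origin[1] + t*direction[1],
           origin[2] + t*direction[2])
    normal = normalize(sub(hit, SPHERE_CENTER))
    return max(0.0, dot(normal, LIGHT_DIR))

# Tiny ASCII render: one ray per character cell through a pinhole camera at the origin.
for y in range(12):
    row = ""
    for x in range(32):
        d = normalize(((x - 16) / 16.0, (6 - y) / 6.0, -1.0))
        row += " .:-=+*#%@"[int(shade((0.0, 0.0, 0.0), d) * 9)]
    print(row)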

>The 2080Ti can't reach 60 fps in 1080p
What game? What settings? You sound like a dumb butthurt AMDfag

>Too retarded to know deep learning AI is the future

RTX is a flop, user

Doesn't change the fact that the previous statement of the user I quoted was fucking stupid

housefire

Attached: higherthanvega.png (822x399, 964K)

He's right, the 2080ti can't do raytracing at 1080p 60fps. This has been proven in at least a couple games already.
>lmao cherry picking games
Yes, cherry picking, because only a handful of games actually support it, and out of those, only like 2 are actually playable right now. Raytracing is going nowhere fast in games, at least for now.

>Paying $1200 to beta test a feature + 30% performance upgrade over the 1080 Ti
>Double the cost

Boy I sure hope Jow Forums isn't this retarded

>The 2080Ti can't reach 60 fps in 1080p
At max settings, in any game with raytracing

>Boy I sure hope Jow Forums isn't this retarded
>Jow Forums
Jow Forums doesn't buy housefire Nvidia/Intel memes, that's nu-Jow Forums you are thinking of

I agree that RTX is a useless meme, but we don't have the final software yet; it's not optimized.
We don't know how it will perform when it hits the shelves, or a year into the meme.
Performance from the card is nice either way; the price is just a joke

>I agree that RTX is a useless meme, but we don't have the final software yet. It is not optimized yet.
If Nvidia couldn't make such software exist with all the money it has, it's impossible.

I mean, you're right that it's nowhere even close to optimized. But when will it be? It's tech that has been used for a while, but never in games due to the obvious performance hit. And when it really is optimized, how many games will realistically use it? Right now, it's a gimmick confined to top-end video cards that a very small percentage of people will actually get. Why even bother if you're a game developer, unless you're getting heavy money bags thrown at you by Nvidia?

These cards are quite obviously a placeholder from Nvidia to scam people out of their money while they wait for 7nm to mature next year. They had to put something out, and they had to find a way to sell it, so raytracing is what they went with. It is great technology, just not ready for video games. It's ripe for marketing, though.

>cucks ITT think that somehow tech will get better by not being actively used by any developer

Fact of the matter is that NVIDIA is the only company pushing the boundaries; it happened with shaders in 2002 and it's happening now with raytracing. Tech needs to be pushed aggressively or we will be stuck with console-tier graphics for decades.

DESU I loved my first shader card
Dat Morrowind water

but we all already have 1080 Tis

Shitpost about avg fps

>Too retarded to know that the 2080 cards aren't the future, they are the now.

Skip the 20## series, wait for the 30##... or 21##, who the fuck knows what it will be.

10%

Then looks like you get to save money or buy a new chair instead.

They would make the same thing as RTX, but open source and better, and yet no game dev would ever use it.

Attached: 15363537288910.jpg (530x558, 92K)

The Star Wars RTX demo, for instance?
Shadow of the Tomb Raider?
Metro Exodus, where the 2080Ti stays in the low 40s FPS at 1080p? Don't be delusional. I have a 1070, I'm not an AMD fag when it comes to GPUs, but Turing is a scam. Even with software optimization you won't be able to play at anything higher than 1080p, which is a joke

AMD had Roy tracing. I think Roy quit though.

Attached: 15360095684171s.jpg (250x236, 9K)

When Nvidia introduced the first ever 'GPU', it still came with performance that warranted the pricing, on top of having that new shader pipeline.