NVIDIA GeForce RTX TERRIBLE Performance per Dollar

This is fucking ridiculous and I'm not even an AMDrone fan... These cards are fucking overpriced

Attached: 2018-09-20 08_38_09-NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB Review _ TechPowerUp.png (924x857, 200K)

Other urls found in this thread:

anandtech.com/show/13282/nvidia-turing-architecture-deep-dive
twitter.com/SebAaltonen/status/1032897644123901952
twitter.com/SFWRedditGifs

Tell us something that we don't already know

Almost as if nVidia has no competition.

fixed

Attached: 1537447231175.png (924x857, 213K)

All the poorfags are jealous cuz their daddies won't buy STRONG 2080 Ti BEST VGA CARD

B-but the more you buy, the more you save!

Are you retarded?

Are you?

name any gayme that supports real-time gaytracing right now

Kys

the new tomb raider game

Shit engine

wrong
it doesn't at the moment

t. tracelet

Nope, we're upset because this new series of GPUs is trash

lmao

Attached: chrome_2018-09-20_16-03-27.png (2560x1440, 1.72M)

These video cards are a revolution, actually. You expected all of those new tensor cores and real-time ray tracing capabilities to come cheap, poorfag? The 2080 Ti is the best GPU on the planet right now and a giant step up for the future of gaming graphics technology. Afford it or not, this is something that needs to happen.

The patch of the game with raytracing hasn't come out yet.
People with the RTX GPUs can't use raytracing on anything as of now.

Tough to quantify the value RT will have to a given gamer. It's a call the buyer will have to make imo.
I'm perfectly happy with my 1060 6GB atm and even if the games with RT in the works were already released, I wouldn't purchase a 20-series. By the time the 1060 stops doing what I want, RT cards will be in the market at more price points, and the suite of games using DXR will have expanded.

>he doesn't know ray tracing can be software implemented

amd cards cant ray trace?

lel

cherry_picking.jpeg

>RX 570 & 580 have best performance/dollar
BASED RAJA TECHOLOGIES GROUP

Thank fuck the mining meme died down

They technically could, but like Pascal and prior cards, the performance would be awful.
Nvidia had OptiX around for a decade before RT cores, but it was only used in the professional space.
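The claim that ray tracing "can be software implemented" is easy to demonstrate — at its core it's just arithmetic. Here's a toy sketch (hypothetical code, not the OptiX API) of a ray-sphere intersection test, the primitive every tracer is built on; RT cores accelerate exactly this kind of test plus BVH traversal, which is why doing it in shaders or on a CPU works but is slow:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance along a normalized ray to the nearest sphere hit, or None.

    Solves |o + t*d - c|^2 = r^2 for t, a plain quadratic. Pure arithmetic,
    so any GPU (or CPU) can ray trace in software; dedicated RT hardware
    just makes this test and the acceleration-structure walk cheap.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 for a normalized direction
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Ray from the origin straight down +z at a unit sphere 5 units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A real tracer fires millions of these per frame against millions of triangles, which is where pre-Turing cards fall over.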

>my shill card can't beat best card performance wise to this date
>cherry picking
S E E T H I N G

E

E

T

H

I

N

G

>AMDfags acting like real time raytracing isn't the most important advancement in gaming graphics in human history

Attached: 354deaa3770912621bb816da070346ab.jpg (258x245, 12K)

Really considering just getting another 1080 to last the next 2 years or so.

What did the 2080 mean by this?

Attached: 2080 performance.png (1483x804, 388K)

>Let's buy something for features that don't exist yet!
Yeah, I remember getting excited about hair works too.

Look at all this damage control all over the board. Ayymdrones know how badly AMD fucked up.

The 980 4GB has the best performance per dollar; this chart intentionally misses it.

Are AMD even working on video cards anymore? I am interested in upgrading to something decent for 1440p but the prices here are completely unreasonable.

this is like launching the PlayStation 5 with no PS5 games, so it can only play PS4 games, but at 2x the price.

what were nvidia thinking, launching these cards without software that can use the selling point of the cards?

it's not like they had to rush cards out to counter AMD lol.

BASED AMD
ray tracing: snake oil

>Buying an overpriced GPU to enjoy real time ray tracing in a console game

Attached: Smile.jpg (600x600, 94K)

dogshit marketing, dogshit products. anyone with half a brain saw this coming. just gotta hope it's not a trend that's here to stay

>dogshit products
There's nothing wrong with the 20-series on a technical level. It's just the price that's the issue.

Exactly. If the 2080 ti was $650 to $750, people would be excited.

>I fucked you all and don't even feel bad about it

Attached: Raja-Koduri-Intel.jpg (602x430, 104K)

> NVIDIA GeForce RTX TERRIBLE Performance per Dollar
More news at 11.

You'll take what Nvidia gives you because you have no other options.

>1920x1080
what year is this?

4k all day noobs

this is the current state of nvidiots mental retardation
good goyim

aren't a lot of cards expensive on launch because idiots pay it? figured i'd wait a month or two for prices to stabilize (plus I want to see where 2070 ends up price/perf wise)

the 2080 is basically a rebranded 1080ti and has less memory. you have to spend a shit ton on the 2080ti if you want noticeable performance gains. not okay my dude, and the kikes at nvidia know it, hence the dogshit marketing with ambiguous graphs etc

the people posting "but they're better!" why do you love pointing out the obvious? that's not the issue here.

No, unless it's something semi-custom.
Navi10 is somewhere H2 2019 but it's a very very small die.

Pretty much. You'll see the prices come down a bit, but not much below MSRP, if at all.

>the 2080 is basically a rebranded 1080ti
You might be clinically retarded. Seek help.

substantiate or kill yourself

Vega 64 has a higher 1% min framerate than the card with a 4 digit price tag. What are you trying to prove again?

anandtech.com/show/13282/nvidia-turing-architecture-deep-dive

It's an entirely new uarch that has more in common with Volta.

*fart* nice one retard, go look at benchmarks before you talk

What's the 1% min framerate of the Vega 64 once you enable raytracing?

>"It's a rebranded Pascal!"
>But it's clearly not
>"Well the benches show similar perf so it's clearly the same!"
The two GPUs have nothing to do with each other. Stop posting.

My 980ti is struggling with 1440p. I wanted to get the 2080, should I just get a 1080ti?

The non-FE 2080 has the same MSRP as the 1080ti, but the 1080ti can be had for a good price thanks to overstock.
Neither choice is bad assuming you get a deal.

>the 2080 is basically a rebranded 1080ti
>"It's a rebranded Pascal!"

you stop posting you retarded faggot strawmanner

>1080ti isn't Pascal
Your idiocy knows no bounds.

It's Pascal with some MOAR COARS band-aided on top for "ray-tracing" (ray-traced reflections, not real, honest-to-god ray tracing of full scenes).
These cards are a fucking disgrace and early adopters will get assfucked as usual.

>more strawmanning

yawn

The same, because RTX is going to be a victim of market segmentation as only the top tier 20XX series cards will have it.
>majority of PC gamers purchase XX60 series cards or lower

This fact should bother you.
RTX will become a meme like SLI or die off like PhysX. It's worthless to devs unless the entire next generation has it, and NVidia somehow don't understand that.

>Neither choice is bad assuming you don't care about dpc latency.
ftfy

>It's a pascal with some MOAR COARS
But it's not.
The structure of the SMs has changed. It's an entirely new pipeline with better potential for async compute.
>"ray-tracing"
It's ray-tracing no matter the scale, and looks better than raster hacks. There's nothing to complain about in regards to the implementation, given full ray-tracing is impossible to bring to consumers currently.
This is leaving aside other semi-fixed function compute capacity in the form of Tensor acceleration for DLSS, or anything they want really.
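The "ray-tracing no matter the scale" point can be sketched roughly (toy Python, all names hypothetical): rasterize first, then fire reflection rays only for reflective pixels, which is why the RT workload in these hybrid renderers stays bounded instead of tracing the full scene:

```python
def shade_pixel(base_color, reflectivity, trace_reflection):
    """Hybrid shading: raster result plus an optional traced reflection.

    Only reflective surfaces pay the ray-tracing cost; matte pixels never
    fire a ray. This is the trick behind traced reflections vs. tracing
    every light path in the scene.
    """
    if reflectivity <= 0.0:
        return base_color           # pure raster, no rays fired
    reflected = trace_reflection()  # one ray into the scene (the expensive part)
    return tuple((1 - reflectivity) * b + reflectivity * r
                 for b, r in zip(base_color, reflected))

# A matte pixel returns the raster color untouched; a mirror-ish one
# blends in whatever the traced reflection ray returned:
matte = shade_pixel((0.8, 0.2, 0.2), 0.0, lambda: (0, 0, 0))
shiny = shade_pixel((0.8, 0.2, 0.2), 0.5, lambda: (0.0, 0.0, 1.0))
print(matte, shiny)
```

Scale the reflectivity budget down and you get cheap partial ray tracing; scale it to every pixel and every bounce and you're back at the full path tracing that no consumer card can do in real time.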

>These cards are a fucking disgrace
There's nothing wrong with the cards outside of the inflated price vis-à-vis Pascal, which looks worse than it really is. Pascal cards are unusually cheap, and the Turing cards are only available as FE, which makes them more expensive currently.
The non-FE 2080 will cost the same as a 1080ti once the overstock ends.

it is but hell if im going to invest in it before it even exists

wtf I hate gaytracing now

Do you think those prices are going down when Nvidia forced 3rd parties to buy their pascal stocks to be first in line for Turding?
"Ray tracing" as it stands now is an unknown. Nobody knows who will use it, if it will perform well enough, and so on.
Don't believe the hype.

>real time
Does that include the time it takes for support to be added to your game?

It's not proper real time ray-tracing, it's real time ray-tracing of certain elements in a scene.

>ray tracing happening in real time
>support for ray tracing happening in valve time
oh boy here we go

>Do you think those prices are going down
Why wouldn't they? The overstock isn't infinite.
>Nobody knows who will use it
Last I heard 11 games were slated to use it, and 16 games were set to use DLSS. A handful were using both. The numbers are only going to go up given large-scale buy-in from DirectX and Vulkan.
>if it will perform well enough
We'll have to wait for reviews, I agree.

"Ray-tracing" might take off in 2-3 generations of cards, not this one.

Most game sales are made on consoles.
Ray tracing won't take off unless the next gen of consoles uses it.

Games are coming with the feature set for release in this gen of cards. Unless every single one of them gets cancelled or delayed for several years, it's going to happen now.

Yeah, and performance is going to be fucking atrocious in reality. Remember how well tessellation worked?
Is this the first card you've bought with new hardware features?

Actually, one generation of consoles.
>Remember how well tessellation worked?
Remember how well programmable shading worked?

>Yeah, and performance is going to be fucking atrocious in reality.
Whoa, how do you know that? Are you a developer for DICE?
Can you tell them I appreciate the coming graphical improvements via DXR, but I could do without the punk-rocker girls carrying MP44s, thanks.

>Tessellation
>Added parts to the chip
>Programmable shaders
>Changed how all the shaders worked
>"Ray-tracing"
>Added parts to the chip

See where this is going?

>Whoa, how do you know that? Are you a developer for DICE?

Attached: 1372550690305[1].jpg (802x437, 52K)

>Garbage AAA games will look prettier with ray tracing
How fucking wonderful

WE HAVE NO OTHERS OPTIONS

Attached: 1536418504265.png (447x625, 243K)

you have no idea do you now..

twitter.com/SebAaltonen/status/1032897644123901952

because ray tracing needs TIME
and time spent in the pipeline is never good
so unless turing is a 5GHz monster, saying the performance will be good means only one thing

dynamic lighting that will be masked as RT
or no RT at all

I have a 770 with one of the fans broken. I need an upgrade that'll last me for years and was thinking about getting a 2070 once it comes out (maybe a 2060 depending on the price).

Bang for buck is important for me too, however.

Should I just wait and buy the 1070 ti once the 2070/2060 come out?

How does it work for rendering in Blender?

...and having your customers spend that many shekels to just fucking play games.

It's just not going to happen. However, it will eventually go down in price, and by then there will probably be games utilizing raytracing ofc. Hopeless release; I don't understand what they were thinking. I'm not fucking paying that much for a goddamn GPU when I can use my money on so much else.

Whatever you buy is going to be bad. 1070 is too old and expensive, 2070 is too expensive and doesn't really support raytracing. 2060 maybe?

wait for real, vega 64 has higher minimum framerates than any other card?

Come on, you've got to give us a graph of GIGARAYS per dollar.

It doesn't matter since RTX 2080 Ti can run CUDA workloads much faster than GTX 1080 Ti, and companies will buy them all for their machine learning server farms.

RX580 and wait a generation or two is the clear answer.

>nvidia switches the paradigm and invests in a technology that will revolutionize graphics down the line
>Still able to outperform the old model even though full investment wasn't for performance

BUT MUH BENCHMARKS

Where did you get this chart?
The GTX 1060 is both faster and cheaper than an RX 480.

I call shenanigans on the entire thing.

You've got to admit that the RTX 2080 being $150-200 more than a 1080 Ti is kind of a letdown when they perform almost exactly the same in traditional rasterization.

Ray tracing performance has been lackluster so far, and the DLSS implementation seems sketchy at best when looking at the FFXV benchmark (which is quite literally the only thing on the market right now that supports DLSS, and it's not even a game).

Maybe in about a year from now when the Turing cards drop in price and we've got more games out that support DLSS and ray tracing I'll consider buying one. But paying the premium to be an early adopter for a tech that's not even usable yet is just plain stupid.

they're using bullshit msrp I think

thats what i did. will last me until 21xx series or whatever the fuck amd put out if they decide to actually join the party next round

RX 480s go for like over $400 and you can't even buy them anymore; the 1060 is way cheaper.

This chart is fucking stupid.

Attached: 1516556664318.jpg (630x630, 43K)

>Retards actually wasting money on these shit RTX cards
Why the fuck did Nvidia release this bullshit when nobody is ready for RTX? 99% of PC gamers can't afford or don't need an $850 or $1,150 card.

It's almost like they want to push people into buying consoles because of this Jew shit.

Higher demand and low supply. They can still sell 10xx series cards while these cater to the premium market where people can drop $2,000 on a 4k HDR 144Hz screen.

Yeah, but that uses regular compute units (shader processors). If leatherman doesn't lie (which I doubt), RTX cards have dedicated units for jiggarays tracing. Which means their blurry-as-fuck raytracing imitation will not affect performance on RTX cards if the right amount is used (much, much less than in their demos) but will cripple other cards with emulation.