RTX 2080 only 10% more Tflops than 1080

GTX 980: 5 Teraflops
GTX 1080: 9 Teraflops
RTX 2080: 10 Teraflops
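
For reference, peak FP32 TFLOPS is just shader count × 2 (one FMA per cycle) × boost clock. A quick sanity check with the commonly quoted reference specs (treat the clocks, and the resulting uplift, which lands closer to 13% on boost clocks, as approximate):

```python
# Peak FP32 TFLOPS = shaders * 2 ops/cycle (FMA) * boost clock in GHz / 1000.
# Shader counts and boost clocks below are the commonly quoted reference
# specs, so the results are approximate.
cards = {
    "GTX 980":  (2048, 1.216),
    "GTX 1080": (2560, 1.733),
    "RTX 2080": (2944, 1.710),
}

for name, (shaders, boost_ghz) in cards.items():
    tflops = shaders * 2 * boost_ghz / 1000
    print(f"{name}: {tflops:.1f} TFLOPS")

uplift = (2944 * 1.710) / (2560 * 1.733) - 1
print(f"RTX 2080 over GTX 1080: {uplift:.0%}")  # roughly 10-13%
```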

Sounds like a big flop to me.

Attached: RTX_Car_678x452[1].jpg (678x438, 26K)

Other urls found in this thread:

youtube.com/watch?v=NeCPvQcKr5Q
youtube.com/watch?v=162DeB3xbqk
youtube.com/watch?v=Irs8jyEmmPQ
pcgamesn.com/nvidia-rtx-2080-ti-hands-on?tw=PCGN1
youtube.com/watch?v=aT37b2QjkZc
twitter.com/SFWRedditVideos

>HE DOESNT KNOW ABOUT THE GIGAJIGGARAYS PER SECOND

This thread is only 10% more cancer than all the other GTX 20xx threads as well

Thank you for your kind words

>what is bandwidth
>what is advanced SM

Are the GTX 1000 cards starved for memory bandwidth when gaming though?

Jensen is the biggest Asian kike there is with his ugly ass leather jacket.

Stop supporting this gook and go buy a Vega 64 instead you fucking faggots.

they were not

Yeah but my GTX 780 died 6 months ago and I've been stuck without a GPU and you have to be retarded to buy AMD or a 10-series in 2018.

European prices for the RTX cards are insane; EVGA sells the GTX 1080 for about 510 euros but is asking 930 euros for the RTX 2080

It's a better card, but it's not 80% better

You're an idiot.

If you find a good deal on a used 10 series or a Vega, you have to be an idiot to pass it up.

I always buy previous gen shit like a week before it's announced. Lots of dweebs dump their near mint cards on the market for super cheap because their mommies give them their credit cards so they can buy the "latest and greatest".

RX680 where?

Polaris is a shit architecture and only good for Fortnite and 20 year old games.

Go get a Vega 64.

Sadly it won't be a flop, since all the nvidia drones are already selling their "useless" 10-series cards so they can buy this shit.
And the biggest joke is that this is just a shameless cash grab, because nvidia can't afford not to jump on 7nm as soon as it's out. So expect the same braindead zombies to run out yet again and buy that shit up too.

I hope AMD fixes their naming scheme. Vega should have been called the RX 590 Vega or something

youtube.com/watch?v=NeCPvQcKr5Q
>"So in titles where Ray Tracing isn't being used heavily, or I suppose at all, you're looking at a 10% performance uplift for almost an 80% hike in price"
Fuck Nvidia in the asshole. And also fuck AMD in the eye, mouth and ear for failing to provide competition.

Attached: download (1).jpg (300x168, 14K)
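
Taking the quoted figures at face value (~10% more performance for ~80% more money; both numbers come from the video above, nothing measured here), relative perf per dollar drops by well over a third:

```python
# Normalise the GTX 1080's performance and price to 1.0, then apply the
# quoted deltas: +10% performance, +80% price. Both deltas are assumptions
# taken from the quote, not measurements.
old_perf, old_price = 1.00, 1.00
new_perf, new_price = 1.10, 1.80

perf_per_dollar = (new_perf / new_price) / (old_perf / old_price)
print(f"Relative perf per dollar: {perf_per_dollar:.2f}")  # ~0.61
```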

>amd competes
>nvidiot gaymers still don't buy it because its not the way its meant to be played

Meh

Polaris was geared towards laptop and cheap desktops.

GPUs are simply not worth this much money. People are really spending $1200 for some slightly better shadows and reflections in a video game they probably won't even notice.

nobody takes buying advice from small channels run by dullards

>nvidia wasting huge amounts of die area for half baked features nobody asked for

>gaymers don't take advice from a calculated logical thought process
Sure thing, Jensen. Have you got any more made-up metrics as to why a 2080 Ti is 25 times faster than its predecessor? Perhaps the channel that was bought out by Dyson is more to your liking. Which, funnily enough, is the only channel praising Turing.
youtube.com/watch?v=162DeB3xbqk

My mistake. JayzTwoCents avoids badmouthing the 20 series too.
youtube.com/watch?v=Irs8jyEmmPQ

you mad? I already have a Vega 64 though

Attached: 111144881.jpg (960x540, 148K)

/v/tards when vega came out
>who the fuck uses dx12? So what if it's better in dx12?
>who the fuck cares about optimisation later? Does it give 6gorillion fps spikes for 0.0000006 seconds on day 1? hmm?
>vega1000wats, heh
>1440p doesn't matter!
>picture quality doesn't matter!
>smoothness and stability doesn't matter
/v/tards now
>but the raytracing! You'll just have to wait for gaemes to implement it. Why do you want it right out of the gate?
>6x performance in gaems! yes! Proof? why the fuck do you want to see proofs?
>300 bucks 2060 is gonna completely btfo the 1080ti! mark my words!
>who cares about ghz?
>who cares about fans?

>gaymers don't take advice from a calculated logical thought process

yes that's exactly it
they make emotional purchases

I will never buy AMD gpu because I hate amd fanboys.

I will never buy Nvidia gpu because I hate Nvidia fanboys.

You are silly.

same

>Sounds like a big flop to me.
I guess you could say it's a... TERAflop?

...I'll see myself out.

AMD fanboys accusing others of making purchases based on emotion.
There have been a few marketing studies done that show that some consumers will purchase products from companies they perceive as underdogs, it appeals to their sense of justice or something.

I'm using a fucking 560Ti, stop calling everyone that disagrees with you a fanboy you idiot.

>or something.
Competition drives down prices and leads to better technology.

1645 USD for a 2080 Ti in Sweden.
1009 USD for a normal 2080.

at least from Nvidias official site.
Third party ones are more expensive.
ROG Strix 2080 is 1260 USD

its ok user we have GIGARAYS!!!111!!!

and that number is still out of context, since nvidia refuses to give the actual number of triangles being intersected to arrive at it...

Funny thing is that Vega's power consumption is massively overstated. All that LMAO 350WATTS stuff comes from raising the power target by 50%. You can get the same clocks and performance at a lower power target without throttling.
As for pricing, it depends on your area and whether you can find one at or near MSRP.
Then count how much less you pay for FreeSync over G-Sync.
The lower cost of a FreeSync monitor alone offsets any and all extra power costs the Vega could have over its Nvidia counterpart.
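
A sketch of that monitor-savings claim. Every number here is an assumption: a ~200 USD G-Sync premium over a comparable FreeSync monitor, ~100 W extra draw for Vega under load, and 0.12 USD/kWh electricity:

```python
# Hours of full-load gaming needed before the extra power draw eats the
# monitor savings. All three inputs are assumptions, not measured values.
gsync_premium_usd = 200
extra_watts = 100
usd_per_kwh = 0.12

hours_to_break_even = gsync_premium_usd / (extra_watts / 1000 * usd_per_kwh)
print(f"~{hours_to_break_even:,.0f} hours to break even")
```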

>1645 USD for a 2080 Ti in Sweden.
>1009 USD for a normal 2080.

Including 25% sales tax.

Attached: löfvenlyxfällan.jpg (563x317, 51K)

Corrected for taxes, the US and Swedish prices are the same.

>Corrected for taxes, the US and Swedish prices are the same.

Actually, they're cheaper in Sweden if you deduct the special chemical tax as well.
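
Rough sketch of that comparison. The Swedish prices are the ones posted above (assumed to include the 25% VAT); the 799/1199 USD Founders Edition MSRPs are assumed pre-tax reference prices, and the chemical tax is left out since no figure was given:

```python
# Strip 25% VAT from the Swedish list prices and compare against assumed
# pre-tax US Founders Edition MSRPs.
swedish_incl_vat = {"RTX 2080": 1009, "RTX 2080 Ti": 1645}  # USD, from the thread
us_msrp = {"RTX 2080": 799, "RTX 2080 Ti": 1199}            # USD, assumed
VAT = 0.25

for card, price in swedish_incl_vat.items():
    ex_vat = price / (1 + VAT)
    print(f"{card}: {ex_vat:.0f} USD ex-VAT vs {us_msrp[card]} USD US MSRP")
```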

>AMD Fanboys!
Nice strawman, faggot.

>implying all teraflops are created equal
Just wait for actual benchmarks you retards

Attached: 1514445431303.png (540x270, 57K)

I will never buy a gpu because I hate fanboys

It's only a little more than the 1080 Ti. Power consumption is a meme; it's overrated for the hardcore gayming crowd, who only care about performance no matter the cost.

The only people who care about power consumption are the near-silent and silent-PC crowd.

I'm not entirely sure how the taxes work in the US compared to Sweden, especially for electronics. Quick rundown?

It's over, nVidia is finished.

I agree
Only poorfags who can't afford AC care about power consumption

>bandwidth
irrelevant since the 980
>advanced SM
lmao

BUY NVIDIA
THE MORE YOU BUY THE MORE YOU SAVE
AHAHAHAHAAHAAHAHAHHAHAHAHAHAAHAAHAHAHHAHAHAHAHAAHAAHAHAHH

>they don't realize they are playing themselves

god i love novideot butthurt so much, keep buying nvidia

this

Vega 64: 13.7 TFLOPS
Yet it performs worse than a 1080

TFLOPS and gaytracing flops mean shit, brainlet. Wait for true benchmarks.

Seething AMD shill. Nobody bought a Vega 64 because it cost $500 before the crypto tax, so essentially it was $740. Even today it's still hard to find one, and it only performed as well as a 1080, a year later.

If you remove the gaytracing Turing is basically the same as Pascal, at the very best Turing would be 2-3% faster clock-for-clock and shader-for-shader. So Tflops is perfectly comparable between those two. It's the Tflops between completely different architectures (like Vega and Pascal) that's not directly comparable, brainlet.
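
The architecture point can be illustrated with a toy perf-per-TFLOPS ratio. The relative gaming performance numbers here are rough assumptions for illustration (Vega 64 trading slightly below a GTX 1080), not benchmark results:

```python
# Paper TFLOPS vs assumed relative gaming performance (GTX 1080 = 1.00).
# Different architectures extract very different game performance from
# the same theoretical FLOPS; both perf figures are assumptions.
cards = {
    #           (paper TFLOPS, assumed relative perf)
    "GTX 1080": (8.9, 1.00),
    "Vega 64":  (12.7, 0.95),
}

for name, (tflops, perf) in cards.items():
    print(f"{name}: {perf / tflops:.3f} relative perf per TFLOPS")
```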

>Are the GTX 1000 cards starved for memory bandwidth when gaming though?
Gimped gddr5 versions (like 1070) were, full gddr5x versions weren't.

>Hurr durr AMD should've provided competition for free after we boycotted it for 10 years so we could buy better cards from novideo
It doesn't work like that.

In your argument you said that you cannot compare TFLOPS between two different architectures (Vega and Pascal), yet you claim that the relative TFLOPS performance between Turing and Pascal (also two different architectures) holds. Really now?

Did you read my post? Turing is basically Pascal + Tensor cores bolted on.

Proof? Or are you speculating like a dumb AMD shill? Neither Jensen nor anyone in the press has said that. Even if it were true, you can't really tell me that performance across incremental architecture changes is the same. Are you gonna tell me an HD 7970 and an RX 580 perform the same because they're both GCN?

jenhsun used die space for gigarays.
well, you can offload all the lighting and shadows to ray tracing, freeing up shaders.

i am amd pajeet, but this is revolutionary stuff.

>Pascal + Tensor cores bolted on
It's what Volta is. It doesn't look like consumer RTX cards have Tensor cores at all, only Quadro has.

See Maxwell vs Pascal clock-for-clock. I'm 99% sure this will be the case with Pascal and Turing. Screencap this.

will amd navi have ray tracing?
is sony pushing ray tracing?

>your nice thing is made by a Jew
>buy my nice thing that's made by a Jew

>Are you gonna tell me that the performance of a HD 7970 and a RX 580 is the same because they both have GCN?
Downclock Polaris 10, somehow disable two of its four geometry units, and yes.

>well, you can offload all the lightings and shadows to ray tracing freeing up shaders.
It would be the case if rtx cards actually had dedicated hardware for raytracing which doesn't seem to be the case here.

30fps LOOOL

Attached: Screenshot_2018-08-21-22-14-12-08-01.jpg (1641x1079, 487K)

>2080 is twice the cost of 1080
>performance gain is only 10%
How did they get away with this?

Also limit memory and disable memory compression

You’re deluded if you think Maxwell and Pascal perform the same especially when benchmarks are out there.

What part of clock-for-clock do you not understand.

>Nvidia's RTX 2080 Ti struggles to hit 60fps at 1080p in the ray traced version of Shadow of the Tomb Raider
>pcgamesn.com/nvidia-rtx-2080-ti-hands-on?tw=PCGN1

>can't even do 1080p 60fps with "ray trace" on the 2080 ti

>they want to charge 1200 for it

WHAT DID THEY MEAN BY THIS?

And Vega wasn't that bad spec-wise, except half the software was missing and there were no primitive shaders. Get ready for thicc green cocc for every mid-high to high-end GPU.

wow, reddit humor

>nobody bought it, that's why it went out of stock
>muh 1000gorillionwatts
>since demand is so low for vega, retailers increased its price. Cause that's how the market works.
Reminder that vega completely btfo'd your $3000 card at mining efficiency. Go buy your i9+2080ti combo and don't forget the industrial chillers and liquid nitrogen.

Then resort to black magic to make it work on Linux. The Linux drivers are not even ready yet.

Do they make a HERS version?

But Vega is basically polaris + primitive discard that got disabled and rapid packed math that's barely used

>that got disabled
Why

WHERE IS MY BLOWER STYLE DESIGN NIGGVIDYA.
REEEEEEEEEEEEEEEEEEEEEEEEE

Attached: Normies+get+out+reeeeeee+_890269f82634a1042c186369a0c567f4.gif (326x318, 189K)

It's not like devs would use primitive shaders even if they were working

>muh crypto
It's dead; if not for memecoins Vega would have flopped hard. Also, depending on the coin you got different performance.

don't listen to him go- i mean boys
nvidia is great company and has best products

Vega has primitive discard accelerator

Attached: polygon.png (1018x412, 19K)

So let's just give Nvidia the mid-high to high-end market. Let's also forget about the GPU as a general processing unit and let CUDA ass rape everything in the future (after the biggest OpenCL shill dropped it for its own proprietary API). Dzis is gonna bee fun :D.

Does anyone remember a time when AMD showed their real time ray traced demo with 2 of their 4870 cards?

youtube.com/watch?v=aT37b2QjkZc

Attached: 18e.jpg (1024x576, 112K)

I was waiting for the 2070, but when I saw the price I bought a 1070 Ti Strix Advanced on sale for only 10€ more than the FE in the nvidia shop.

that's like saying the gtx 1080 provides only 10% performance uplift in DirectX2D

every game dev company will be using Raytracing now, Remedy is already shipping a game in 2019 with it

>screwing yourself because of other people
I bet you also bought nvidia "before it was cool," you fool.
t. Guy who will not buy AMD GPUs until they can compete with my 1080 ti

That's not what I want but what everybody else does

See above. Point still applies

I miss Ruby.

OH WOW ONE GAME BUILT FROM THE GROUND UP TO RUN AT 30FPS 1080P ON A 1000USD CARD THAT WILL BE OUTDATED IN A YEAR BEFORE THE GAME EVEN COMES OUT

Holy fucking shit we could have such graphics 5 years ago if normies weren't giving all their moneys to nvidia

TFLOPS don't mean anything; the 980 Ti also has ~5 TFLOPS, but you can overclock it to almost 1080 Ti level of performance.

Flops are just as inaccurate as the marketing bullshit nvidia made up about giga rays and whatever.

The RTX 2000 series could still do really well in benchmarks; we just have to wait and see. Rumour is it's a 25% improvement, and people consider 9-series to 10-series to have been 30%, but it was actually more like 5% when you consider that the only way to overclock a 10-series is to spend five years learning microsoldering to bypass the limiters, while you can just flash the BIOS on a 9-series.


The most economical PC build is still $700 for an 8350K OC'd to 5.5GHz, 8GB of RAM, a mobo, and a 980 Ti BIOS-modded to almost 1080 Ti speeds. Any build more expensive than this only gets like 3-4 more fps regardless of how much you spend.

Before, you could spend anywhere from $700-3000 and only get a single-digit fps improvement.

Now I expect that if you spend $700-1000 you only get a single-digit improvement, but maybe if you spend over $1100 you can get 10-15 more fps, or over $1400 you can get 20 more fps.

Spending double the amount to get 20 more fps in games that already run at 100+ fps at 1080p isn't worth it for most people, and even spending double to raise 4K from 60 fps to 80 fps is ridiculous, unless you have one of those cheap 4K Korean TV monitors that LTT shills, which can do 98Hz with HDR on and are getting G-Sync patched in soon. If you own one of those, raising 4K fps from 60 to 80 is probably pretty nice, but I doubt many of you shills have that monitor; most of you have dumb Asus 60Hz 4K monitors, with a minority owning the super expensive 144Hz one that's so expensive you'll buy dumb hardware for it regardless.
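
Put in dollars per extra frame, using the build prices and fps gains guessed above (all of them the poster's estimates, treated here as assumptions):

```python
# (build price USD, assumed avg fps at 1080p) -- all figures are the
# rough guesses from the post above, not measurements.
tiers = [(700, 100), (1100, 112), (1400, 120)]

base_price, base_fps = tiers[0]
for price, fps in tiers[1:]:
    dollars_per_fps = (price - base_price) / (fps - base_fps)
    print(f"${price} build: ~{dollars_per_fps:.0f} USD per extra fps")
```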

I think the best feature of RTX is that the stock cooler might actually be really high quality and actually 5x quieter at max OC like they claim.

Being able to see reflections of people around corners in BFV seems nice too, but besides that I can't really justify it.

Also, it's worth considering that in 6-8 months, when AMD/Intel release new GPUs that will probably be targeted at miners, the market will be flooded with 1070s. Nvidia will probably fix SLI drivers in that timespan, and the better NVLink support on the new cards might spill over into better SLI support on the 10-series with the old link. I honestly think that in a year's time, running 2-4 1070s in SLI will be the optimal bang-for-the-money setup. You guys don't comprehend how many 1070s are in bitcoin mining factories right now; once those upgrade to newer AMD and Intel cards, there will be half a million or more 1070s on the market, and they'll only cost like $150 or something.


2/3/4 1070s will be faster than even the 2080 Ti imo, and 4 of them might be faster than the RTX Titan if they improve the drivers, and who cares about VRAM.

You aren't overclocking these new GPUs to 3GHz. Lmfao. The absolute state of delusion.
There is a reason their base clocks are actually LOWER than Pascal's. The changes to the shaders to support async compute have resulted in lower clock speeds even on the 16nm+++ process.

Pascal likely OCs better than these new Turing GPUs. You can easily get 2050MHz on a 1070 Ti on air. I bet you need water cooling to maintain that on the 2070.

Maxwell also had unlocked voltage, letting them OC higher.
If you overclocked a 980Ti and put it up against an overclocked 1080 on launch, the 980Ti went from 35% slower to only 15% slower, as it overclocked better. Albeit at higher power consumption, but still, it did it.

>2/3/4 1070s will be faster than even the 2080ti imo
Wow. I'm glad I didn't read most of your post. You're delusional.
SLI still requires developer support. It doesn't magically work on every game.

Huang literally said THIS GPU IS MADE FOR OVERCLOCKING when he announced 2070.
I had a hearty chuckle

>18 Jun 2008

Attached: 1401328801548.jpg (550x366, 60K)

>paying 40-70% more than 1080ti for a 20% performance increase
Most games won't even use the ray tracing part of RTX, just the new AI supersampling thing