Turing is a flop

Attached: Screenshot_20180826-083525_YouTube.jpg (2560x1440, 846K)

A gigaflop.

>2080 faster than a 1080Ti by quite a bit
That's way more than you'd normally expect from a new generation, so I'd say it's better than expected. Nobody thought it would be 6x (or even 2x) faster than the 1000-series in normal games without ray tracing features, my dear AMDrone. The whole point is you get ray tracing performance on top of the usual.

Also
>assumed
>grain of salt
>most of the driver optimization isn't anywhere near done

Attached: 4563548343253.png (653x726, 34K)

hope it's true, nvidia needs a little wake-up call. I think they're overestimating how much their "fans" are willing to pay

>That's way more than you'd normally expect from a new generation

Attached: 1519987643219.gif (540x303, 1.56M)

Just BUY it!! How hard can it be?

>quite a bit
You haven't passed fifth grade, have you?

But it's twice as expensive holy shit it fucking sucks

>the benchmark itself says not to trust it
>Jow Forums trusts it
of course you fucks do

Attached: 1531927799445.gif (330x300, 953K)

>That's way more than you'd normally expect from a new generation
Is this a joke? This was their longest time between generations, what looks like the smallest gain between generations, and the most expensive generation there's ever been.
>b-but the 8800 was expensive
When you adjust for inflation it was nowhere near as expensive. And it looks like manufacturers are gonna charge over the Founders Edition price for a fucking plastic blower cooler.
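A rough inflation check, for context (a sketch with assumed figures, not from this thread: ~$599 launch MSRP for the 8800 GTX in late 2006, roughly 25% cumulative US inflation to 2018, and the $1199 2080 Ti Founders Edition price from the announcement):

# rough inflation check; the MSRP and inflation figures below are assumptions
msrp_8800_gtx_2006 = 599.0      # assumed 8800 GTX launch MSRP, USD
cumulative_inflation = 0.25     # assumed ~25% US CPI growth, 2006 -> 2018
msrp_2080_ti_fe_2018 = 1199.0   # 2080 Ti Founders Edition announcement price

adjusted_8800 = msrp_8800_gtx_2006 * (1 + cumulative_inflation)
print(f"8800 GTX in 2018 dollars: ~${adjusted_8800:.0f}")                             # ~$749
print(f"2080 Ti FE premium over that: ~${msrp_2080_ti_fe_2018 - adjusted_8800:.0f}")  # ~$450

Even under those rough assumptions, the old "expensive" flagship lands well under the new one.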

btw where's the vega 64 in this comparison? it does gr8 at hdr

Attached: Untitled.png (630x178, 12K)

I choose to hope it's true. There is a difference between "trusting" something and hoping for something.

>That's way more than you'd normally expect from a new generation
Go away, underage retard.

They don't have much to do until AMD releases another curry poo card in like 2 years, cut them some slack.

Dude, if you don't buy our "new" RebrandeonTX 2069Lel you'll be losing money!
Just buy it!

hm, why does nvidia take a hit in HDR?

you know Maxwell has been rebranded 3 times at this point; what's hilarious is you can pinpoint the performance gains to the TDP gains
I won't even go into the 8800 situation way back then

>That's way more than you'd normally expect from a new generation

Yeah, nah. Skipping this gen.

Attached: ALL YOU CAN EXPECT .png (511x1059, 81K)

>curry poo card
Meet Mr. Wang, RTG's new CTO. He's got the touch.

Attached: 1526718103180.png (1260x709, 723K)

Not him but all your image proves is that the 1000-series was an abnormality.

they can't use their shitty color compression "techniques" on 10-bit, so performance plummets (that's right, SDR has lower image quality on novideo)

Wait, it's worse than the 1080ti in F1?
And it's only a 10fps gain at best?
I know it's early / grain of salt, but what the fuck?

Attached: black.jpg (383x409, 22K)

Aren't results like this kind of standard with new cards until better drivers get made?

I had a 390 for two years, and have had a 1070 for a year now.
Replaying some games now, they do look a little washed out. Nvidia drivers randomly switching from full RGB to limited doesn't help.

no. it's the first gen in a while to be this weak.

Wasn't the 980 also way above the 780?

gtx 670 to gtx 770 was disappointing too, but you're right.

The compression pipeline they use can't handle the bandwidth needed for HDR well.
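Back-of-envelope framebuffer math for that claim (a sketch with made-up assumptions, not anything Nvidia has published: SDR as 4 bytes/pixel with ~2:1 delta color compression, HDR as an 8 bytes/pixel FP16 target with compression disabled or ineffective):

# rough sketch: extra color-buffer traffic for HDR vs SDR at 4K60
# assumptions (not from the thread): SDR = 4 B/px with ~2:1 compression,
# HDR = 8 B/px (FP16) with no effective compression
width, height, fps = 3840, 2160, 60
pixels = width * height

sdr_bytes_per_frame = pixels * 4 / 2   # compressed 32-bit color
hdr_bytes_per_frame = pixels * 8       # uncompressed 64-bit color

to_gb_s = lambda b: b * fps / 1e9
print(f"SDR color writes: ~{to_gb_s(sdr_bytes_per_frame):.1f} GB/s")  # ~1.0 GB/s
print(f"HDR color writes: ~{to_gb_s(hdr_bytes_per_frame):.1f} GB/s")  # ~4.0 GB/s

Color writes alone are only a slice of total bandwidth, but under those assumptions the HDR path costs roughly 4x the traffic per frame, which is exactly the kind of thing that hurts a bandwidth-limited card.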

Anyone remember back when GPUs used to advance?

Attached: f12012_5760_1080[1].gif (400x377, 27K)

First flop generation for Nvidia. AMD has released a load of flop generations too.

Yes goyim, just an abnormality! You've never had it better!

Attached: Nvidia-GeForce-performance-drop_02.png (1280x699, 163K)

the 700 to 900 series was a ridiculous jump

Attached: tombraider_1920_1080.gif (400x343, 25K)

You do realize this is a much larger gap than previous gens too, right? The 1080 came out in 2016, more than 2 years ago.

770 was just a rebranded 680 and 670 was a cut down 680, so yeah.
The interesting thing is I think both the 900 and 700 series TPU benchmarks have the same series of games as the OP image, just the older versions.

God I am so happy I listened and I bought a GTX 1080Ti secondhand (with warranty intact).

Of course it's not gonna be that big of a jump, because they aimed for the whole RTX thing; they effectively have 4 GPUs in one (the usual CUDA stuff, Tensor core stuff, RT stuff, AI stuff, if I recall correctly).

You'll see a massive gap when you enable ray tracing features, but without those you can expect a 10-20% gain over the previous generation cards. If you expected anything else after watching the GDC presentation you're a retard; there's a reason they didn't show "normal" game performance (both because Nvidia seems to be fully invested in pushing RT as the future and because it wouldn't amaze anyone).

amd at least has the decency to call refreshes what they are, not some new REVOLUTIONARY technology

either way, both companies are clearly preparing for 7nm, which could be something or nothing again, and we'll be stuck with GCN and a 4th Maxwell generation

Attached: SHUT IT DOWN.gif (640x360, 1.41M)

>2080
>faster than 1080ti
>flop

>2080
>higher launch price than 1080 Ti
>good

>most of the driver optimization isn't anywhere near done

it's ok when nvidia does it

geforce 20 is a scam man: produce no inventory, cook up some marketing tricks, change the name, then literally double the price for all the whales while they wait for 7nm. Can't believe anyone is actually defending it desu.

Turing is a massive success and OP IS A FAGGOT

You Luddites can't stop progress and GeForce RTX and Quadro RTX will be widely adopted across the industry

GUYS JUST WAIT VEGA

Attached: geforce-rtx-2080-technical-photography-pcb-front-001-Custom-1.png (2560x1165, 3.6M)

>produce no inventory
I think they made too much, it's not gonna sell.

>pushing RT as the future
what the hell does a russian broadcast station have to do with any of this?
you mean raymarching? raymarching has been the future since 2008. for years now, you can raymarch in realtime, even in your browser
shadertoy.com/view/llfyRj

Attached: rt.png (800x450, 30K)
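The core idea is dead simple: sphere tracing, i.e. step a ray forward by the distance to the nearest surface until it hits something. A minimal CPU sketch in Python (nothing to do with Nvidia's RT hardware, just the basic algorithm; the scene and constants are made up for illustration):

import math

def sdf(p):
    # signed distance to a unit sphere at the origin (made-up scene)
    return math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2) - 1.0

def march(origin, direction, max_steps=64, max_dist=10.0, eps=1e-3):
    # step along the ray by the distance to the nearest surface
    t = 0.0
    for _ in range(max_steps):
        p = [origin[i] + direction[i] * t for i in range(3)]
        d = sdf(p)
        if d < eps:
            return t        # hit
        t += d
        if t > max_dist:
            break
    return None             # miss

# tiny ASCII render: camera at z=-3 looking toward +z
for y in range(20):
    row = ""
    for x in range(40):
        u = (x / 40.0) * 2 - 1
        v = -((y / 20.0) * 2 - 1)
        d = [u, v, 1.5]
        n = math.sqrt(sum(c * c for c in d))
        d = [c / n for c in d]
        row += "#" if march([0.0, 0.0, -3.0], d) is not None else "."
    print(row)

Shadertoy just runs the same loop per-pixel on the GPU in a fragment shader, which is why it's been doable in a browser for years.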

post the link please OP

>new technology costs more
>bad

>on Jow Forums
>doesn't know what ray tracing is

Anyone here honestly ordered one? Answer seriously, some people like new tech for the sake of new, but did you?

>casually ignoring the fact that raymarching isn't anything new to avoid looking like an idiot

no, I waited for the announcement and as soon as I heard the prices I immediately started looking for a used 1080 Ti. fuck Nvidia and fuck AMD for not competing properly.
>inb4 you should've bought AMD if you wanted competition to thrive, derp.
I bought AMD until the 7850 series, everything after that was a rebrand.

>new technology provides 30fps at 1080p
>good

>Take these with a grain of salt.
gonna have to wait for people to review this card to actually tell if it's worth it (for the price, it probably isn't).

Attached: 31211759_10208748016097156_5362789479947955016_n.jpg (595x720, 50K)

i find it fascinating that r*ddit actually has these boards. someone needs to stop this degeneracy.

>we put useless shit on a chip to sell it for 50% more
>new technology

Attached: 1526379239514.png (190x266, 5K)

>to review this card to actually tell
here is what's going to happen:
>emphasis on RTX ON and DLSS
>yes it's not that much faster but you pay for the future
and so on.

>$699
>Higher than $699 1080 Ti

Fuck off, math flunk

Even if these are true (and you ignore the >grain of salt text), they show the 2080 being as fast as, if not faster than, the 1080Ti in conventional rasterized gaming.

How is that a flop you fucking retard?

>my 1080 ti was about $680 at launch
>2080 ti is $1250
yeah nah, my opinion of this company is in the toilet atm.

This is the most annoying thing.
>the die is XBOX HUEG, so the price hikes are justified!
If that's the case, why not just sell GTX versions of the same cards with a smaller die and without all the extra bullshit?

S-shut up and just buy AMD already!

>HDR doesn't matter!
>RTX matters!
what do nvidia shills mean by that?

Attached: 1521536246571.png (211x239, 4K)

>paying double for a sidegrade
Do you use mommy or daddy's credit card?

I said this before the announcement, specs and all.

I said it'd be a 10-20% core increase.
Has async compute support which scales better than AMD's, but like only 15 games out of 15,000 support that shit.
No IPC per core increase, except if you're retarded enough to count async compute as an IPC increase.
No 10% perf loss from HDR.
More memory bandwidth, but the 2080 and 2070 are still likely to have less memory bandwidth than the 1080Ti (rough numbers sketched below).

Those were my predictions, and that's what it turned out to be. I didn't expect them to be absolutely retarded enough to try to snakeoil in tensor cores and some RT ASIC, but anyone with a brain can see how premature and what a waste of money that is, and that you should wait for 7nm.
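For the bandwidth point, a quick sketch using the commonly reported bus widths and data rates (treat the figures as approximate; peak bandwidth is just bus width times per-pin data rate):

# peak bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 2080 (256-bit GDDR6 @ 14 Gbps)":      (256, 14),  # commonly reported specs
    "RTX 2070 (256-bit GDDR6 @ 14 Gbps)":      (256, 14),
    "GTX 1080 Ti (352-bit GDDR5X @ 11 Gbps)":  (352, 11),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")
# -> ~448 GB/s for the 2080/2070 vs ~484 GB/s for the 1080 Ti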

>make a prediction anyone with over 85IQ can make
>wants a medal

what's the -10% thing?

Running in HDR ruins performance on current Nvidia stuff, so the lower one is HDR and the higher one is SDR.

>assumed
>no source
What the fuck?

>Nvidia's bullshit sub-FE MSRP for third parties that will never translate to the real world
>in any way meaningful
Provide a direct link to where I can pre-order a 2080 for $699. Or just save yourself the trouble, because this is the absolute cheapest card available:
newegg.com/Product/Product.aspx?Item=N82E16814500436&cm_re=gtx_2080-_-14-500-436-_-Product
Which is $749 for a shitty, plastic blower that will sound like a hairdryer. All others are priced equal to or above the Founders Edition, because no third party is going to sell their card with an upgraded PCB and cooler for less than Nvidia's basic model.

turing was a faggot anyways

Ya I know, you're right. But still got spammed by

They did the same thing with the Pascal launch. Real prices were $80-$100+ over the "MSRP".
AMD tried the same thing with Vega and got crucified despite cards actually being available for the stated MSRP on preorder.

>the whole point is you get ray tracing performance on top of the usual
Yeah, usual assfucking.
Someone please post a picture of the poo intel wojak crying with a smug intel wojak mask

amd's msrp was counting a monitor coupon or some bullshit in the price

GPUs are also hitting Moore's wall. Get ready for not much improvement going forward.

You could not find a $699 1080ti for months either

Also, the EVGA 2080 SC is $750 and it has a dual-fan cooler.

No it wasn't.
That was the post-launch pricing, where if you didn't preorder and wanted it at that MSRP, you had to get it with a monitor or games.

Being able to preorder to get them at MSRP was at least better than Nvidia's completely fake MSRPs where no card is available at that price for months.

No they aren't.
And even if they were, they shouldn't be getting disproportionately more expensive.

>Paying the same MSRP as a previous gen card for worse performance in some games

Attached: 1497937659791.png (335x557, 151K)

>You could not find a $699 1080ti for months either
Couldn't you literally order them directly from Nvidia?

They are more expensive because of rtx cores

>No they aren't.
Yes they are. It's the exact same thing as you're seeing on CPUs: instead of substantially improving the performance of the core functions, they're branching out into special-purpose hardware for accelerating certain very specific tasks. In just the same way as you're seeing AVX-512 or AES-NI on x86, you're now seeing "ray tracing" and "tensor cores" on GPUs. All because they're running into walls in the core capabilities of their respective products.
>And even if they were, they shouldn't be getting disproportionately more expensive.
Of course not, and neither was it intended to be an excuse for that.

>new technology
>ray tracing is literally a concept from the 70's

Fucking retard
They aren't hitting any wall when it comes to rasterization performance, they just put more effort into ray tracing. This technology is in its infancy and I wish Nvidia had waited until it was mature to release something. Now we get a card that hasn't improved rasterization performance that much, and that's not going to perform very well when it comes to ray tracing either.

a little early to be jumping on that train after how big the Maxwell and Pascal jumps were, bro. I'm still thinking gtx 20 is a money grab while they dump their dev budget into 7nm.

>most of driver optimization isn't anywhere near done
so let's gimp the 1080ti like we did with the 980ti, my fellow nvidiot

>muh gimping
Nice buzzword

>They aren't hitting any wall when it comes to rasterization performance
It's clearly much smaller than previous gens, and if they really wanted to rake in the gaymen money, they would've taken out the tensor cores and raytracing shit that nobody outside the professional space is really looking for and crammed it full of shader cores/TMUs/ROPs instead. The very fact that they didn't just shows you that doing so wasn't feasible, probably either because of hitting Moore's-law-related power walls, or because they can't feed it enough memory bandwidth, showing the limits of what can be done on a single piece of silicon.
>JUST WAIT

Actually, if RTG pulls their heads out of their asses and actually fixes the 4-triangle-per-clock front-end geometry bottleneck of GCN, there'd still be plenty of untapped potential in a Vega-like card. At bare minimum, even if RTG doesn't fix anything about GCN, you can expect a 7nm Navi to be faster than Vega in proportion to how much higher it clocks, since the bottleneck is per-clock.

2080 is more expensive than 1080ti you fucking nutsack

OH NONONO WE GOT TOO COCKY NVIDIA BROS

WHAT ARE WE GOING TO DO??

Attached: 78658765876876.png (125x108, 15K)

Or if they'd fucking give it more than 1 ROP per CU. Simple as that.
Vega 11CU has 16 ROPs. Vega 24CU has 32 ROPs. Both perform very well for their die size and power consumption.
If Vega 64 had at least 86 ROPs, it'd have at least matched the 1080Ti. But it was an HPC card and not a gaming card, even though a few more ROPs wouldn't take that much die space.
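Back-of-envelope fill-rate numbers for that (a sketch assuming roughly the reference boost clocks, ~1.55 GHz for Vega 64 and ~1.58 GHz for the 1080 Ti; the 86-ROP Vega is the hypothetical card imagined above):

# peak pixel fill rate (Gpixels/s) = ROPs * clock (GHz); clocks are approximate
configs = {
    "Vega 64 (64 ROPs @ ~1.55 GHz)":           (64, 1.55),
    "GTX 1080 Ti (88 ROPs @ ~1.58 GHz)":       (88, 1.58),
    "hypothetical Vega (86 ROPs @ ~1.55 GHz)": (86, 1.55),  # the card imagined above
}
for name, (rops, ghz) in configs.items():
    print(f"{name}: ~{rops * ghz:.0f} Gpix/s")
# -> ~99 vs ~139 vs ~133 Gpix/s: the imagined 86-ROP Vega roughly closes the fill-rate gap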

Engineers were betting on more effective triangles per clock with NGG, which didn't happen.

How was the 1080 compared to the 980ti?

>Engineers were betting on more effective triangles per clock with NGG which didn't happen.
It didn't happen because, as far as anyone can tell, the software team never did any development on trying to actually implement primitive shaders for gaming.

Hell, the driver team appears not to have even *started* the drivers until 3 months before hard launch.

Who on earth knows what the fuck Raja had the software teams doing for the three plus years before that.

+35% in favor of the 1080 stock.
+15% with both overclocked, but with even more than a 2x perf/watt advantage when you OC'd a 980Ti that much.

Can't help but wonder if Nvidia locked down Pascal overclocking so much because they planned these Turing GPUs 3 years ago and didn't want people to just overclock Pascal higher than Turing can reach.

There was a lot in the Vega architecture that's broken in regards to NGG. What was planned to be supported in the driver wasn't possible with the hardware.
Primitive shaders were a small part of it and will likely be dropped completely in favor of other parts of NGG.
Supposedly it's planned for gfx10. Don't know if Vega 7nm is gfx10 or if Navi is.
>Who on earth knows what the fuck Raja had the software teams doing for the three plus years before that.
It's the hardware that is flawed. NGG can't be implemented in software on it. Check the Linux drivers.

What happened is Raja was promising features they were WORKING TOWARD but that weren't in the hardware.
Then Momma Su got tired of Raja not finishing what he was paid to do, and made them release it, as it was already nearly a year behind schedule. Can't really blame her.
From what I hear, things are much better under Mr Wang.

The important benchmark is the 2070 vs 1070ti since they are very close in price.

Anyone have?

Faster in Far Cry 5 only because of fp16 (there are like two games that support it). Dunno why it's faster in BF1 tho. On par in the other two games. It's a flop. Novideo wants to show the new card in the best way possible, therefore they ask people to test it in 4K (because it has higher memory bandwidth compared to the GTX 1080) and with HDR ON (because Pascal has problems with HDR, i.e. lower performance). At lower resolutions memory bandwidth doesn't matter as much and the GTX 1080 Ti will be like 10% faster while costing less.
>b-but muh ray tracing!
GTX 2080 is on par with GTX 1080 Ti, which already struggles at 4K. Now enable RTX and you'll get silky smooth 30-40FPS. Basically this new card will be slower at every resolution except 4K, and even there only if you don't enable RTX.

By buying RTX cards at those huge prices you're basically accepting that you'll have to pay more every gen to get more performance. No more performance leaps like GTX 960 -> GTX 1060 for the same money.

take a hike, kike

Attached: 1521225232623.png (350x729, 131K)

JUST BUY IT ALREADY, YOU FILTHY GOYIM!