Nvidia's biggest blunder ever?!

how much did they fuck up this time?

>add half-baked raytracing silicon to new cards
>can't even break 40fps with raytracing turned on which means it's useless for games
>new games won't even support it fully so it will be useless for the next 2-4 years
>general performance increase over previous generations is under 20%
>invent new performance metrics because RTX sucks shit on standard ones and doesn't justify the cost

who here is retarded enough to pay $500+ extra for a card that's less than 20% better?

Attached: 1515797990441.png (523x411, 33K)


You already said all that needs to be said within the thread.
As for who will buy it: retards. Retards will

already ordered 30 1080s to resell when the stock runs out. Love making money off gamer kiddies

Despite that, they're selling like hot cakes. They have created a good marketing campaign for this generation. These cards will make Nvidia a lot of money.

fucking retard

It isn't a blunder if they make a ton of money off of early adopters and then drop prices for the rest of us by Christmas or early next year.

20% increase in performance will be fantastic when they drop prices.

>when they drop prices.
L O L

The 10-series launched at inflated prices too, and I think it was more than 6 months before Nvidia finally dropped prices, but they eventually did.

;)

more like 50% faster

>more like 50% faster
LOL'd, that never happens with nvidia cards.
also, if it were true, they wouldn't need to make up bullshit specs and benchmarks.

I'm sorry that ayymd doesn't have anything to offer my afro american ayymd fanboy.

>MUH AMD
we're talking about Nvidia, you buttmad fanboi

yes, the best cards on the market. oh, I know what's going on, you are a NEET faggot who cannot afford these amazing cards, poor faggot

You're right. The generation-to-generation performance increase has averaged about 30% per upgrade over the last 5 or so generations.

The 10-series was an exception, offering a 50% improvement over the 9-series, at least for the 1080, but I don't see this generation offering anything more than 30%.
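Quick way to see why that gap matters: compound the per-generation gains. Sketch only; the 30% and 50% figures are from this post, the 4-generation window is purely for illustration.

```python
# Compound per-generation GPU gains. 30% and 50% come from the post above;
# the 4-generation window is purely illustrative.
def cumulative_gain(per_gen, generations):
    """Total speedup after `generations` upgrades at `per_gen` gain each."""
    return (1 + per_gen) ** generations

print(round(cumulative_gain(0.30, 4), 2))  # ~2.86x over 4 typical generations
print(round(cumulative_gain(0.50, 4), 2))  # ~5.06x if every gen were like Pascal
```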

Attached: x80-genj9ard.png (2559x1398, 491K)

My 980 is still chugging along pretty well considering I don't plan on going 4k for a good, long time.
Considering the sorry state of pc gaming, I see no reason to jump in as an early adopter. nVidia's pricing antics are a bigger upgrade deterrent than Microsoft's DirectX antics.

Attached: 1419561831875.png (673x708, 237K)

this graph is what i've been looking for to make a case against buying the new cards when the real benchmarks come out. do you have a source?

AdoredTV

dsogaming.com/news/nvidias-performance-improvement-per-generation-has-dropped-from-60-pre-2010-to-30-post-2010/

Imagine larping as a retard.

The first card specs are officially up

>Palit
>boost clock 1650 MHz
>GeForce® RTX 2080 Ti GamingPro OC
palit.com/palit/vgapro.php?id=3006&lang=en&pn=NE6208TS20LC-150A&tab=sp

>Gainward
>boost clock 1650 MHz
>GeForce® RTX 2080 Ti Phoenix "GS"
gainward.com/main/vgapro.php?id=1016&lang=en

So you can expect around 1650 MHz from every other OC card. This means they are all a little bit faster than the "normal" boost clock of the "founders" Nvidia cards.

But the prices right now are just shit. Literally every fucking reseller just uses the founders edition price plus a markup...

>can't even break 40fps with raytracing turned on which means it's useless for games

at 4k?
are we pretending everyone goes for 4k?

Nope, at 1080p. It was dropping to the 30s in some games at 1080p. The UE4 demo was 22fps, too.

Why didn't they put AMD and Nvidia in the same picture?

This is pretty in line with my claim that performance per dollar has increased by about 25-40% every 8 months, at least up until this generation, which is a DECREASE in performance per dollar.
I think that's even true for the 9800 GTX vs the 8800 GTX because of how much cheaper they were.

Attached: Nvidia rtx demo ray traced shadows.jpg (600x1152, 149K)

>The first card specs are officially up
they're up on nvidia.com. but those are almost useless without some real performance figures.

Attached: 1511109352329.jpg (2185x2143, 296K)

>blunder
They're going to sell incredibly well and make Nvidia a ton of money, so no. And all the retards will upgrade AGAIN when the real 7nm cards come out in 12-18 months. Nvidia would be stupid NOT to keep milking their idiotic fanboys.

>>general performance increase over previous generations is under 20%
There's a chance it could actually be even less than that. We won't know for sure until some competent review outlets release their data.

>They're going to sell incredibly well
LOL'd. Not when all the gaming sites start doing the performance testing and expose the lies and bullshit GIGARAYS and other nonsense.

>Why didn't they put AMD and Nvidia in the same picture?

I think it's because it was AdoredTV's two part history of Nvidia.

Part 1:
youtube.com/watch?v=_j6TiSdKT0A

Part 2:
youtube.com/watch?v=Jd1bp9eSfwo

So the focus of the charts is on Nvidia and its progress over time. Would love to see someone map ATI/AMD's progress on the same graph over the same time period.

The only official price drop Pascal ever got was when the 1080 Ti arrived. Since the Ti is there at launch this time, there's no reason for them to reduce prices in the future.

You're delusional if you think these aren't going to sell incredibly well. People who want the latest and greatest don't care about any of that and will pay just to have the new shiny thing. Nvidiots have never cared about the actual performance of their cards, which is why they still outsold AMD even back during the dark days of Fermi when their cards were objectively inferior in every way. This is quite literally the future PC gamers chose back when they had a chance to stop it.

Yeah I'd love to see both AMD and Nvidia relative increases AND price:performance on the same graph.

>You're delusional if you think these aren't going to sell incredibly well.
are you nuts? they're WAY overpriced and gaymers will quickly realize that all the ray tracing they paid for is TOTALLY useless.

Insane price + Useless tech = mediocre sales.

>who here is retarded enough to pay $500+ extra for a card unless he is a coin miner
Fixed that for you. There is no reason to upgrade.

$200 ps4 can play gow at 4K 60fps
$2000 vga can't even hold 30fps at 1080p

>add half-baked raytracing silicon to new cards
what's half-baked about it?
>can't even break 40fps with raytracing turned on which means it's useless for games
Sort of. Remember when AA was a huge deal, remember when AO or physics were a huge deal, etc. Just wait and see how it goes on final hardware and software. It will take time to optimise and get right.
>new games won't even support it fully so it will be useless for the next 2-4 years
As with all bleeding edge tech. I could say the same about HDR. Only people sitting on a heap of cash should even consider jumping into this generation
>general performance increase over previous generations is under 20%
What else would you expect? Wait for 7nm
>invent new performance metrics because RTX sucks shit on standard ones and doesn't justify the cost
meh

based

Is this the 2nd coming of GeForce FX?

Attached: DlTYo8LU8AAGz37.jpg (846x1200, 134K)

I think the crazy pricing is a result of Russian collusion.

I don't give a flying fuck about the gaymurr graphics card wars. All I care about is that I will finally get to mess with GPU computing and machine learning by this Black Friday.

>t. scientific researcher, whose work is actually beneficial and useful to society.

>can't even break 40fps with raytracing turned on which means it's useless for games

Retard, ray tracing is harder than you think.
Nvidia managed to increase the performance hundredfold from 1 frame each 2-3 MINUTES to 30-40 frames per SECOND with some extra silicon. I can guarantee you that the technology will be mature enough in 2-3 generations and there will be no performance hit
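Quick arithmetic on those numbers (offline = 1 frame per 2-3 minutes, realtime = 30-40 fps, both straight from this post): if anything, "hundredfold" undersells it.

```python
# Implied speedup from the figures in the post above: going from
# 1 frame per 2-3 minutes offline to 30-40 frames per second realtime.
def speedup(seconds_per_frame_before, fps_after):
    # old rate was 1/seconds_per_frame_before fps, so the ratio is:
    return fps_after * seconds_per_frame_before

print(speedup(120, 30))  # low end: 3600x
print(speedup(180, 40))  # high end: 7200x
```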

nice try NSA

Back to /v/eddit
It's about fucking time that we professionals who can actually make use of GPU technology have access to it, instead of foul degenerates like gaymurrs and shitcoin miners wasting all that processing power and energy on their mindless futility.

Don't forget that we contribute more to society in a single day than gaymurr manchildren ever did in their entire lives. We're better than you, never forget that.

The only thing you contribute to is your parent's growing disappointment.

so why release it when it's not ready?

video cards were literally invented to play video games you cuck
go do another useless study on something everyone already knows

Which were the GPUs that Nvidia kept rebranding, holding DX9 back since the rebrands didn't support DX9 and we were stuck with shitty shaders forever?
The FX2, I think?

These are that, because they're going to hold us back a few years from now.

>Nvidia managed to increase the performance hundredfold from 1 frame each 2-3 MINUTES to 30-40 frames per SECOND
No they didn't.
AI ray tracing and AI denoising in real time existed before Volta and Turing.
They increased it maybe 2-4x. Still far from good enough.

we're 2 years into the 10 series and we're only now hitting sub-MSRP prices. They will recreate this scenario even without the shitcoin boom, because they have zero reason not to while amd has no competitive product in any area.

>amd is still selling rebranded overclocked 280s in the current year.
the absolute state

There was no FX2, it was the 2nd gen fx: 5900, 5700 and 5500.
The most rebranded series were the Tesla 1.0 series: geforce 8, geforce 9 up to the gts 250

Yeah, I'm thinking of something like that. Didn't some Geforce4 get rebranded as Geforce5 despite Geforce4 only supporting Dx8?

Or maybe I'm thinking of Dx10 that Nvidia held back with rebrands.

No they aren't.

GTX 1080s were $400-$480 around March of 2017.

>best selling amd card of the last year is the 580
a rebranded 280 lmao.

Big brainlet here, I thought that ray tracing was already massively used even in real time. I'm looking at wikipedia and it's fucking interesting how many different techniques exist for gaymers and cartoons for basedboys.

It's still gonna sell and you know it

Attached: 1424995261923.png (500x340, 85K)

That's a pretty good drawing

Polaris is not a rebranded 280. That's a fact.

You're just upset that GCN was so good that AMD hasn't needed to change drastically until post Polaris.
Whereas Fermi and Kepler were fucking garbage.

It is. But I can't get behind any /ss/ where the boy is stupid enough to want an RTX card.

Actually, the reason they priced it at $1200 probably has a lot to do with market research showing that it won't sell well, so they need to make more margin per unit.

We often call some things ray tracing that you wouldn't think of as ray tracing.
Like VXGI and other voxelized ray tracing GI implementations.
Also planar reflections kinda? You need a raycast to get the angle iirc.
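The planar-reflection case really is just one reflected ray per surface: bounce the view direction about the normal. Minimal sketch in plain Python, no engine API implied:

```python
# Minimal reflection-vector math as used for planar reflections:
# r = d - 2*(d . n)*n, where n is a unit surface normal.
def reflect(d, n):
    """Reflect direction d about unit normal n (both 3-tuples)."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# A ray heading down-and-forward bouncing off a floor (normal +Y):
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # -> (1.0, 1.0, 0.0)
```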

No NV Link for RTX 2070 buyers.
youtube.com/watch?v=QoePGSmFkBg

>Turn on DLSS to upscale 1080P to FauxK
It's twice as fast!
>Turn on HDR to gimp the GTX 1080 Ti
It's 1.5X as fast!
>Cherry pick titles that use HDR or have async support
Look at these games running at 78 FPS! In FauxK!
>Hide the specs used and hide the min framerates
You don't need to know that goyim!
>Turn on RTX and get a 100% performance hit just to make some shadows and reflections nobody cares about look prettier.
But it's realtime raytracing!
>Hide the fact that the Star Wars demo is running at 22 FPS.
But look how great the reflections are!
>You won't be getting that level of anything for the next 5 years.
Buy now goyim! The more you buy! The more you save! It's only $68,000 in easy monthly payments!

>NV Link? Why do you want NV Link on a midrange GPU?
Buy the RTX 2080 Ti (or better still our fabulous line of quadros) if you need to do fancy rendering goyim!

I have 2x 980ti in my machine, should I upgrade or wait?

Wait

Upgrade to 1080Ti when the prices drop.

>This is quite literally the future PC gamers chose back when they had a chance to stop it.
muh nvidia boogeymen

Nah, it is the Geforce 3 launch all over again.

Blame the mining craze and the lack of competition in the high-end gaming market for the new price points.

The mining craze was an accidental market test of whether buyers would stomach higher price points. Non-miners were still buying 1070s, 1080s and 1080 Tis despite the inflated prices at the peak of the craze. Nvidia shareholders want a piece of the pie instead of leaving it to etailers/resellers.

Nvidia also wants to continue selling its older Pascal stock at current prices and retain its margins. They want budget-conscious buyers to soak up all of the Pascal stock until lesser Turing SKUs take over those price points.

Wrong, the RTX SKUs are going to sell like crazy for high-end cards. There are more than enough buyers for them. The mining craze proved this. Besides, the yields on these chips are probably crappy anyway and GDDR6 is expensive as fuck. I wouldn't be too shocked if TU104 and TU102 cost just about as much as Vega 10 to make.

>There are more than enough buyers for them
With a GPU die that big?
No.

There are miners who will still buy them for altcoin mining due to the faster memory. In places where electricity is cheap, or in some cases free, it pays for itself.

I don't remember most of these early geforce launches because my 9700 pro lasted so damn long before I needed to upgrade to an 8800 gt.
But I guess Geforce3 actually preceded that by a year. Thank god I wasn't making a PC then.

>The mining craze was an accidental marketing test that tested the waters to see if the market was willing to stomach higher price points
I've been meaning to mention this, but didn't get a good opportunity.

GAMERS were paying $1200 for the 1080Ti during the mining craze. Miners were NOT paying that much.
It gave Nvidia a lot of market data on just how absurdly they can price a card that is only a little bit more than twice as good as a $250 card.

The usual thinking as a consumer is that the more you pay, the more value for your money you get. Even if that's not always true, bulk buying teaches us that.
Yet with GPUs, people who just have to have "the best" will pay 5x more for double the performance.

Personally, I'd easily pay $1500 for a graphics card. Even $3000. But I expect that $1500 graphics card to be around 3x more powerful than the $500 one. Yields on large dies make it sound like this wouldn't be possible, but the die isn't the only cost in a graphics card.
Say on a 1080 the die is $80 of the cost, memory is $50, and the pcb and its components are $40.
That's $170 total.
Say you double the die size and widen the bus, but keep the same amount of memory since gayming doesn't need that much. Now your yields are far worse, so let's call it a $240 die cost. That's still just $330 total instead of $170. Okay, maybe $350 to improve power delivery. You could easily price that twice-as-powerful graphics card at $900 instead of $400 and it'd still be decent fps:$.
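The back-of-envelope above, written out. Every dollar figure here is the post's guess, not real BOM data, and the $20 power-delivery bump is an assumption on top:

```python
# Hypothetical BOM model from the post above. None of these are real figures.
def card_cost(die, memory, pcb):
    return die + memory + pcb

base = card_cost(die=80, memory=50, pcb=40)   # the guessed 1080-class card: $170
big = card_cost(die=240, memory=50, pcb=40)   # 2x die, worse yields: $330
big += 20                                     # assumed power-delivery bump: $350

# At $900 for 2x the performance vs $400 for 1x, fps per dollar stays close:
ratio = (2 / 900) / (1 / 400)
print(base, big, round(ratio, 2))  # 170 350 0.89
```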

>I wouldn't be too shocked if TU104 and TU102 cost just about as much as Vega 10 to make
Probably more, really. GDDR6 probably costs about as much as HBM2, and they have even larger dies although no such interposer.

How long before someone makes an RTCoin for the ray tracing compute engine?

Hard to say if that is even possible but we'll see.

Will a single 1080ti outperform 2 980tis?

Not many games support SLI, so it might be worth the switchover in that sense, but I don't think it is faster than two 980 Tis in SLI. Then again, SLI means you only get about 80% out of the second GPU, so maybe. I can't be bothered to do the math.

tomshardware.co.uk/answers/id-3360606/980ti-sli-gtx-1080ti.html

So yes and no.
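Ballparking that comparison: the ~80% second-GPU scaling is the guess above, and the 1.75x 1080 Ti-to-980 Ti ratio is a hypothetical round number, not a benchmark.

```python
# Rough SLI-vs-single-card comparison. sli_scaling is the ~80% guess from the
# post; speed_1080ti is an assumed ratio vs one 980 Ti, not a measurement.
def sli_effective(single_card, sli_scaling):
    """Effective throughput of two cards when the second scales imperfectly."""
    return single_card * (1 + sli_scaling)

two_980ti = sli_effective(1.0, 0.80)  # ~1.8x one 980 Ti, when SLI works at all
speed_1080ti = 1.75                   # hypothetical, varies a lot per game
print(two_980ti > speed_1080ti)       # close call either way
```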

Attached: RTX's and Morty, Season 2000.png (916x2844, 965K)

20%? fuck you
NVIDIA told us in a very nice chart that the shitty 1080 is only ONE while the 2080 is more than ONE POINT FIVE in nvidia card goodness units, and that is more than 50%
get your facts right faggot

Attached: 17361553_1296661707087694_3643969592264492630_n.jpg (261x139, 7K)

blame it on 0 competition. NoVidia has free rein

100% true. It's a prime example to kids that don't understand that a monopoly is always a disadvantage to consumers.

First they stretched releases to every two years; now prices are up like oil prices in the 70's.

Attached: Nvidia_Price_evolution.png (845x639, 64K)

trips don't lie

Attached: 1519411398947.jpg (1024x499, 50K)

>AMDniggers with their shitty pre rendered shadows in full damage control mode

Attached: 3.jpg (753x724, 117K)

I don't think so because tensor cores only do such specific ops.

Damn. 780Ti was $700 despite being so shit and demolished by the 290X?
All that does is prove that competition doesn't matter. People still bought that shit despite it being so inferior.

Also wtf the 1060 wasn't $200 MSRP ever.

You guys love shitposting without any facts. These cards are a worthy upgrade and prices will normalize to at least msrp after the initial early adopter rush to buy. I can’t wait to get a 2080 for $700.

Consoles play at 20-30 fps, retard

>tensor cores only do such specific ops.
>what is RTX

You want me to explain what RTX is to you?
It's a minor evolution of the Volta architecture which has a sparse voxel octree accelerator ASIC in addition to the Volta tensor cores (which only do a mixed-precision multiply+add op on 4x4 matrices) and Volta async compute.
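For anyone wondering, that tensor-core op is just a fused D = A×B + C on 4x4 tiles. Plain-Python rendering of the math (the real hardware takes A and B in fp16 and accumulates in fp32; no CUDA implied here):

```python
# The 4x4 fused multiply-accumulate a tensor core performs: D = A*B + C.
# On real hardware A and B are fp16 and the accumulate is fp32; plain
# Python floats stand in for both here.
def mma_4x4(A, B, C):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) + C[i][j]
             for j in range(4)] for i in range(4)]

I = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
Z = [[0.0] * 4 for _ in range(4)]
print(mma_4x4(I, I, Z) == I)  # identity * identity + zero == identity
```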

No, I mean the RTX cores.

That's not the point of cryptocurrency you dumbass.

???
There are no RTX cores.
Stop making shit up just because you don't know what you're talking about.

>nvidia does the single biggest graphical leap since 3D rendering
>AMD shills try to downplay it as being nothing more than some softer shadows with a big FPS dip
>meanwhile AMD can barely reach 1080 levels of performance

Cringe.

Attached: 1521382630498.png (1070x601, 463K)

For now what they showed us is that they do raytrace-like reflections and shadows.

>hey, what if we do raytracing instead of rasterization?
>but raytracing is too costly and we advanced a lot in rasterization where we can almost reach raytracing level of visual fidelity
>but we can charge double the price!
>do it fampai

It's sad in hindsight when you think about "muh async" spam.
>it's revolutionary ONLY if it's amd
literally the appletard mentality

Because AMD has had something similar for ages called Radeon Rays. It's just not as refined and their hardware is probably not capable of realtime (*COUGH* yeah right. Enjoy your 30 FPS shadows) yet.

so, no different than the introductions of the T&L engine or FSAA then
ok

And yet Nvidia are adding async compute to RTX. Keep shilling though, I find it amusing.

>Radeon Rays
Damage control shit for investors after being BTFO.

Is that what the R stands for?

Yeah, vega cards have that R.

how much did they fuck up this time?

>add half-baked vector 3d graphics silicon to new cards
>can't even break 10fps with hardware rendering turned on which means it's useless for games
>new games won't even support it fully so it will be useless for the next 2-4 years
>general performance increase over previous generations is under 20%
>invent new performance metrics because Trident sucks shit on standard ones and doesn't justify the cost

who here is retarded enough to pay $500+ extra for a card that's less than 20% better?

Attached: maxresdefault.jpg (1280x720, 192K)

AMD are doing fine thanks. Their Zen line will keep them around to work on their GPU division.

>implying gaymers care about muh fps
If anything, this'll convince a bunch of the "muh grafix" console bros to go out and get a pc for these features

ITT: AMD shills in damage control.

Attached: NV-GeForce-RTX-2080-Performance.jpg (2560x1440, 124K)

If it's 5% faster, gaymers will buy it.

I dunno, I still think wood screws was funnier.