Why was Nvidia's GTX900 series superseded so quickly by the GTX1000?

Attached: .jpg (1600x1105, 617K)

it's called evolution

Attached: 2018-05-30 23_29_12-6-Gen GeForce GTX x80 Comparison_ GTX 480, 580, 680, 780, 980 & 1080 - YouTu (1600x644, 228K)

The better question is why I can still find shitty new Kepler GT cards but no new 700- or 900-series GTX cards?

If you're going to get a 600 series, go for a 680 with the blower cooler; it's the only one that doesn't have shit caps.
Your best bet is eBay or Amazon.

Also check the caps on those old ones.

Interesting...

AMD had been hyping up Vega, so Nvidia reacted quickly, but Vega turned out to be a flop, so now we're stuck with the 10 series for another year.

Haven't prices increased too?

(OP)
Sorry, replying to myself.
I forgot to answer OP.
It's the switch from GDDR5 to GDDR5X.
There will be an even bigger jump with HBM2 or HBM3.
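To put some rough numbers on the GDDR5 → GDDR5X → HBM jump: peak bandwidth is just per-pin data rate times bus width. A minimal sketch, using approximate reference-card figures (7 Gbps/256-bit for the GTX 980, 10 Gbps/256-bit for the GTX 1080, one 1024-bit HBM2 stack at 2 Gbps) as illustrative assumptions:

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
# The example figures are approximate reference-card specs, assumed for illustration.
def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Return peak theoretical bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(7, 256))    # GTX 980, GDDR5:   224.0 GB/s
print(bandwidth_gbs(10, 256))   # GTX 1080, GDDR5X: 320.0 GB/s
print(bandwidth_gbs(2, 1024))   # one HBM2 stack:   256.0 GB/s
```

A single HBM2 stack already rivals a full 256-bit GDDR5 bus, which is why multi-stack HBM cards jump so far ahead in raw bandwidth.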

The difference is that AMD doesn't have high ROP counts; they're still stuck at 64 ROPs,

while Nvidia has 88 ROPs. That means the GDDR6 generation should bring a big jump despite slower memory.
Moreover, ROPs are the biggest factor for FPS, hence why AMD is still behind.
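As a rough sanity check on the ROP argument: peak pixel fill rate scales with ROP count times core clock. A back-of-envelope sketch, where the clock speeds are illustrative assumptions rather than exact specs:

```python
# Peak pixel fill rate (GPixels/s) = ROP count * core clock (GHz).
# Clocks below are illustrative assumptions, not exact card specs.
def fillrate_gpix(rops, clock_ghz):
    return rops * clock_ghz

amd_64rop = fillrate_gpix(64, 1.5)   # ~96 GPixels/s
nv_88rop  = fillrate_gpix(88, 1.6)   # ~141 GPixels/s
print(amd_64rop, nv_88rop)
```

Even with comparable clocks, the 88-ROP part has roughly 40% more raw fill rate, which matters most at high resolutions and high FPS.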

Makes sense..

The 980 is still the best card if you're running a CRT. The 10 series dropped VGA support.

~18 months is normal for GPU generations. 20 months between the 900 series and 1000 series is perfectly normal.

>>66155050
28nm -> 16nm

+increased clocks + increased number of cores + ram ( same power draw )
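The 28nm → 16nm shrink is most of the story. Assuming feature sizes shrink linearly with the node name (a rough approximation, since modern node names are partly marketing labels), ideal transistor density scales with the square of the ratio:

```python
# Ideal area scaling from a node shrink: density gain ~ (old / new)^2.
# Treat this as a back-of-envelope sketch; real node names are partly
# marketing labels, so actual density gains differ.
old_nm, new_nm = 28, 16
density_gain = (old_nm / new_nm) ** 2
print(round(density_gain, 2))  # 3.06 -> roughly 3x more transistors per mm^2
```

That headroom is what let Pascal raise clocks and core counts at the same power draw.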

THIS

Kepler March 2012

Maxwell Sept 2014

Pascal May 2016

Nvidia always released new GPUs in 2 year cadences
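The gaps in the list above are easy to check. A quick sketch using the approximate launch dates from the thread (day-of-month assumed as the 1st, since only month and year are given):

```python
from datetime import date

# Approximate consumer launch dates from the thread; exact days are assumed.
launches = {
    "Kepler":  date(2012, 3, 1),
    "Maxwell": date(2014, 9, 1),
    "Pascal":  date(2016, 5, 1),
}

gaps = {}
names = list(launches)
for prev, cur in zip(names, names[1:]):
    months = (launches[cur].year - launches[prev].year) * 12 \
           + (launches[cur].month - launches[prev].month)
    gaps[(prev, cur)] = months
    print(f"{prev} -> {cur}: {months} months")
# Kepler -> Maxwell: 30 months
# Maxwell -> Pascal: 20 months
```

So the "2 year cadence" is only approximate: Kepler → Maxwell took 30 months, Maxwell → Pascal only 20.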

The longer nvidia waits to release the 11 series, the more people get desperate for them. There's more people who are waiting for an upgrade, or have shitty old cards on their last legs. So when nvidia comes out with their $1300 gtx 1180s, people will lap them up like the dogs of Caesar.

Attached: giphy.gif (235x240, 1.85M)

Do you enjoy sucking off anons with your posts?

>cadences

The correct word is intervals.

you mean 980Ti

Why are GPUs so much more expensive than CPUs?

Because memory cost adds up, especially when the DRAM cartel is colluding to raise prices

>DRAM cartel

Just say it's the chinks and gooks. Because they are the ones driving the prices up.

see
Pascal is so far and away technically superior to Maxwell, and at the same time humiliated AMD's "next-gen" Vega.

On top of that, Volta in the Titan V already looks like a massive upgrade over Pascal, which is probably why Raja got fired from AMD.

Get ready to pay 2000 ameribucks for the GTX 1180, before mining drives it up.

They should also drop dvi and make the GPUs a little thinner.

>$1300
>with AMD releasing the steaming pile of shit called Vega
$2000 and it'll still fly off the shelves.

Probably some kind of hardware-related exploit that hasn't been discovered by the public yet.

They won't drop DVI for literally 20+ years now. Too many monitors use it and HDMI cant do more than 75hz or some thing.

His answer is simple and true:
just a node shrink and some optimization.

HDMI is also shit. We should've migrated to only using DP by now.

>and HDMI cant do more than 75hz or some thing

Attached: 1513845941223.jpg (665x574, 29K)

theverge.com/circuitbreaker/2017/11/28/16710568/hdmi-new-specification-10k-resolution-future-proof-standard
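For anyone who actually wants numbers: uncompressed video bandwidth is roughly width × height × refresh × bits per pixel. A minimal sketch (ignoring blanking intervals, so real link requirements are somewhat higher):

```python
# Rough uncompressed video bandwidth in Gbps:
# width * height * refresh rate * bits per pixel, ignoring blanking
# intervals, so real link requirements run somewhat higher.
def gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

print(round(gbps(1920, 1080, 144), 1))  # ~7.2 Gbps
print(round(gbps(3840, 2160, 60), 1))   # ~11.9 Gbps
```

1080p at 144 Hz needs only ~7.2 Gbps, well inside HDMI 2.0's ~14.4 Gbps of usable data throughput, so "HDMI can't do more than 75 Hz" hasn't been true for a long time.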

that's 1.5 years, not 2 years

It's just the gooks, the chinks don't make DRAM.

>HDMI cant do more than 75hz or some thing

Attached: sign of kindness.jpg (848x480, 93K)

hotchips.org/program/

>NVIDIA’s Next Generation Mainstream GPU

SOON

I didn't know that. I'll have to hold onto my old 760 and use it for muh emu crt.

This. 16nm is an extremely good node. Nvidia just updated some stuff on Maxwell and released it as a refresh.
There's no need to produce the same arch on an old node if yields are good.
I think Raja just went for Intel's money; he wasn't fired. Now that Intel wants to build better GPUs, he probably wanted to go there.

Because the more you buy, the more you save, goy.

>BUTTMAD AYYMDPOORFAGS WITH NO GPUS AND NO DRIVERS DETECTED

>TFW
Should I just wait for the 1160? I don't want to spend more than 200 €.

Attached: 1527715458672.jpg (496x185, 14K)

>No scale
>No units
Fuck off Tim Cook

The 1100 series won't have a card under 200€ thanks to miners.
And no, MSRP doesn't count.

Nigga if they're gonna announce it "soon" it'll be Computex.

Because of the 3.5GB 970. Nvidia needed to make some quick cash to pay for the class action.

they demand it at increasing levels... basic economics.

Let me guess, all of those were benchmarked in games launched after Maxwell?

To make more money. The 1000 series will be superseded by the 1100 series just the same way.

youtube.com/watch?v=jprKr8fgzHM

Because NVIDIA wanted people to forget about their "3.5GB instead of 4GB" scam

You can always get a converter.

>analog to digital

Wouldn't that cause a bad input lag?

3.5

Source on the graph? I want to see how the '60 series compare among generations.

It's a complete fucking lie to say it had 3.5 gigabytes. It's not true. It's just not. The thing had four fucking gigabytes.
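For reference, the card did have 4 GB physically; the dispute was about how it was wired up. A sketch of the split Nvidia disclosed after the controversy:

```python
# GTX 970 memory layout, per Nvidia's post-controversy disclosure:
# 4 GB total, split into a fast 3.5 GB segment on a 224-bit path and a
# slow 0.5 GB segment on the remaining 32-bit path.
data_rate_gbps = 7                    # GDDR5 effective data rate
fast_bw = data_rate_gbps * 224 / 8    # bandwidth of the 3.5 GB segment
slow_bw = data_rate_gbps * 32 / 8     # bandwidth of the 0.5 GB segment
print(fast_bw, slow_bw)               # 196.0 GB/s vs 28.0 GB/s
```

So all 4 GB are addressable, but the last 0.5 GB runs at a seventh of the bandwidth, which is why both sides of this argument have a point.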

Nevermind, found it, seems HU didn't compare the '60 series.

Where would one buy this HDMI 2.1 cable? It seems far easier to buy a DisplayPort cable, although it's not easy to get a motherboard with DP, or a cheap graphics card with one.