THANK YOU BASED NVIDIA

nvidia.com/en-us/geforce/news/nvidia-geforce-gtx-1660-ti/

nvidia.com/en-us/geforce/news/anthem-game-ready-driver/

THANK YOU BASED NVIDIA

Attached: geforce-gtx-1660-ti-up-to-3x-faster.png (3330x1704, 316K)

that's actually a pretty nice way to display generational performance increase
totally in games with a bias towards the 1660 tho

Low-end 2060 or high end 1660?

>Up to
>Inaccurate, purely marketing graph
Dropped.
Unironically wait for reddit's performance analysis.

Same post on multiple boards. Thanks nvidia.
It shows nothing

A pre-crypto refreshed 1070

>choose games that hit the memory-chugging point on anything less than a 960 (2gigs)
>suddenly a remarkable improvement
good goys

Lol, a $300 low-midrange card. Fuck off nvidia and amd

The Vega 56 already dropped to this price point

The 56 sucks but at that price it's a blowout

Gee willikers user thanks for this informative post - I was just about to replace one memory crippled gpu with another!

Attached: 1550332869302.png (1064x698, 321K)

At $280 vs $280 it's stupid to buy the 1660

Attached: 1440p_Vega.png (1324x1161, 57K)

No it didn't.

It was a one-time rebate while supplies lasted. Supplies are gone.

>8% slower

When the VII is 8% slower than the 2080 it's a total blowout, AMD destroyed, etc. etc. When the 1660 Ti is 8% slower than the V56 it's too close to call, just wait for drivers, etc. etc.

In short, don't use facts on Jow Forums.

Attached: 1550846503985.png (500x254, 21K)

Performance per watt never mattered until Nvidia said it does. This isn't a laptop part, and people don't game for 16 hours a day as a rule, so power consumption is totally irrelevant.

Oh, and unless you are Harry Potter, no, the difference in heat output will not affect room temp in any meaningful way.

>BUTTMAD AYYMDPOORFAGS DETECTED

Spot the Poozen retard

Newfags pls go and stay go. When Fermi (the GTX 480 in particular) was a nuclear fire, it was ignored precisely because power consumption on the desktop doesn't matter. Even the 580 respin ran hot and sucked juice, but nobody cared. When fat Kepler was comparable to Hawaii for power draw, nobody cared. The second Maxwell dropped and Nvidia massively reduced power draw, suddenly power draw became THE defining metric for a GPU, and it has remained oh-so-important today.

Hell, the 2080 Ti only gets a pass because it is fast as fuck.

>3x price
>3x performance
okay

>hehe im gonna replace a part of this cpu name with poo that'll show them
you must be over 18 to post here

A high-end 2080 Ti saves you from this kind of question.

RX590 lol

960 was a piece of shit tho. Probably the worst x60 series card ever.

That's just not true; the 960 was the only 900-series GPU to have pure hardware 4K HEVC decode.
Other 900-series GPUs had hybrid decoding.

1080 Ti is the way to go; you'd be dumb not to.

Power consumption ceased to be a major concern for performance GPUs back with the GeForce FX series and the Radeon X1xxx series. As long as it delivers performance, nobody except the silent/near-silent PC crowd gives a shit.

It still remains true today. It's the only reason 600W+ PSUs are even commonplace for non-workstation/server builds.

SLI 1080 Tis reporting in
Mined crypto with these cards

I don't care how good this card's price/performance ratio is, it's still at a threshold where if games get a little more demanding they render it obsolete. The RTX 2060 has more headroom to stay relevant for longer.

Yeah, plus this 1660's price is closer to the 970's than to the 960's.

>2/3 of geforce gamers have gtx 960 performance or less
I knew most nvidia shills are damn poor, including OP

My GT 1030 is not for gaming.

>used 1070 are at half the price and keep on falling
Meh, why should I care? My GTX 1060 is enough and I don't even play games weekly.

>most nvidia shills are damn poor
Yep. None of the bigmouths run 2080 Ti's. Most likely a Geforce 900 series or a 1060.

going to buy ryzen then?

Attached: 9900khousefire.png (1112x833, 74K)

this

Imagine choosing to buy this 6GB shit instead of the 8GB 1070.
HAHAHA

I run a 2700x with a 1080TI

>implying you can't use nvidia with amd
what a retarded amd shill holy shit

I wonder if 256-bit AVX in Zen 2 is gonna hugely increase power consumption in this test.

i bought a 680 then a 960 then a 1050ti

not defending the GTX 1660 Ti, in fact quite the opposite, but IMAGINE nvidia providing driver support for the 1660 Ti and leaving the 1070 and others to rot

you know it will.

I think it'll have a 256-bit FPU too, so floating-point power consumption is gonna go up. At least from the early previews it isn't as bad as a 9900K.

i can imagine

Retard here, can someone explain why this card was named as it was?

no one knows
nvidia likes to do that
and then somehow blame AMD for confusing names like they did with the GPP

because it's newer and better than the 1000 series cards, and it's not got the ray tracing capabilities of the RTX 2000 series.

Been wondering this as well. I’m just fuckin confused now.

then why not 11xx?

Gaymersnexus covered Nvidia's naming discussion at the start of their review, give them a click

>release new generation of cards
>they're received so poorly that you actually start reviving the previous generation
And AMD -still- can't compete.

Why not 1100 with the RTX?

Who cares?

Why did AMD call their new GPU Radeon VII?


Why has AMD basically re-released the same GPU with changed numbers ever since the R9 290X in 2013?

It's to fool kids with the bigger-than-Pascal number, but at the same time they didn't want to infringe on RTX territory.

It is a refresh of the mid-range segment, intended as an upgraded "1060".

Nvidia wants to keep the 2xxx series as RTX-only branding.

11 is a shitty marketing/branding number.

That's not the current marketing line. People act as if an 80W difference is going to destroy your power bill and turn a given room into a sauna; neither is likely to be true.

It's less reviving the old gen (because it's actually based on the new-gen cards) and more a retarded naming scheme. Why it couldn't be a 1160 or just a GTX 2060, nobody knows.

Why would it be GTX2060 when it has fewer CUDA cores?

GTX 2050 then

They did give an explanation to Gamers Nexus. Basically, 1160 sounds very close to 1060, but that's misleading: it's closer to the 2060 than the 1060, so they named it 1660. As for GTX 2060, that would make things confusing.

Nvidia isn't going to compromise on the ray tracing future. Regular GTX cards will be pushed back.

Attached: 987989.jpg (782x256, 123K)

The only thing it does is make noise management a PITA if you desire near-silence.

The rest of the fags don't give a shit until the fans sound like a server room rackmount unit.

TU116. The problem is, the upcoming GTX 1650 will be on a TU117.

>comparing to 4+ year old cards because it looks so bad compared to last gen
based

This, too. Though the left part of their graph makes sense, there are better alternatives to paying $280 for this level of performance.

>fortnite
cringe.
Also, I find it hard to believe that the 1660 Ti beats Vega 56 in Apex Legends. That game heavily favors AMD, though I guess more so at 1080p than at 1440p for some reason. I'd imagine a driver update is going to make it a blowout in AMD's favor at 1440p as well.

It had been at $250-$350, plus 3 games, for months. Then it finally sold out when people saw how disappointing the 2060 and 1660 Ti are and bought up the remaining supply.
Makes you think.

Really good 1440p combo.

wow i wonder why that's so easy to imagine.

>retard here, please explain
Nvidia employees and executives, outside of their engineers, are even more retarded than you nowadays.

Based

>to rot
Nah, they will continue to issue drivers for the 1070; said drivers will just hinder performance.

Meh. I’m holding out for the GeForce 1488Ti.

JUST FUCKING BUY IT YOU STUPID GOYIM

>GTX 960 1 performance
>GTX 1660 3 performances
OH NO NO NO NO HAHAHHAHHAHA

better than AMD's no performance ;^)

the X in amd 480x cards stands for multiplication by my digits

Check them

Well you rolled it yourself. Zero performance.

To fool people into thinking that it's a 2060 without ray tracing, when it's just the x50 Ti card rebranded.

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

Attached: 1550935780947.gif (372x340, 3.84M)

performance per watt never mattered because it was never such an insane difference before, to the point where one brand is quiet and the other makes your computer go VRROOOOOOOM like an FBI helicopter hovering above your house

Quit your 0-1-2-3 charts. Use real FPS if you're going to shill.

My Skylake-X and 1080ti machine makes the room it's in warm after a while, probably significantly more than a Ryzen and 1050ti would.

What is it for then?

Every previous generation named models for the "11" part of a chip code like TU116, not the "16" part. A TU116 *60 card should be the GTX 1160.

Again with these rumors. What hurts old cards in new games is when they aren't built in a way that would allow newer features to work well.

>if your card is [a few years old to infinity] this new one is faster
no fucking shit

I had a 7970 and it mattered. 300W just to run games at medium-high 1080p.
Earlier on it was fine, because it was way better than Fermi and it would run games maxed at around 225W. But as games got more demanding and it had to run flat out for 60fps without even maxing games out, it became an issue.

But the Polaris and Vega cards have acceptable heat outputs.
Yes, Polaris perf/watt isn't great, but it's good enough. It's still better than equivalent Maxwell cards.
And Vega doesn't use much more power than Polaris, while giving a lot more performance. I can undervolt and underclock my Vega 56, which only cost around $285, down to 1660 Ti power levels, but I'd rather have the extra performance lmao

Perf/watt matters, but only within reason and especially so on higher output cards.
This, basically.

These are just dumb shills. An 80-watt peak difference is not even $1 per month for most Americans, and still under 2 euros in areas with higher energy prices.
But the catch they hate to admit is that places with more expensive energy are usually colder climates. PCs are very efficient space heaters, so every dollar you're spending on electricity for your PC is a dollar, more or less, saved in heating costs.
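Back-of-envelope, if anyone wants to check the math. The 3 hours of gaming a day and the electricity rates below are my assumptions, not anyone's actual bill:

# rough sanity check of the power-bill claim above, in Python
watts_delta = 80                              # peak draw difference between the cards
kwh_per_month = watts_delta * 3 * 30 / 1000   # 3 h/day for 30 days = 7.2 kWh
print(round(kwh_per_month * 0.13, 2))         # ~0.94 -> under $1/month at a ~$0.13/kWh US rate
print(round(kwh_per_month * 0.25, 2))         # ~1.8 -> about 2 euros/month at a pricier 0.25/kWh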

My Vega 56 is usually only using 14-125W at 2560x1600@60.

Yep, much closer. $200 for the 960, $330 for the 970.
The 1660 Ti's $280 sits closer to the 970.
No one can even say
>muh inflation
when there's been little inflation from 2015 to 2019.
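
Same back-of-envelope treatment; the ~7.5% cumulative US CPI for 2015-2019 is my ballpark assumption, look it up yourself:

# is $280 closer to the 960's or the 970's launch price?
msrp_960, msrp_970, msrp_1660ti = 200, 330, 280
print(msrp_1660ti - msrp_960)    # 80 -> $80 above the 960
print(msrp_970 - msrp_1660ti)    # 50 -> only $50 below the 970
print(round(msrp_960 * 1.075))   # 215 -> the 960's $200 in ~2019 dollars, still well under $280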