THANK YOU BASED NVIDIA

videocardz.com/newz/galax-geforce-gtx-1660-ti-packaging-leaked

THANK YOU BASED NVIDIA

Attached: GALAX-GeForce-GTX-1660-Ti.jpg (1242x783, 170K)

At least the black guy who stole the keyboard is already in cuffs.

what a retarded name

that's a keyboard?
it looked like RTX artifacting

I still don't understand this card. Is it a 2060 without RTX? Then why buy this and not a 1070 (Ti?) with 2GB more VRAM?

Because you're retarded

Turing is better at DX12 because it supports Conservative Rasterization Tier 3 and the CUDA cores are better for compute

Also has newer hardware video decode and encode blocks (NVDEC/NVENC), and GDDR6 support

It's better at DX12 because the shaders can switch tasks at a hardware level.
Aka Asynchronous compute.
Pascal was able to switch tasks via software, but it was a half assed solution.

No, are you fucking retarded?

Do you know how bad the latency is from GPU to CPU?

All async compute is done on the GPU itself in hardware, not software

Even then, Pascal still crushes Poodeon in DX12 performance, async meme or not, and Turing just annihilates the Poodeon VII even harder

Stop talking about things you don't understand

It's weird pricing; you might as well buy a 2060

Attached: cores.png (1222x529, 69K)

>6GB

the hell, so the 1660 is another 1060, the Ti is slightly better but still not 1070 performance
who's the target audience for these cards?

The point of this product is either to get rid of leftover 1060 chip stock, or to push people who want a new GPU toward the RTX meme

May be alright just till Navi.

videocardz.com/newz/palit-geforce-gtx-1660-ti-stormx-pictured

videocardz.com/newz/evga-geforce-gtx-1660-ti-xc-pictured

THANK YOU BASED NVIDIA

Probably those who bought RX 580 or 1060. So basically the majority of people.

>sixteen sixty tee eye
wow that's a mouthful

>upgrading barely 2 year old cards
>for at best 35% more performance
baka!

Obviously not those who already bought a similar card but the market segment that spends that much money on a GPU, dumbo.

And I've still got my RX 570, what a pleb

>tfw upgrading my rx 570 to 3080 for 100% performance

I don't get it. I'm looking to replace the card in my 4790K machine. A 1070 (non-Ti) is around $200, a 1070 Ti around $300, both used. I've also been looking at deals on the new RTX 2060, which is around $300-320 on sale. This card looks like it will be close to the same price as the 2060. If I go Nvidia I'll probably grab the 2060 even with the 6GB limitation. Used Nvidia is a crapshoot since the drivers go to garbage once the new models are out; you'll never see them make gains like AMD does. To be fair though, AMD's stuff is always a refresh of a refresh.

I'm too fucking old to keep up with vidya cards.
What should I replace my AMD 270x with?

Attached: kermit coffee.png (957x751, 28K)

>tfw upgrading my rx 570 to a card that doesn't exist with unknown performance
*

How much money you spending?

>I'm too fucking old to keep up with vidya cards.
There's literally two lines of cards that almost always follow the pattern of "bigger number = newer = more powerful", and then an easy, incremental series of steps between lowest and highest range cards. Unless you're 85 with dementia, you're not too old.

Attached: 1510077070510s.jpg (221x250, 5K)

When someone says that though it usually means they want to be spoonfed other opinions on what to do or buy.

Seems like I gotta up it but I'm thinking $300.
Not playing anything really demanding but wanna try for 60FPS @ 1080p.

amdjeets btfo

Wow, you sound like a retard who watches too much Linus cuck tips

At 1080p you can do fine with a cheapo 580 or 590. Used or new they're around one to two hundred dollars. I have 1440p monitors now, so I need at least something like an RTX 2060, which is around $325 on sale, perhaps cheaper at the moment. On the used market you can get 1070/Ti cards for around your budget as well.

>Wow, you sound like a retard who watches too much Linus cuck tips
Your lack of understanding stems from your source of information. Is that where you get your computer knowledge from? YouTube videos?

Attached: MBvcORO.jpg (961x535, 82K)

I don't really know about that. It looks like it's mainly to compete with the 590 and 580 new right now, and they're staggering the releases; when I went into my local Fry's to buy something, they were still swamped with 1050s and 1060s, so I think they're trying to take it slow. They may even delay the 1650 depending on how the leftover 10-series stock looks.

When are they going to launch a $200 card? Never again?

>where you get your computer knowledge from
Primary sources are: Jow Forums, /b/, /a/ and Jow Forums
And /v/, forgot about /v/

No wonder some no name low quality pajeet brand leaked it.

3440 x 1440 or 2560 x 1080

Attached: 1541517465342.png (320x320, 90K)

That's gay af.
Where's my 3D fairy?

what's the point of this card? you can already play at 1080p ultra with a $180 RX 580, and even some games at 1440p thanks to the 8GB of VRAM...

Why didn't they just call it the 2050 and 2050 Ti? That's essentially what it is.

No gaytracing

what happened to 11 12 13 14 15

lol no, those cards are getting too old and need to turn down settings.
This will be the new ~$225 1080p/60 king, until AMD can finally release something that isn't another rebrand, hopefully later next year.

It isn't though, this replaces the market segment of the 1060, not the 1050/Ti. Other cards will come in that range.

>15% faster than 1060, at 40% price increase

Is this the power of nGreedia?

>~$225
Its price is $280 ...........

>15%
Weren't you memelords saying 25% yesterday?
Still with zero proof of course.

Just buy it ok?

>Pascal was able to switch tasks via software, but it was a half assed solution.

ALL GPUs have hardware schedulers m8. All Nvidia did post-Fermi was move instruction scheduling/optimization within a thread/warp into the drivers. Since instruction execution can be predicted, they can simply optimize instruction scheduling to reduce/prevent divergence before it's actually done; fine work Nvidia, very good. The actual issuing of work to the hardware, i.e. issuing instruction batches to warps, switching context, etc., was always done by hardware. It would take too long to tell the CPU which warps are loaded and which are free, or whether there are raster tasks that need to be done, so it's handled immediately on the GPU.
The expanded ACE scheme that Sony came up with for the PS4 (and GCN gen 2) is why AMD is so good at async compute. Turing simply expanded the Nvidia equivalent of the ACEs, which we literally know because they told us they doubled up on scheduling hardware for Turing.
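To make it concrete, here's a rough CUDA sketch (toy code, nothing Nvidia ships): the host only queues two independent kernels on separate streams and waits once at the end. Whether they actually overlap is up to the GPU's own scheduler; the CPU is never asked per task, which is exactly why "doing async in software" makes no sense.

// Toy example: the host only *submits* two independent workloads on separate
// streams. Whether the kernels overlap is decided by the GPU's hardware
// scheduler; the CPU only waits at the final sync.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float k) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= k;
}

int main() {
    const int n = 1 << 20;
    float *a, *b;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMemset(a, 0, n * sizeof(float));
    cudaMemset(b, 0, n * sizeof(float));

    cudaStream_t s0, s1;
    cudaStreamCreate(&s0);
    cudaStreamCreate(&s1);

    // Two independent kernels, each on its own queue.
    scale<<<(n + 255) / 256, 256, 0, s0>>>(a, n, 2.0f);
    scale<<<(n + 255) / 256, 256, 0, s1>>>(b, n, 0.5f);

    cudaDeviceSynchronize();  // the only point where the host waits

    cudaStreamDestroy(s0);
    cudaStreamDestroy(s1);
    cudaFree(a);
    cudaFree(b);
    printf("done\n");
    return 0;
}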

I'd rather wait

Attached: 04300430543034.jpg (1080x1397, 154K)

>No VGA out
Ew.

>delayed till october

>Believing the fake lies of an AYYMD asslicker

Vega 56 has been $300 with a game bundle pretty regularly now. Sub-$250 used.

>590 performance for $300

BASED NVIDIA

Yeah, used Vega 56 cards are a steal, but so are 1070 Tis.

>590

Attached: obama-laugh.jpg (660x440, 267K)

Attached: 1532001808924.jpg (300x300, 10K)

>used
>buying mining cards

Attached: lelfacegolden.png (327x316, 208K)

Buy a used 1070, find some good deal. Literally the best solution

Crypto fags undervolt their cards; if there aren't any problems under full stress, it'll probably have a normal lifespan.

>1660 doesn’t even refer to the number of cores anymore

Attached: image.jpg (700x690, 67K)

If I was looking for the advice of some meme-spewing retard, I'd have asked for it.

>previous series 10
>now 16

why

16 is bigger than 10

This is fake as fuck, why the hell would Nvidia release new GPUs without their key differentiating feature?

Marketing. Makes people with 600, 700 and 900 series cards feel like their cards are older than they really are.
Nvidia can only increase sales if they get people to upgrade sooner even though they don't actually have a need to do so.

That looks like a light-skinned person you dope

>their key differentiating feature?
and what would that be

cause for most people $300 is the absolute maximum they would spend on a GPU

so nvidia is giving them the shittiest deal possible for that $300

basically maximum jewing

Attached: 53cd18fb019d3b0234f7873044821c16f6ebe696fcf430e19fcb70a64998b810.jpg (958x775, 661K)

amdrones are truly the appletards of pc hardware market

Turing can do INT and FP simultaneously and supports mesh shaders. Newer game engines are INT-heavy.
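If you want a feel for what "INT heavy" means, here's a toy CUDA kernel (purely illustrative, made-up names): the inner loop interleaves integer index math with float math, the kind of mix Turing's separate INT32 and FP32 pipes can issue at the same time, while Pascal runs both through the same CUDA cores.

// Toy kernel mixing integer work (index/address math) with floating-point
// work per iteration. Illustrative only; real engines do this inside
// shading and address-generation code.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void mixed(const float *in, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    unsigned int idx = i;
    float acc = 0.0f;
    for (int k = 0; k < 64; ++k) {
        idx = (idx * 1103515245u + 12345u) & (n - 1);  // integer work: LCG index math
        acc += in[idx] * 0.5f + 1.0f;                  // floating-point work
    }
    out[i] = acc;
}

int main() {
    const int n = 1 << 20;  // power of two so the & mask stays in range
    float *in, *out;
    cudaMalloc(&in, n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));
    cudaMemset(in, 0, n * sizeof(float));

    mixed<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();

    cudaFree(in);
    cudaFree(out);
    printf("done\n");
    return 0;
}

Compile with nvcc; the numbers don't matter, just the instruction mix.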

THE NEED FOR ADDITIONAL POWER
THERE IS

Attached: PALIT-GeForce-GTX-1660-Ti-Specs-740x677.jpg (740x677, 38K)

literally an overclocked 1060 and nvidiots will actually pay for it

>OC trash
This is not the amd GCN thread.

1660, yes. 1660 Ti? Fuck no.

>tfw still using my 980
At what point should I start caring?

not for at least another 2 years

I'm with the 970; it chugs along in newer heavy titles like Shadow of the Tomb Raider, but I don't really care about AAA games, so I think this will last me at least one more generation.

>NVidia anything
you never did, apparently.

Seems like this is what would normally be the 2050 but they wanted to keep 2xxx specific to premium RTX capable cards.

Just replaced mine with a Vega 56, about 40-50% more performance

If I was still 1080p I would have stayed with the 980, but at 1440p it wasn't cutting it

>buy used mining card
>suddenly VRAM dies

RTX is not a meme. All the good cards will be RTX forever. You scrubs are going to have to stay on bottom-feeding cards like the 1660 and whatever succeeds it later if you don't want RTX.

Just like all those cards that had PhysX initially, amiright?

Wait for Navi.

let's talk about the 590 my son

Navi 12 is a workstation card. They meant Navi 14.
>t. knower