2019

>2019
>3GB VRAM

OH NO NO NO NO NVIDIOTS WILL FALL FOR THIS

Attached: PNUCQli.png (725x255, 21K)

They already had to gimp the 2060 so that the 2070 made any sense at all and now this lmao

>3GB
what do I need 2 for?

Several new games are already hitting the 6GB limit on the regular 1060, so 3GB is way too little, especially for a Ti card.
Then again, fanbois will happily eat whatever shit the green jew squeezes out.

>no competition
This is what the people who cheered on AMD's death actually wanted

>but Nvidia would N E V E R take advantage of consumers. they aren't jews.

WTF? 3GB of VRAM in 2011+8? JFC, this company. A fucking card with Ti in the model name, no less. What timeline are we in, lads?

Thanks for only buying Nvidia for the past 10 years retards

1 is enough already.

I like how no one here is even attempting to consider this card as a cheap and lightweight media workhorse; everyone is instead judging it almost entirely by its capabilities as a gaymer card.

The absolute state of nu Jow Forums

>2020
>Geforce 30 rolls around
>GTX 2660 Ti will have maybe 6GB VRAM at most
>non-RTXfags will forever have to stay on low end shit

Attached: 1517734730923.jpg (400x386, 25K)

It's going to be sold for $150?

Perfect card.

It's the model name you stupid faggot. 1660Ti. It's not a GT730 or the like. How fkin dumb are you?

Attached: 1530552942090.gif (800x800, 3.35M)

Because buying a new gen video card from the (((Nvidia))) group at a premium price for the purpose of "media workhorse" is insane.

You could get a 980, a 660, or any AMD card released since 2010 that'd be just as capable as a "media workhorse" for 25% of the price this thing is going to list for.

People buying new, new video cards do so for gaming and related tasks.

Which means releasing a new gen card that's clearly gimped compared to video cards released almost a decade ago for "current prices" is just...wow.

512MB should be plenty.

Ti's aren't exclusively gaymer cards, you mong. The lower model number clearly indicates that it's not meant to be a high-end product; the Ti simply denotes a more powerful variant of the original 1660. I'm all for competition, but don't be a tittysmoking retard and harp on a company for providing consumers a range of hardware performance and price points.

The only way this is a viable card is if you're building a small "smart TV" box and it's single-slot and/or passively cooled and draws less than 100W.

It's Nvidia, you know none of that's going to happen. It probably runs at 70°C idle on the Founders Edition, weighs 7lb, and takes 3 slots.

>a nvidiot has defended this

>It's the model name you stupid faggot.
That sentence is missing the important vocative comma.
>How fkin dumb are you?
It's spelled "fucking".

Some people don't have a desktop PC at all and will want new components.

>you don't need more than 3gb

Attached: 1537862629975.png (900x600, 492K)

The worst one.

I'd be fine with that price if the performance increase over the 1060 3GB is ~40% as claimed and it really lands at $150. That's how much I paid for a 1060 3GB two years ago.

No, they're not. Allocation doesn't mean usage. And most modern engines use asset streaming, so preloading everything into memory matters less past a certain threshold.
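If anyone wants to check the allocation-vs-usage point themselves, here's a minimal Python sketch using pynvml (the nvidia-ml-py package; assumes an Nvidia card at device index 0). Keep in mind NVML reports what each process has allocated, which is only an upper bound on what it actively touches:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU (assumed)

# Device-wide view: total VRAM vs what is currently allocated.
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total {mem.total / 2**20:.0f} MiB, allocated {mem.used / 2**20:.0f} MiB")

# Per-process view: a game may reserve far more than it actively uses.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    if proc.usedGpuMemory is not None:  # None on some driver versions
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**20:.0f} MiB allocated")

pynvml.nvmlShutdown()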

there are games RIGHT NOW that won't run on 3GB of VRAM
i've never been happier that i bought a cheap 1070ti

how much cum do you drink? the 1060 is their midrange card
the cheap-ass tier was the 1050 Ti
>I like how no one here is even attempting to consider this card as a cheap and lightweight media workhorse; everyone is instead judging it almost entirely by its capabilities as a gaymer card.
>
>The absolute state of nu Jow Forums
>doesn't know why people criticize it
keep defending your shit tier opinion
iq test when?

Realistically, it's one of the middle timelines, because at least we have mainstream 6 and 8 core CPUs thanks to AMD.
The worst timeline would include 4 core Incel stutterfires.

I hope this is trolling.

at least it will see all 3GB. Radeon VII doesn't even support UEFI.

t. Ryzen owner

There are thousands of good games that run on 3 GB VRAM, far more than you ever could play in a lifetime.
Gaming hardware enthusiasts are like audiophiles who listen to their equipment instead of the music.

>People buying new, new video cards do so for gaming and related tasks.
This is not true. Many people who work with media upgrade their hardware to the latest generation of cards, and many people who do not game are still going to be looking for an affordably priced new card, not something from the previous generation. Few of Nvidia's customers will go out and buy older hardware; they look at the cards available and think "oh, this is better but not as expensive as the 2080, 2070, etc." because they look at the model number, the amount of VRAM, and nothing else. You are projecting your level of technical competency and purchasing prowess onto the overall consumer base when you are undoubtedly part of a small minority within it.

Anyone buying a new video card for "gaming and related" likely isn't even going to consider a 1660 over higher-end models.

Those older faster cards will start to blow up soon. You can't rely on them anymore and will have to swallow the bitter pill of RTX.

Attached: 1506549126694.png (251x203, 20K)

Will we get good memes like the GTX 970 vram fiasco?

Attached: 1524525767654.jpg (960x960, 73K)

>the 1060 is their midrange card
The 1060 is from the last generation, retard; the 1660 will be the new mid-range card to buy.

What are those new games anyways? It must be unoptimized shit like Ass Creed and Glitch Raider. Been playing RE2, MHW, and FFXV on high/max settings absolutely fine on my 1060 3gb.

Exactly, dumbass, which is why 3GB of VRAM is retarded and being criticized. A Ti version, too. Now you're catching on, Forrest!

>shills grasping at straws
LITERALLY INDEFENSIBLE 3GB GARBAGE

Attached: 1519382003263.gif (300x169, 1.03M)

Why would I need 256MB?

AHAHAHAHAHAAHAHAHAHAHAHAHAHAHA

newegg.com/Product/Product.aspx?Item=N82E16814125966

Attached: 1532388377355.png (3508x4961, 2.1M)

The only thing Ti indicates is that it's a more powerful variant of the card, clocked higher and with more cores unlocked. It has nothing to do with gaming; they could use an entirely new model number if they wanted, but they prefer this labeling because it's a marketing gimmick that actually makes people buy more expensive card options.

Imagine playing modern AAA western titles like a retard.

Yeah man, fucking love that nintendo-grade graphical quality.

This has nothing to do with optimization. It's high-resolution textures not being able to fit inside the 6GB. You can only compress textures so much before they start looking like shit.

Who buys a new card to play old games? In newer games this GPU will be bottlenecked by the VRAM alone, it's like having a 4K display with HDMI 1.2 ports

What the fuck would you use a 1660 card for besides gaming? It's not going to be any good at workstation tasks.

Technically, 128MB is enough for a 4k desktop.
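The arithmetic behind that claim, for anyone who doubts it (a quick sketch; 32-bit color and triple buffering are assumptions, compositor overhead is ignored):

# Back-of-the-envelope check of "128MB is enough for a 4K desktop".
width, height, bytes_per_pixel = 3840, 2160, 4  # 4K, 32-bit RGBA

framebuffer = width * height * bytes_per_pixel
triple_buffered = 3 * framebuffer  # front buffer + two back buffers

print(f"one 4K framebuffer: {framebuffer / 2**20:.1f} MiB")      # ~31.6 MiB
print(f"triple buffered:    {triple_buffered / 2**20:.1f} MiB")  # ~94.9 MiB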

3GB of VRAM on a mid-range card is perfectly fine. Most people are not streaming in 4K textures (those who do will not buy a mid-range card anyway), and most games will not use more than 3GB of VRAM at the recommended settings most developers set by default for mid-range cards.

>imma good goy gais

Attached: 1524887009704.jpg (1280x720, 116K)

Oy vey....

Ti should never have been an indicator of power. I mean, yeah, power within the mid class, but they should call their top card Ultra like in the old days.

the naming scheme is really confusing me

now tell me how much does 16gb help the housefire 7

sounds like it could be a good upgrade from my 660 if they don’t fuck the pricing

High-res textures do have to do with optimization. You can have amazing-looking textures without using excessive amounts of VRAM. Games from a few years ago used to manage this, but games today just say "fuck it" and bloat this shit for a minor visual difference.

Why even do this? There's already a 3GB 1060

Yes. Art style > Throwing bloated textures at everything

It will work perfectly fine as an affordable rendering workhorse and outclass mid-range cards of the previous generation for this particular use case. The majority of people working with media, rendering Photoshop designs and YouTube videos, do not have expensive Quadro cards in their workstations.

Pretty sure it only has 16GB because of how you stack HBM2 memory.

>tfw developers gave up on making efficient game engines because of these ridiculous amounts of VRAM

the last one was probably REDengine in Witcher 3, I played on ultra with 2GB of VRAM

I bought a 1060 3GB for $150 so I could replace my 460 and play old games and new games. Tales of Vesperia looks absolutely amazing at 4k w/ 4x MSAA and it only caps out at 2.6GB VRAM usage.

>rendering workhorse
No
>outclass mid-range cards of the previous generation
No
>do not have expensive quadro cards in their workstations
True, I bought a 1080 Ti for $500 last year after the ReTardX press conference. My friends use similar options. Or you can pay $250+ for like one-third the power of a 1080 Ti.

what do you need 16M colors for? you can use 65K with 64MB and your eye won't notice the difference

The point is obviously to provide an upgrade over the previous generation for mid-range users without a hefty price increase for a couple of extra GB they likely will not use.

Whose ass did they pull the model number 16X0 out of?
>7X0
>8X0 (mobile only)
>9X0
>10X0
>20X0
>16X0 ????

Hahaha, no.
There's a few things you can't cheat on, and textures are one of them. You increase the resolution, the size has to increase.
>Games from a few years ago used to manage this
...by using lower-resolution textures.
>games today just say fuck it bloat this shit for a minor visual difference
Should we stop all progress in visual fidelity just because Nvidia wants to sell garbage to idiots?
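For anyone who wants the actual numbers, here's a rough sketch of the texture math. The bytes-per-pixel rates are the standard ones for raw RGBA8 and the BC1/BC7 block-compressed formats; the mip-chain factor of ~4/3 is approximate:

# Rough texture memory math behind "you can only compress so much".
BYTES_PER_PIXEL = {"RGBA8": 4.0, "BC7": 1.0, "BC1": 0.5}
MIP_FACTOR = 4 / 3  # a full mip pyramid adds roughly a third

for side in (1024, 2048, 4096):
    sizes = {
        fmt: side * side * bpp * MIP_FACTOR / 2**20
        for fmt, bpp in BYTES_PER_PIXEL.items()
    }
    print(f"{side}x{side}: " + ", ".join(f"{fmt} = {mib:.1f} MiB" for fmt, mib in sizes.items()))

# Doubling the side length quadruples the size in every format --
# compression shifts the curve down but can't change its shape.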

The human eye can't see beyond 8bits

see
Witcher 3 on ultra looks and runs better than the latest Ass Creed on medium.

I bet you watch blockbuster movies unironically.

Peak graphics was when you needed a 2d and a 3d card though

Attached: 1519407481157.jpg (3360x1890, 1.16M)

>t. still plays left 4 dead with shitty anime character mods

Yes it will; there are still people running 600 and 700 series cards and using them for rendering tasks, and they do the job just fine.

Not everyone can afford to throw $500 at a high-end model. The 1060 I am currently using was $180 on sale, and there is realistically only a 20%-30% performance difference at the 1080p resolution I'm playing at. The 1060 is still getting me framerates above 60 on the latest Battlefield without expensive anti-aliasing options set to high. That's perfectly fine by my standards, and I do not expect I will need to upgrade for another generation before I start dropping below 60FPS.

Only plebs play FPS games.

Most game engines are capable of streaming in textures at variable resolutions; releasing a 1660 will have zero impact on what texture quality is available to you.
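A toy sketch of what budget-driven streaming looks like (the helper and all numbers are hypothetical; real engines decide per-frame using distance and visibility heuristics, not a simple loop):

def fit_to_budget(texture_sizes_mib, budget_mib):
    """Drop the top mip of the largest texture (quartering its size)
    until total residency fits the VRAM budget. budget_mib must be > 0."""
    sizes = list(texture_sizes_mib)
    while sum(sizes) > budget_mib:
        i = max(range(len(sizes)), key=lambda k: sizes[k])
        sizes[i] /= 4  # dropping one mip level quarters the footprint
    return sizes

textures = [85.3, 85.3, 21.3, 21.3, 10.7]  # MiB, an assumed asset set
print(fit_to_budget(textures, budget_mib=100))  # two textures lose a mip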

Looks better than the base game

They should release this card for $200 just to kill AyyMD once and for all. But we know they won't do that.

you were so close yet so far.

Attached: vesperia_part2.jpg (3840x2160, 1.12M)

No

The models and textures are inconsistent with the map textures and models, it looks like shit and you know it.

>1080
>11G
>1660
>3G
what
what

Perfect for us old schoolers that prefer 1280x1024 resolution

I can see you have nothing left to say in rebuttal; I'll take this as a concession of my point. Have a nice day.

Rainbow 6 uses 6.5GB of VRAM on ultra lol.

Tfw 11 gb vram

You're being told by someone who actually uses a 1080 Ti for rendering that a 3GB 1660 Ti for $250 (optimistic) is not good for such a task. You're wrong and you don't want to be told the truth. So, in short, no.

It has always been this way: 980 > 1050, 680 > 750, and so on. The 2080 is much faster at multi-rendering than the 1080 (upwards of 50%); the 1080 does not outperform the 2080 when you account for the overall benchmarks.

RE2 says it uses 8GB for textures in the settings, but when you inspect Task Manager it is nowhere close to that. So I assume games only allocate as much VRAM as you have, just like Windows 10 will cache as much excess RAM as you have.

I'd rather wait than buy nvidia pozzed garbage

Attached: 04300430543034.jpg (1080x1397, 154K)

my 980 died, i really don't want to have to spend this amount of money on the rtx meme

>tfw have 1060
now I really don't know what to buy this year, will there be a 1770? I don't think I can "wait for Navi" for a year

Attached: 1536148594958.jpg (290x281, 15K)

youtube.com/watch?v=IghcowGhRBc
they have invented a time machine

It is perfectly fine for such a task; both cards will render large Illustrator and Photoshop designs at 4K with only a minor difference in time-to-completion. The only major time difference you'll see with a 1080 is rendering 4K video, and even then the video would need to be over an hour long for any noticeable difference. Don't project your standards and requirements onto everyone else; the vast majority of people rendering YouTube videos and the like aren't doing batch processing, they don't need the power of a 1080, and shelling out the money for such an expensive card isn't going to have an impact on their revenue.

If you are using a 1080 for batch processing you're retarded anyways and should be using a Quadro.

No.

If you're not running a 4K G-Sync/Freesync monitor you don't really need an expensive card. Unless you're playing Star Citizen, in which case enjoy your poorly optimized alpha.

I still don't get why the fuck they chose the name 1660 instead of 1160

>I can see you have nothing left to say in rebuttal; I'll take this as a concession of my point. Have a nice day.

The RX 580 will outperform it.

Both of you are retarded, especially you for gimping yourself with a 1080 instead of using a Quadro for rendering.

No, the 1660 is a cut-down version of the 2060. I am considering a 2060 if there's something worthwhile. Most probably not, and the job sucks up my free time anyway.

Again, you're being told by someone who actually does rendering that a 1660 Ti for $250 (which is an optimistic estimate) is a bad deal for such a task, especially when used 1080 Ti cards were snapped up by people like me for $500. Everything you said in defense of the 1660 Ti is wrong. Here's your (No).

I already made that point here:
If you need a card to handle basic rendering tasks, it's idiotic to shell out that much money for what is going to be a negligible difference in rendering completion times. The only reason to buy a 1080 is for 4k gayming.

You don't actually do any rendering, do you? A $500 1080 Ti was the hot fucking option for us.

No. Bad user. No. Noooo.