Hey /g/, I bought pic related in late 2016. How many years do you think it'll last me before I have to upgrade?

Attached: 20160616163536_big.png (1000x1000, 189K)

anus

When it's not enough for your needs anymore.

2020

Given I run 1080p, how likely is that?

>1080p

A long-ass time. Devs aren't pushing any boundaries, so it's safe to say you'll run everything that comes out at 60fps for the next 4-6 years.

People still game comfy on 770s once they tweak the game settings.

you only need to upgrade when the performance hit triggers your autism

No new mid-level GPUs like the 1070 in the past 2 years, so you're still good. The new Nvidia Unreal builds are going to have real-time raytracing with that stupid noise-filter thing they keep bragging about, and they'll probably release some more Voltas to cover that, but only one game coming out actually uses it, and that's the new Metro. If you've ever played a Metro game before, you'll know they suck. Boring, slow-paced, easy point-and-click shooters that most of us could have beaten blindfolded by the time we were 9.

till the end of summer

A long time; next-gen consoles will probably be using something about as powerful.

this.

My 1950X is paired with an R9 290X, which I believe is about 5 years old, and it does 1080p just fine, over 100fps in most AAA titles. If it doesn't, I can CrossFire another one in.

Depends on what you do with it.
They do, but not comfy unless they only play LoL and CSGO. A 770 is worse than a 1050 now: bad drivers and not enough VRAM.

7970s and 290s aged much better.

>be op
>post thing on /v/
>3 replies
>make thread on /g/ shitting it up even further
good thread

>playing on easy with your mouse in the dominant hand

Why do people lie about this?
They traded blows when released and still trade blows now in 2018.

You're accusing Nvidia of anti-consumer practices with no evidence to back it up. Do some research first, cunt.

Not an Nvidia shill, just tired of seeing this falsified bullshit constantly posted here.

youtu.be/BfSi-Z8r12M

Attached: S80617-052725.jpg (2560x1440, 267K)

I bought that exact same card for $100 a few months back.

>posts one game
Look up benchmarks for games like BF1. It's not even about raw performance; 2 GB of VRAM just isn't enough in 2018.
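
As a rough sketch of the VRAM point above: the texture counts, sizes and compression ratios below are made-up assumptions, not measurements from BF1 or any real game, but they show how quickly a 2 GB budget gets eaten.

# Back-of-envelope VRAM estimate (Python). All workload numbers are
# illustrative assumptions, not measured from any real title.

def mib(width, height, bytes_per_pixel):
    """Uncompressed size of one 2D surface in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# Assumed deferred-style setup at 1080p: four colour targets plus depth,
# 4 bytes per pixel each.
render_targets = 5 * mib(1920, 1080, 4)

# Assumed streaming pool: 400 2K textures, block-compressed to roughly
# 1 byte per pixel, with ~1.33x overhead for mip chains.
textures = 400 * mib(2048, 2048, 1) * 1.33

total = render_targets + textures
print(f"render targets: {render_targets:6.0f} MiB")
print(f"textures:       {textures:6.0f} MiB")
print(f"total:          {total:6.0f} MiB vs. 2048 MiB on a 2 GB card")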

Why a 1070 and not a 1070 ti?

>I bought pic related in late 2016
>1070ti

When Intel releases their new GPU in 2020 it will be way ahead of Nvidia's 11-series, and then Nvidia will release a huge jump in power as a response, and so will Intel, so GPUs in 2020 will be way ahead, maybe 150-200% more powerful than a 1080 Ti. At that point everyone will have to upgrade, because games in 2021 literally won't support GPUs from 2018-2019.

>Given I run 1080p, how likely is that?
You are set for the next 3-to-3.5 years.
1080p is the new 720p.
1440p is the new 1080p.
Developers are still choosing to use DX11 over DX12, so that card may carry you for some time to come.

>At that point everyone will have to upgrade, because games in 2021 literally won't support GPUs from 2018-2019.

Attached: 58e50dd6d98a57e686cee40c1118813b4750d7020f8ef480e6b8629a4b0347c4.png (567x893, 374K)

Watch the linked video you fucking idiot

Hahaha hahahahahahahaha

Attached: S80617-081040.jpg (1440x2560, 481K)

I just got a 1050 Ti and expect to use it for the next 10-12 years.

I've got a 680 and still have no trouble running games at 60fps/1080p.

Should I sell my 1080 ti now while the prices are high and get the 1180 ti when it comes out?

It's happened before: GPUs from 2005 didn't run Crysis, and in 2009 lots of GPUs from 2007 didn't run Black Ops when it forced SM 3.0 support.

It's not uncommon for a 2-3 year old GPU to not run new games.

I have a 1050 Ti and plan to keep using it until 2025 at the very least. The key is to only play Civilization and shooters over two years old.

The 2 GB versions suffer from microstuttering. Even if you're getting 60 FPS there will be frame-time variance.
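
A quick sketch of why an average of 60 FPS can still feel like stutter: the per-frame times below are made up, but the 1% low is the number that exposes the frame-time variance being described.

# Frame-time variance sketch (Python). The sample frame-time logs are
# invented; real ones would come from whatever capture tool you use.
from statistics import mean

def frame_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 / mean(frame_times_ms)
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    one_percent_low = 1000.0 / mean(worst)   # FPS of the slowest 1% of frames
    return avg_fps, one_percent_low

smooth  = [16.7] * 300                   # evenly paced ~60 FPS
stutter = [12.0] * 290 + [60.0] * 10     # higher average, occasional spikes

for name, log in (("smooth", smooth), ("stutter", stutter)):
    avg, low = frame_stats(log)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")

The stuttering log actually averages higher FPS, but its 1% low collapses, which is exactly the complaint about frame-time variance at "60 FPS".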

Keep trying anon, you're not convincing anyone.
I'm saying that as a GTX 670 owner, FYI. It manages to run BF1 at a smooth 60fps when I turn it down from ultra to high and cap the fps at 60; it never dips below that.

Attached: goal-posts-moving.jpg (640x426, 151K)

Wow, so proof that there is no noticeable difference between three whole generations of nvidia graphics cards? Thank you for proving my point.

Wrong. Much like 720p LCD monitors, 1440p was just a stopgap. 4K is the next 1080p because it is 2x the resolution in each direction. And with this coming generation of cards we will be able to push most games at high settings at 30-60Hz with a mid-range card; the high-end 1180 Ti and the AMD version (lmao) will be able to go higher and support the 4K 144Hz monitors releasing in the next 6 months.
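
For what it's worth, the pixel math behind that claim, assuming GPU load scales roughly with pixel count (a simplification; geometry and CPU costs don't scale the same way):

# Resolution pixel-count comparison (Python).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MPix, {pixels / base:.2f}x the pixels of 1080p")

1440p is about 1.78x the pixels of 1080p, while 4K is exactly 4x, which is why it gets treated as the "real" step up.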

But /g/ said the 7970 was the best buy because the 680 got gimped with driver updates?
What the fuck, /g/, I can't believe I trusted anyone here...

Wow, they couldn't even manage a 10 percent improvement. Literally worse than muh 5% Intel.

720p/1366x768 monitors/TVs were not a stopgap, they're still making them. Hell most laptops are still 1366x768.

I know, it's amazing how far behind AMD is. They still can barely compete with a 1070 with their latest and greatest Vega, what a joke, and the 1080 and ti just smoke anything from AMD.

>laptops
Laptops are universally shit because Intel integrated graphics are shit and companies didn't want customers to notice the lag if they put 1080p monitors in them.

>top end card of 2012 vs. budget card of 2017
Also the 1050Ti has drastically better power consumption.

Wrong. Vega 64 is equal to the 1080 and the lesser Vega is equal to the 1070. Why do you think Nvidia panic-released the 1070 Ti? Even after all of that shady business-practice shit, AMD still beats Intel on cost-to-performance ratio across the board.

HAHAHAHAHAHAHA, funny how you keep making these statements without the ability to back them up. Vega 64 needs liquid cooling and still struggles to keep up with a reference 1070 Ti. It cannot beat the 1080 at all.

Attached: Zotac-GTX-1070-Ti-Mini-Unigine-2-Superposition-1440p-Benchmark.png (625x529, 21K)

Jesus Christ these AMD fanboys are getting desperate.

It's only the ones dumb enough to be fanatics who lie about that. Look at how defensive they've become upon being proven wrong. He's so far in denial he couldn't even watch your YouTube link before replying. Pathetic.

>1440p instead of 4K
>Unigine 2 Superposition instead of a real game or real-world use
Keep trolling, Nvidia shill.

Attached: maxresdefault.jpg (1277x717, 281K)

It's not Nvidia gimping their cards, it's AMD fixing their horrible drivers.

youtube.com/watch?v=wFKBN3MGUGI

If most people buy green, and red basically doesn't make money from PC gaming GPUs in comparison, is it any surprise red can't keep up? Who really expects AMD to compete at the highest end without a multi-GPU miracle?

And which texture setting?

I had the 780 Ti myself and ran 4K for 2 years; it was a horrible time, especially in The Witcher 3. I also tried 1080p and 1440p, but you got around 30-40% less FPS than a 1060 3GB. I finally sidegraded for free in 2016 until the 1080 Ti finally came out. It felt like breaking out of prison.
This. I always tell people who already HAVE old parts to keep them unless they are unhappy with the performance. But I would not upgrade to old, outdated parts. There are still people buying 780 Tis and FX processors.
I wish every game was like this, because then I would use Linux.

>how many years do you think it'll last me before I have to upgrade?
This is a meaningless question. What resolution? What target framerate? Do you want very high/ultra settings? Are you overclocking or leaving it at stock?

Futureproofing GPUs is a huge meme in general. It's pretty much impossible to tell where games are going to go unless you're actually a developer. Also, there will always be some games that run like shit maxed out because of one or two settings.

Nvidia BTFO by competent developers? Just goes to show how much money talks when you can pay to gimp your competitor by providing "specialist engineers" to optimize games, or Nvidia dynamic lighting (TM) and HairWorks (TM).

High settings, one down from ultra.
I don't notice any stutters, and when I'm monitoring my fps it very rarely dips below 60; even then I don't think I've seen it go below 54-55, and only when it's a clusterfuck on my screen with explosions and shit all happening at once.
DX11 btw, on Windows 7.

>windows 7.
who the fuck runs this old ass shit

Too lazy to upgrade and all my programs still run fine.
Problem?

Attached: tumblr_ofvmarKQ4G1rl2k7fo1_500.jpg (413x422, 70K)

>mfw I still use a 560 ti

Attached: 20180611_161619.png (600x580, 572K)

760 here and still playing new games

People who think the 9/10 series will last well into the 2020s are delusional.
Cards from 2006 would literally only run Crysis at 20fps when it came out in 2007, and cards from 2007 would literally not boot games in 2009 that required shader model 3.0.

There's no reason there couldn't be a huge power jump soon, or that SM7.0 comes out with the new Xbox, doesn't support 9/10 or even 11-series cards, and games released in 2020 for both Xbox and Windows require it.

People who say their cards will last them till 2025 are basing that on the fact that cards from 2011 still run all games today, like a 6970 still running BF1 at 60+ fps. But this era has been shit technology-wise because Microsoft has purposefully not developed new shit since that time.

Microsoft patched SM3.0 support into the Xbox 360 and fucked over PC; there's no reason they can't do that with the new Xbox and literally make cards released in 2019 not boot games in 2020.

Sorry guys, shit happens. Microsoft doesn't fucking care about PC gaming, the chance they put SM7.0 in the next Xbox is about 100%, and the 11-series doesn't even have support for that.

>he doesn't know about plateauing processor performance

Doesn't matter, SM7.0 will still make cards unusable. If the next Xbox uses SM7.0, or the PS5 uses some new advanced OpenGL or some other proprietary thing, cards released next year literally might not support games made on them.

The exact thing happened in 2009: SM3.0 games started coming out, like Black Ops, and you needed an SM3.0 card to run them or they literally didn't boot, and SM3.0 cards had only been on the market for a year and a half.

I actually downgraded to a card with less performance but SM3.0 support just for Black Ops.

The thing that will kill your card's usability is its feature-set support: a missing feature set will make it literally unusable and unable to boot games long before its performance drops.
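
A toy sketch of the distinction being made here: missing feature support is a hard gate, while raw performance just degrades. The shader-model numbers, FPS figures and "SM7.0" itself are the thread's speculation or invented for illustration, not real specs or APIs.

# Hard feature gate vs. gradual performance loss (Python, toy model).
# All numbers below are invented for illustration; "SM 7.0" is the
# thread's speculation, not an existing shader model.

def can_boot(card_max_sm, required_sm):
    # Feature support is pass/fail: no fallback path, no settings to lower.
    return card_max_sm >= required_sm

def estimated_fps(card_relative_perf, game_relative_cost):
    # Performance, by contrast, degrades gradually and can be tuned around.
    return 60 * card_relative_perf / game_relative_cost

card = {"max_sm": 6.6, "perf": 1.0}              # assumed 2018-era high-end card
future_game = {"required_sm": 7.0, "cost": 1.8}  # hypothetical 2020 title

if not can_boot(card["max_sm"], future_game["required_sm"]):
    print("won't even boot, regardless of settings")
else:
    print(f"runs at roughly {estimated_fps(card['perf'], future_game['cost']):.0f} FPS")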

I still use an AGP Riva TNT2

If you buy a top-end 11-series in 2019, you might literally be forced to buy a cheap, slower card in 2020 that supports SM7.0 just so you can boot the game.

Oh my bad

Games usually aren't limited though.
BF4 ran alright on my 8800 Ultra because even though it was a DX11/Mantle game it still supported DX10. Same with GTA V. And I'm pretty sure every DX12 game so far also supports DX11.

You'll be at a disadvantage, but it won't be downright crippling.

I've had a 770 since 2014 and it can still run modern games at 1080p, 60fps, medium-high settings.

I bought a GTX Titan (not Xp, not Black, just Titan) 6GB back in 2012. Still haven't found anything it can't handle.

Will be obsolete in less than 3 months
>novideo GPU support

I'm still using my meme card (970) from 4 years ago.

>5 years on and we're still on the 1080ti ti ti

It really depends on how soon tensor cores and ray tracing become integral to games utilizing modern graphics API's.

Technically just a lower-clocked Black, iirc.

I'm not talking about DX support, I'm talking about shader model support. An SM7.0 game won't be able to have SM6.0 support. Go research what happened in 2009: it was the last time a big upgrade forced people to upgrade cards that were only a year old, and we are due another one. That one was triggered by the Xbox 360 update; this one will be triggered by the new Xbox, I think.