This combo will never be obsolete for gaming. Prove me wrong

Attached: never obsolete.png (925x395, 327K)

>>/v/

Wait 5 years and the GPU will be. Back in my day the GTX 480 was top-of-the-line shit, and now 9 years later it's literally obsolete and every one of them has caused a fire. Unless a revolutionary advance is made in computer engineering, processors will keep advancing slowly and steadily as they have for the last 10 years, just with more cores available for parallel computing. However, parallel programming tends to be a PITA, so unless there are major advances in software design and algorithms, we're still probably not going to exploit the extra cores as well as we could, or as often.
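The "extra cores won't be fully exploited" point is basically Amdahl's law. A minimal sketch (not from the thread, just for illustration) of why a workload that's only partly parallelizable stops scaling no matter how many cores you throw at it:

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# work can run in parallel. The serial fraction (1 - p) caps the gain.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# A 75%-parallel game workload on 16 cores tops out around 3.4x,
# nowhere near the 16x the core count suggests.
print(round(amdahl_speedup(0.75, 16), 2))  # -> 3.37
```

This is why "more cores" alone doesn't automatically mean faster games: the engine has to be rewritten to shrink the serial fraction, which is exactly the PITA the post describes.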

>pls make me feel good about my purchase
Your post is proof enough.

Wrong.

Why do you need to be proven wrong? Top-of-the-line GPUs last about 6 good years; this has always been the case.

It will last a while, sure, but not forever. Hell, look at its abysmal real-time ray tracing performance. All it would take is RT becoming a standard in the gaming industry and you'd be forced to upgrade. And this isn't to talk up ray tracing, it's just an example: if any great strides are made in game engines/rendering techniques, expect to upgrade. As for the i7, lol no. Either the chip will die of heat stroke, or the motherboard you're using will have its VRMs blow up from trying to maintain higher clocks 24/7.

I say this as a guy with a top-tier 2080 Ti and R7-1700X. Hoping 7nm Ryzen will be my last upgrade for at least 5 years.

If progress continues like it has, the GPU will be outdated in like 6 years, but the CPU might still be enough, since even now you can get away with a 1st/2nd gen Core i.

A $120 chip will destroy that CPU by EOY.

yeah right, just wait bro!

4K

Do you remember the 7700k?

I don't even feel bad for those guys, but it must have sucked given how much better the 8700K turned out to be. Not even counting Zen, Intel itself will have CPUs that utterly annihilate the 9900K this year or next. Stacking and chiplets will give an enormous performance increase; we're not even close to the limit of what can be done.

> 7700K
still a decent chip for gaming

Also, 6-cores is on Ryzen 5 levels, and the 8700K gets to 5GHz with ease.

Linuxfag here, I haven't booted the M$ partition in the last 3 years. Gaymen on Linux really does work in $CURRENT_YEAR. I won't be delusional, though, since I usually don't play new games at release. For example, I just recently played Hollow Knight, even though its DLC was finished last year.

You gotta go back

Ironic, since you're stuck on 14nm for the last 4 years.

>stacking
A meme that will make the CPUs ~30°C hotter.

Intel's node is still superior since zen2 isn't out yet.

Imagine actually believing this.

Are you telling me the 12nm node that AMD is currently using is superior to Intel's 14nm?

*1440p and VR blocks your path*

Intel's 14nm process is superior to AMD's current lineup, correct, but they're still just ovenized Skylakes.

>gtx 1080ti
>blower model
NOPE NOPE NOPE

>not gaming on 8k res at 240fps

Attached: 300px-U_Wot_M8_Original.jpg (300x300, 13K)

>Kingdom Come Deliverance
>GTX 1080ti on ultra settings at 4k
>23 FPS
top fucking keke

amd is for poors

Seething

There really is no such thing as never obsolete. There have, however, been longer and longer gaps before you really "have" to buy something new. I had an i5 2500K that lasted me almost 8 years (RIP; either the chip or the mobo died, I really don't know which), longer than any other computer I've had so far. I see the gap getting longer; whatever my next build is will last me maybe 10+ years if I'm lucky.

>What's Moore's law?

what cpu is that?

i7 6700K, GTX 1070, 16GB 3200MHz DDR4

8700K

Trust me. The AyyMD® JustWait™ Zen2 3000 series will blow your tits clean off.

bump

> Not a 2500k
> Not a GTX 750 TI

nice 30fps medium you got there bro

> t.

Attached: 1552743485884.gif (350x409, 437K)

> 2019
> 4 threads
t. inteljeet

>Not having a rave in your bedroom every night.

Moore's Second Law is also a thing.

>mfw 9900K + 1080 Ti

Intel's only demonstration of stacking is a single Core core and 4 Atom cores. It's not going to be a 9900k beater anytime soon, if ever. I don't know where people got that idea from.

The state of Intel damage control.

I'm from the future and they are correct. Gaming gets banned worldwide in the next couple of years because of the drain it has on society. After the complaints that ravaged Guatemalan throat singing boards and others like them subsided, as the general population just ignored them, all that effort went into bettering mankind and we are now a utopia.

So just hold on there if things aren't going well now. It'll turn around soon.

Ryzen 3000 will make it obsolete.

>still decent
Lmao, I remember this 4c shit being called the best gaming CPU, and that more cores weren't needed
Where are you faggots now? The 1800X is still relevant whereas this turd shits itself in newer games, and that's not even talking about actual productivity programs

There still are 4c/8t Ryzen 5s, anon...

Anon... the 7700K is still more relevant in gaming than the 1800X.

Why not 9900k/9700k

Yes, retard, but they're under €100, not €350+
They provide a basic entry point and give much more value than the 2c Pentium abominations

The 1080 Ti can barely run Assetto Corsa and Subnautica at max settings in VR, so it's already obsolete.

> why not 9900K + RTX 2080ti
we're talking about 2017 tech here, anon...

You're telling me the only card that's not obsolete is the 2080ti?

In SC2 maybe, but in newer games like AC Odyssey this crap shits itself like no other

post the source on the 1800X beating the 7700K in AC Odyssey

>never be obsolete(tm)

computerbase.de/2018-10/assassins-creed-odyssey-benchmark-test/2/

I don't see any 1800X there, anon...

This.
Back in my day the ATI 5970 was top-of-the-line shit. Now, after 10 years, it's only good for browsing. And I bet some webpages would lag your desktop with that GPU.

The 2080 Ti is a meme card. It has die space wasted on memetracing that bloats the price way out of proportion. There is no card at this moment that is really future-proof.

Are you retarded? The 1800X is faster than the 1600X and roughly on par with the 2600X
Do the math, mouthbreather

>roughly on par with the 2600x
kys

> every GPU is obsolete
the absolute state of Jow Forums

Anything wrong with that statement, cocksucker?

Every GPU became obsolete the moment VR became a thing. Everyone was hoping Nvidia's next-gen 20 series cards would be a huge step forward, but we all know how that turned out.

Now our only hope is AMD pulling a Ryzen out of their ass and saving the GPU market. Stuff like eye tracking, which is supposed to reduce GPU usage, is still far off and will probably be too expensive.

/thread

Just bought one of these; hope I'll get a couple years of 4K ultra, then high-med after that

Attached: 2018082011230996_m.png (460x460, 52K)

Yep, I have a 1080 and it just isn't fast enough for 2560x1080 at 75Hz+
It's a good 1080p60 card, but after tasting 4K high refresh rate for years I need it

>VR became a thing
It didn't? We still don't have good enough GPUs for that, therefore VR is still a meme

Anything around a 1070 Ti / 1080 Ti / 2060 / 2080 is good enough for 2K VR; they only push like 90Hz on potato settings.

>shit resolution
>shit refresh rate
>shit graphics
If you want to go blind or get an aneurysm, sure, it just works

My issues with VR atm: the wires, shit FOV, shit price, shit hardware, shit resolution (2-5K isn't enough for a screen shoved up against your eyes), no games
If it doesn't flop again after gen 2, I'll go in at gen 3

>waste money for 2080Ti
>for a 1700X
AMDrones are a new level of retard

15 inch MacBook Pros have very similar GPU performance to the 480

>iPhone with a bigger screen and battery
>throttles because it can't handle itself