Will the new consoles make 4 core CPUs show their age?

I bought a 7700k in 2016 without doing any research :/

Attached: 395F1639-5E8E-480A-B602-F30A85587799.jpg (300x360, 38K)

Other urls found in this thread:

techpowerup.com/reviews/Intel/Core_i9_9900K/13.html
digiworthy.com/2018/04/28/ryzen-7-2700x-memory-opt-benches/
youtube.com/watch?v=PHBsR1Y68G8

3 years is the perfect interval between upgrades, goy.

still okay for medium settings

The only game where my 3570k was holding my fps back was Assassin's Creed Odyssey, which looked like fucking trash even on max settings anyway. Bloated trash engine; there was nothing complex going on in the scenes at all to warrant such shit performance.

Attached: 1550981889682.jpg (1438x1402, 938K)

what's wrong with the 7700K?
>t. 6700Kfag

nothing.

They're already showing their age. A 7700K at 5GHz+ performs worse than a stock Covfefe Lake i5 in modern games.

Lul, need proofs.

WRONG

Game devs are hopeless; they need a 100+ person team JUST to use more than 2 CPU threads. This is why an i9-9900K nuclear reactor overclocked to 5.1 GHz with an industrial water chiller gives you less than a 5% performance advantage in 90% of vidya compared to an i7-7700K at STOCK settings with a low-end water cooler.

techpowerup.com/reviews/Intel/Core_i9_9900K/13.html

Stay BTFO, corelet.

Attached: 4 cores in current year.png (2560x1440, 1.67M)

explain

lol hell no, quit spewing shit

this makes me wonder why the 8th and 9th gen i5s suffer from such shit lows and stutter so much?

Congrats, you found 1 game out of a thousand that properly utilizes more than 4 cores. Even then not very efficiently, or else you'd get 200% of the FPS you get on a 5GHz i7-7700K.

i5 7600k bros represent

i'm gonna use my 2500k till it dies and there's nothing you can do to stop me

Just based on this, the jump from 7700k to 8700k is the most significant in a number of years (although it's still not earth-shattering).

I might wait until the summer after the new consoles are released and buy a new board & CPU then.

you and me both.

The other point, though, and other PC people will be in the same boat, is that I use a large-screen TV as my primary display, so my framerate is always capped at 60fps. That may make an uber-strong CPU less relevant.

At the moment 8k TV seems more likely than 4k 120hz.

I have a 6700k, and at 1080p 144hz my GPU is always my bottleneck; even in CPU-intensive games like MonHun World I only get like 60-70% CPU utilization.

>tfw bought i9-9900K late last year
>still ranks on top
>use it on Gentoo to compile software

Gamers btfo.

The first 2 years nothing good will be out anyway. First 3 years games will be iterative.

>still running a 3570k
it just werkz

Attached: f3218572f7391f16e964f04f2e08e3cdb7e9291fc08903834b79afce8e52e46e.png (644x591, 149K)

What jump? I don't think you would notice the difference, unless you timed tasks.

Attached: screenshot-www.cpubenchmark.net.jpg (1208x755, 652K)

>Still believing we will see resolution and framerate standard increases across the market.

Attached: DAzIA-KW0AAbRB1.jpg (720x611, 41K)

IDK but 4k TVs have been doing a pretty damn good job at putting 4k resolution in almost every household
You almost can't get a TV over 40in that isn't 4k

And still, essentially zero content.

There is some content, mainly in streaming. The main issue is that there aren't many channels broadcasting in 4k, and it's going to take a fucking long time to build the momentum for all the broadcasters to switch over; the US also needs to roll out ATSC 3.0.

Consoles have been 1440p/4k since 2016

PC can get 1440p60 at least on new games without a ridiculous video card

>twice the fps from a cpu
brainlet

It's true. In a theoretically 100% CPU-efficient game with no GPU bottleneck you'd see twice the FPS going from 2 to 4 cores and from 4 to 8 cores. But you DON'T, and for the most part a dual-core i3 still gets you an average of 80% of the performance of an 8-core processor at the same frequency at 1080p.

Meaning that most vidya still can't use more than 2 cpu cores efficiently yet. This is easily proven by looking at total CPU utilization.
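
To put numbers on that: a quick Amdahl's-law sketch of why doubling cores rarely doubles FPS. The 50% parallel fraction below is an assumption for illustration only, not a measured figure for any real engine.

```python
# Minimal Amdahl's-law sketch: the serial part of the frame doesn't scale,
# only the parallel part divides across cores.
def speedup(cores: int, parallel_fraction: float) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

parallel_fraction = 0.5  # assumed: only half the per-frame work threads well
for cores in (2, 4, 8):
    print(f"{cores} cores -> {speedup(cores, parallel_fraction):.2f}x over 1 core")
# Prints 1.33x, 1.60x, 1.78x: going from 4 to 8 cores gains ~11%,
# nowhere near the 2x a perfectly CPU-scaling game would show.
```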

>Ubishit

Attached: thLKYX7JWY.jpg (474x474, 19K)

Even if Poozen 2 hits 5GHz, it'll still be a long way off the 9900K's performance.

>bottleneck
hahahahaha not that old meme hahahahaha

At 4.3 GHz with good RAM it already matches the 4.7 GHz i7-8700K. It's really close to i9-9900K performance.

this

>zen 2
You mean zen+? Because if it could reach 5GHz then it would destroy the i9-9900K since it has 5% higher IPC.

>Consoles have been 1440p/4k since 2016
Yeah totally
4k is the peak. We have no reason at the moment to move onto an increased resolution, in fact, we will probably stagnate for a while while we get VR sorted out.

>ubishit
>1080ti
My 7700k isn't doing that badly; ironically, I find myself playing older games because everything new is shit.

Before you guys talk about hardware and plan out upgrade paths, have any of you realized that the only demanding games are the AAA ones, and that they are all absolute garbage that is only bound to get worse? I was hooked in the upgrade loop a few years ago until I realized I spent more time thinking about upgrading than actually playing anything, because modern games are trash. I see this as an extremely common phenomenon among PC gamers, along with our own never-ending backlog of untouched diarrhea.

This. I have an 8700k + GTX 1080ti and ironically just play WarCraft 3 and CS: Source / GO.

I have a 1080ti and a Dell AW3418DW, which is a 120hz 3440x1440 g-sync monitor. I'm loading up a LoL match right now. I was also playing Terraria earlier.

PC gaming is a meme.

I'm actually impressed how quickly the 7700k became obsolete

I have the same graphics card and monitor. Does terraria scale to 21:9?

Cope harder

>1080p
Stay BTFO, screenlet

/Thread

KEK you bought a 7700k didn't you?

Attached: 15498786959882776093672832957963.jpg (229x220, 38K)

Yeah

>same performance as an 18-core cpu

>still using ryzen 3 2200g
Good enough for min graphics on most games, I suppose.

>Consoles have been 1440p/4k since 2016
>upscaled
>30 fps
>low graphics settings
>tonnes of motion blur to compensate
yikes

>1080p in current year

>implying I use my hardware to game
Gentoo won't compile itself you know

What was yours clocked at? Mine was at 4.8 GHz and handled the game well enough. There were a few instances of it pinning my CPU at 100%, but not all that much.

>looked like fucking trash even on max settings anyway. fucking liberal sjw trash engine full of bloat. there was nothing complex going on in the scenes at all to warrant such shit performance.

Agree with all of this.

>what is spectre

Attached: 1550940040965.jpg (419x481, 25K)

You guys are fucking retarded, look at the link again. The STOCK i7-7700K is on average only 4% slower than the i9-9900K OC'd to 5 GHz. If you OC the i7-7700K to 5 GHz it's 2-3% slower in worst-case scenarios. And that's at 1080p; at 1440p it literally makes no difference whether you have a 4 or 8 core processor, performance is practically identical.

>Will the new consoles make 4 core CPUs show their age?
no, game developers are migrating to one of three engines (not counting RPG Maker and the like), none of which effectively uses more than 4 threads. The last game to do so was Ashes of the Singularity, and that was really just a tech demo.

I upgrade GPUs once every 4 years and CPUs once every 8 typically.
Might make an exception now that my 6600K lost hyperthreading, but only to switch to zen2

But why were they able to make use of all the cores on the PS3 and 360 when they can't do the same thing today?

Attached: 1543421007147.jpg (800x800, 97K)

They only had 1 PowerPC CPU core to deal with; the rest of the "CPU" was really a frankenstein GPU they had to work with. In the end most devs simply ran their games at 480p or similar and upscaled them to 720p if an HDTV was used. Stuff like FXAA/MSAA got thrown out the window, and all but the fastest and crudest lighting, shadows, and FX got cut. The real witchcraft was cramming somewhat high quality textures into the 256MB of VRAM and the rest of the game into 256MB of system RAM, which already had a significant portion occupied by the OS itself.
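
Rough back-of-the-envelope numbers for that memory squeeze; the 4 bytes per pixel and double-buffering below are assumptions for illustration, actual render targets varied per game.

```python
# Framebuffer cost at 360/PS3-era resolutions versus a 256 MB VRAM budget.
def framebuffer_mb(width: int, height: int, buffers: int = 2, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

vram_mb = 256
for name, (w, h) in {"480p": (854, 480), "720p": (1280, 720)}.items():
    fb = framebuffer_mb(w, h)
    print(f"{name}: ~{fb:.1f} MB of framebuffers, ~{vram_mb - fb:.0f} MB left for textures/geometry")
# ~3 MB vs ~7 MB: rendering low and upscaling only buys back a few MB,
# but on a 256 MB budget every MB of texture memory counted.
```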

literally anything above 60 fps is a waste of money

t. console user

there were only three cores on the 360, it wasn't actually powerful enough to make hyperthreading worthwhile, and the ps3 was universally reviled by developers who weren't Factor 5 as being difficult to use well.
Both consoles were just starting out with the "big three" engine gig; Unreal Engine 3's flagship UT3 and Gears of War released after the 360.
Developers were both more willing to and more capable of putting in the work to make games run well, and what previously worked hasn't expanded all that much to fill newer hardware with more cores.

The new consoles have performed like older gaming PCs for the last few generations now, why would it be different this time?

Of course it's within 2-3% of a 9900k when it is GPU bottlenecked. What are you trying to prove?
See >69958772

How the FUCK is a GTX 1080ti at 1080p a bottleneck when the card is meant for 100+ FPS at 1440p?

see

Anything with fewer than 8 threads is already having issues with current games; even the 8600k will bottleneck a system in the last two Assassin's Creed games. With next-gen consoles almost certainly moving to 8 core / 16 thread designs, expect your 7700k to be an issue within 2 years. For Intel systems, anything below Z370 should be considered a stopgap right now; at least with Ryzen systems you are not locked in because of the socket.

or X99 and up for Intel systems

Agreed.
>Tfw 6 core cpu and a gtx1070 just to play minecraft

Irrelevant

Are you me? Then again, it's possible to mod Minecraft's graphics to a point where all that horsepower becomes justified.

>mfw i7 7700k and R9 380X
kek
my CPU gets bottlenecked, not the other way around.
Navi when?

Attached: 1541334964058 (5).jpg (803x1024, 74K)

>this level of delusion
The 9900K at 5.2GHz is 46% faster than the top Ryzen chip. Poozen 2 would need to hit over 6GHz to match it. And that's before you get into the fact that it will be even more of a latency-filled mess thanks to the chiplet design.

AMD cannot compete. Simple as that.
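
For what it's worth, here's the arithmetic behind that ">6GHz" figure, taking the thread's own numbers (the 46% gap and the earlier +5% IPC claim) at face value and assuming performance scales linearly with clock:

```python
# Clock a hypothetical Zen chip would need to close a claimed 46% gap to a
# 5.2 GHz 9900K. Inputs are claims from this thread, not benchmark data.
claimed_gap = 1.46          # "46% faster than the top Ryzen chip"
ryzen_ipc_advantage = 1.05  # the "+5% IPC" claim made earlier in the thread
current_ryzen_clock = 4.3   # roughly a 2700X's boost clock

required_clock = current_ryzen_clock * claimed_gap / ryzen_ipc_advantage
print(f"Required clock: ~{required_clock:.1f} GHz")  # ~6.0 GHz
```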

No, he's right but there's a huge catch: expensive RAM.

digiworthy.com/2018/04/28/ryzen-7-2700x-memory-opt-benches/

>with good RAM
>4.3 GHz

Attached: 1464819568425.jpg (653x726, 92K)

Don't take my word for it. A 4.2GHz 2700X already matches an i7-7700K OC'd to 5 GHz which is like 2% slower than an i7-7700K in most games.

youtube.com/watch?v=PHBsR1Y68G8

slower than an i9-9900K*

Yeah they're all going to be cross gen games running on the same engines as now for at least the first couple years, maybe longer. Build a new PC in about 2022.

>2700 getting beat by a 4790k
>2019

Attached: Tom Cruise laugh.jpg (500x333, 28K)

Look at the RAM timings and then look at

>Look at the RAM timings
That's the RAM they used to retest the 7700K, brainlet. There are DDR3 CPUs on that list, so clearly they didn't use the same RAM for all of them. Poozen was given its unicorn 3200MHz CL14 RAM in GN's testing. It's just shit, and fake YouTube videos from nobodies don't change that.

my condolences

Look again, BOTH the Ryzen and Intel chips were tested with the same 3466 low-latency RAM. The point is that the user's claim about Ryzen having a 5% IPC advantage over Coffee Lake holds up, since there was no IPC uplift from Kaby Lake to Coffee Lake.

A 4.3 GHz Zen+ chip matches a 4.5 GHz Coffee Lake, but only if good RAM is used. Which basically means said low-frequency Zen+ chip is only 10% slower AT WORST than an OC'd i9-9900K. It's a good deal when you consider all the high-end exotic cooling an OC'd i9-9900K requires.
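
The clock x IPC arithmetic behind those two claims, assuming (as the posts above do) ~5% higher IPC for Zen+ and linear scaling with effective clock, which real games rarely show:

```python
# Effective performance ~= clock * relative IPC. Illustrative only; games are
# often GPU-bound and don't scale linearly with clock.
def effective_perf(clock_ghz: float, relative_ipc: float) -> float:
    return clock_ghz * relative_ipc

zen_plus   = effective_perf(4.3, 1.05)  # 2700X-class chip with the claimed +5% IPC
coffeelake = effective_perf(4.5, 1.00)  # the 4.5 GHz Coffee Lake it supposedly matches
oc_9900k   = effective_perf(5.0, 1.00)  # a 5 GHz overclocked i9-9900K

print(f"Zen+ 4.3 GHz: {zen_plus:.2f}  vs  CFL 4.5 GHz: {coffeelake:.2f}")  # ~4.52 vs 4.50
print(f"Gap to a 5 GHz 9900K: ~{(1 - zen_plus / oc_9900k) * 100:.0f}%")    # ~10%
```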

Read the post again, you stupid fuck. Nobody cares about that fake video. We're talking about GN's testing.

>and CPUs once every 8 typically

if you've done that more than twice, shut the hell up

Idk if you're retarded af or just 11 years old.
You're comparing a $280 CPU to a $500 CPU.
If you want to compare Intel and AMD, compare them in the same price range.

GN's testing used 16-18-18-36 RAM timings. You know, the timings Ryzen actually loses performance with, even with 3400+MHz RAM?
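
For anyone wondering what those timings actually mean, here's the first-word latency math; the two kits compared are the ones being argued about above, treat it as illustrative:

```python
# First-word latency in ns: CAS cycles divided by the memory clock.
# DDR transfers twice per clock, so clock_MHz = transfer rate (MT/s) / 2.
def cas_latency_ns(transfer_rate_mts: int, cas_cycles: int) -> float:
    clock_mhz = transfer_rate_mts / 2
    return cas_cycles / clock_mhz * 1000

kits = {
    "DDR4-3466 CL16 (the 16-18-18-36 kit)": (3466, 16),
    "DDR4-3200 CL14 ('unicorn' Ryzen kit)": (3200, 14),
}
for name, (rate, cl) in kits.items():
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")
# ~9.2 ns vs ~8.8 ns -- a small absolute difference, but Zen's Infinity Fabric
# is commonly reported to be more latency-sensitive than Intel's ring bus,
# which is the whole point of the argument above.
```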

See

I would rather deal with 5% less performance on Ryzen and/or buy more expensive RAM than support Intel's scumminess at all. Fuck Intel. Enjoy your 8-core $500 housefires just so you can play shitty AAA games at 600 fps.

>seething

I would rather deal with 15% less performance on Ryzen and/or buy more expensive RAM than support Intel's scumminess at all. Fuck Intel. Enjoy your 8-core $500 housefires just so you can play shitty AAA games at 600 fps.

>housefire
>my 9900k runs at 5Ghz
>runs at 65°C at load on a NH-U14S air cooler

65C gaming rather, ~80-85 on AVX

>Who i7 9700k here?

Attached: 1551269845638.png (864x771, 661K)

>muh stress test

i upgrade every 10
recently just went from a core 2 duo to the 32 core Threadripper.

stop LARPing

This, I got my i9-9900K to run at 5.2 GHz on a Hyper 212 EVO and temps never go past 70C.

>70°C
the absolute state of intelfags

i went from pentium 4 to celeron