I bought this NINE YEARS AGO this month, have kept it overclocked at 3.8ghz on an X58 chipset all this time, and have still never met a game or program that could even get close to taxing it. Most things can't even pass 20% CPU usage.

Just what in the fuck have hardware manufacturers been doing for the last decade? Did we already pass the golden age?

Attached: intel-core-i7-920-D0.jpg (713x586, 66K)

unless your gpu is a potato there's no way it has no impact on fps.
At least 30% in newer games.

Yeah, we went from halving micrometers to jumping 20 nanometers, to fighting over 2 nm improvements.

You'll see performance gains based on the coming ISA-wars.

who said anything about video games? fuck off faggot

>who said anything about video games
>and have still never met a game

Attached: 1532444496523.png (1439x1543, 916K)

I have an i7 930 clocked at the same speed

sometimes I hit 100% when doing multiple things, I guess you're just not a POWER USER

Attached: CPU.png (400x112, 4K)

OP BTFO!

Attached: 1543792836261.jpg (946x693, 267K)

didnt read lol fuck off faggot

user you need to get an X5675 or something, that 6 core chip will beat my FX-8350 rig at stock and will destroy it when overclocked to 4GHz+.

Seethe op, seethe

I ran an OCed 920 at 4GHz till last year. The upgrade was nice. I'd suggest you do the same.

OP said most things don't use the CPU more than 20% so how would a new CPU make a game faster if the old one isn't even being used close to its full potential?

I'm starting to wonder what GPU OP has, because a modern game with a 1080ti will skullfuck an i7 920.

if I had to guess I'd say he's using a GTX 400 series :^)

Doesn't that just mean you are almost using up your single-threaded performance? 20% as in almost one of your 4 cores is at its limit?

yeah didnt read (lol)

Not bad, I'd suggest picking up a Radeon HD7970 as an upgrade for cheap, I just picked up a 7970 for 80 bucks on ebay and it's amazing.

That and a 6 core Xeon X56XX and you're still extremely relevant.

tfw bought a xeon to replace my 930 months ago and it's still sitting in bubble wrap

Still a factor in video encoding/editing

Yeah 30% is about the highest I've ever seen a game use the CPU. That leaves 70% unused. How would a CPU upgrade help?

What are you doing that's getting anywhere near maxing out any of those CPUs? The upgrade was nice how, what changed?

Doing what? If I run a particularly taxing game it might use 20-30% CPU. If I then have a couple of youtube videos open the browsers might use 10-15% each. Even adding background functions and services that's not even 50% CPU usage.

Other than rendering or stress tests, how could you possibly get a CPU to capacity?

Huh? That's not me, I don't type like a retard.

I have been upgrading my GPUs over the years, I started with a GTX 580 in 2009 then a GTX 770 and now a GTX 1070. This thread isn't about GPU though, I watch my CPU usage all the time and I have never, under any circumstance, seen it get above 70% with multiple programs. I've never seen a single program get it above 35%.

A modern game will skullfuck my 920 how? BF5, Destiny and Uncharted 4 don't even use a third of the CPU. Are you saying that new mhz are somehow better than old mhz on a 1-to-1 basis?

Sure, but who still uses CPU for those things? GPUs are way faster and pretty much everything supports those these days.

>A modern game will skullfuck my 920 how?
user these new GPUs, especially since Pascal, can draw frames so fast that older high end chips cannot keep up; even an FX-8350 with 8 threads rocking at once on GTA V will bottleneck a GTX 1070.

youtube.com/watch?v=8caCYb0G0ec

>The upgrade was nice how, what changed?
Cities Skylines was hitting the old CPU hard. Similar for extraction of compressed files. Also, with a new board I was able to move to an M.2 SSD off of a very strange (but functional) PCI-E SSD.

My GPUs were being bottle-necked by my CPU in a few games.

Well now wait a minute you seem to be saying two things here. First you say the GPU is so fast that the CPU can't keep up with it, then you say GTA V will bottleneck the GPU.

So which is it?

Huh. What about Cities Skylines was so CPU intensive?

I bring up GTA V because the game is multithreaded as hell and the 8 threads will not push that 1070 to 100% even with the game maxed out.

If you have a D0 stepping and that shit is overclocked to 4.2ghz+ I believe you OP. Hopefully you got triple channel 1600 or better on that shit and didn't get cucked with 1333

The FX-8350 is a pile of crap with only 4 FPU cores, and is slower than the i7 920

FX will bottleneck anything user.

Well, you can say whatever you want about the 8350; an i7 920, especially at stock, is gonna be a bottleneck. I can actually get the performance of a 920 if I use 4 cores and 4 FPUs at 4.5GHz (sad, I know), and that would be an even bigger bottleneck than 4 FPUs with 8 threads.

I've pushed my 6800K over 50% usage in games easily, and that has way more performance than the i7-920. So there's no way that you can't find games that are taxing, unless you're specifically avoiding anything taxing.

you do realise that to this day game engines cannot utilise multicore processing properly so if you're seeing 20-30% on the total CPU usage, it translates to 2 threads maxing out?
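
To make that concrete, here's a minimal sketch (assuming the third-party psutil package is installed) that prints the overall CPU figure next to the per-logical-core figures, so one pinned thread can't hide behind a low total:

[code]
# Minimal sketch (requires: pip install psutil).
# Shows the whole-CPU percentage next to each logical core's percentage,
# so a single maxed-out thread is visible even when the total looks low.
import psutil

total = psutil.cpu_percent(interval=1)                    # whole-CPU average
per_core = psutil.cpu_percent(interval=1, percpu=True)    # one value per logical core

print(f"total: {total:.0f}%")
for i, pct in enumerate(per_core):
    note = "  <-- near its limit" if pct > 90 else ""
    print(f"core {i}: {pct:.0f}%{note}")
[/code]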

I don't have that one but I still find it hard to believe that a game that old could ever stress a modern CPU, let alone an overclocked old one.

It is, and I have it stable at 3.8 with, I think, 1433mhz memory.

I believe you, but 50% still isn't 100%. My 920 is faster than your 6800K but you have 50% more cores so I don't see there being a huge difference in CPU utilization.

I'm certainly not avoiding heavy games, I have Codblops 4, Destiny 2, BFV, Witcher 3, Uncharted 4 and lots of others and I tell you I have never seen one of those games utilize more than 35% of my CPU (and that's including background processes).

Why are you angry? Doesn't what you just said show that new flashy """""fast""""" CPUs are pointless wastes of money?

user, Bloomfield was awesome and the Westmeres that came after were even better, but it is a bottleneck. That doesn't make it a bad CPU; GPUs are just extremely fast now.

youtube.com/watch?v=Ic7whUoAGWc

Congratulations on owning a CPU worse than Bulldozer and being proud of it.

> Most things can't even pass 20% CPU usage.
> Just what in the fuck have hardware manufacturers been doing for the last decade?
Good things. However, software devs were lagging behind, but Electron devs are bound to change that. Now every single program will max out your 6-core CPU.

I have a 2600k OC'd to 4.1 that's still going strong. My 970 just shit on me tonight though, been trying to figure out if I should go 2070 or 2080 (I know CPU will bottleneck, plan on building a new one altogether but need a GPU ASAP).

Your i7-920 is not faster.
1) It's on the Nehalem architecture, which is significantly weaker than Sandy Bridge and all architectures beyond it. In fact, I actually ended up testing this with my 2.8GHz Lynnfield CPU (which is stronger in single core clock to clock compared to Nehalem), and it was still beaten by a 2.3GHz Haswell CPU in a single core load.
2) When OC'd to 3.8GHz you have a minuscule clock advantage over the 6800K, which runs one core at 3.8GHz and all the others at 3.6GHz. No advantage whatsoever in single core clock.

So given how post-Sandy Bridge designs have a clock to clock advantage, you would need a fairly notable clock advantage to beat the 6800K in core to core performance. Something like 400-500MHz.

>GTA V
>that old
>could ever stress a modern CPU

>i7 920
>faster than 6800k
Okay, I'm curious now. Could you run a couple of those games and take a pic with CPU and GPU utilization, with timestamp pls

2 P9X79 boards and 2 6-core chips for $500 feelsgoodman

3.8 is higher than 3.6, that's all I was saying.

Also, you are saying that 1mhz from a CPU is better than 1 mhz from another CPU. Explain.

I don't have GTAV installed at the moment, here is Destiny 2 on high settings, this is my second monitor with the game fullscreen on the main monitor while running a strike mission with a swarm of NPCs. I censored my chat programs.

Attached: destiny.jpg (1088x989, 263K)

watch 4k anime with temporal interpolation

>Why are you angry?
what?
>Doesn't what you just said show that new flashy """""fast""""" CPUs are pointless wastes of money?
I'm saying that when you say "If I run a particularly taxing game it might use 20-30% CPU", it might be using that amount of the total capacity, but in reality 1-2 threads are at 80-99% while the rest is running the background stuff. So there is room to improve single-thread wise, which later CPUs are significantly better at.

What's there to explain? If GHz were the one performance metric that unified all architectures, the 5GHz AMD FX CPUs would be dominating anything short of a 9900K.

Want real core performance? Run a benchmark, like cinebench r15 in single core mode.
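
If you don't have Cinebench handy, here's a crude stand-in, just a sketch: time a fixed single-threaded workload and compare the wall-clock result between machines. The loop size is arbitrary; only the relative times matter.

[code]
# Crude single-core benchmark sketch (not Cinebench): run one fixed integer
# workload on a single thread and compare the elapsed time between machines.
import time

def workload(n=10_000_000):
    acc = 0
    for i in range(n):
        acc += i * i
    return acc

start = time.perf_counter()
workload()
print(f"single-core time: {time.perf_counter() - start:.2f}s (lower is better)")
[/code]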

LOOL OP CONFIRMED RETARD

If programs can't take advantage of more than 1 or 2 cores then what is the point of a CPU with twenty nine cores at the same frequency?

It's not the one metric, number of cores is also obviously applicable. Explain how 1 megahurtz is not equal to one megahurtz.

I don't care about secondary performance as a result of additional pipelines or chipset updates or bus lanes or bike lanes. While a chipset is sometimes tied to a certain CPU line, the CPU is doing the work. So let's compare apples to apples, unless you're trying to say that apples aren't apples. Which is literally what you're saying:
>stronger in single core clock to clock compared to Nehalem

>If programs can't take advantage of more than 1 or 2 cores then what is the point of a CPU with twenty nine cores at the same frequency?
many programs can take advantage of multiple threads efficiently, but games can't

>Explain how 1 megahurtz is not equal to one megahurtz.
IPC, retard

>IPC
Well jesus tapdancing christ why did it take 45 posts, 27 users, 2 hours and a prying line of questioning to get that out?

God damn anarchosperglords just want to be so angsty and confrontational.

>Also, you are saying that 1mhz from a CPU is better than 1 mhz from another CPU. Explain.

Sure, CPUs have different instructions-per-clock ratios. Namely, your i7 is nothing against a last gen Intel. Benchmark your shit.
Low CPU usage = shit cpu

i see you never used gentoo.

That was a given. You are underage or retarded. AMD athlon whipped incel's ass backwards pen 4 but idiots like you just thought moar megahurtz

Did you disable HT?

Vsync to 60fps?
Doubt it would use just 30% if let rip, particularly with a 1070.

>AMD athlon whipped incel's ass backwards pen 4
l o fucking l
amd was the one on the mhz/ghz warpath you absolute fucking shut in and intel still crushed them for two decades after

No vsync ever, it makes me nauseous, and this is on a 120hz monitor.

Uh user, AMD were the ones racing to 1GHz during that time, not Intel, and with a few interruptions Intel has been ahead of AMD ever since. There's no need to be rude.

Attached: prescott.jpg (801x1500, 314K)

The game is minimized, of course it's only using 30%.

Patch day for ESO.
Feelsgoodman.

Attached: patchtime.png (1920x1080, 2.53M)

>gtx580
>2009
found the newfag bullshitter.

>Most things can't even pass 20% CPU usage.
BULLSHIT you've obviously never installed gentoo
>or rendered anything

I know you're full of shit or playing games with a 60fps cap or dual channel memory. I have footage of a 3.8ghz i7 920 with triple channel memory running 40-50% utilization in Doom, 70% in BF1 single player, and 50-70% in GTAV. All while managing to at least slightly bottleneck a 1475mhz gtx 970 using ultra settings.

Either you're lying or something else in your system is holding the cpu back. Additionally, there are newer instruction sets in use now that Nehalem does not support, which will widen the IPC gap against newer CPUs (quick way to check which ones after the links below).

mp4upload.com/qgmvb7hcby49

mp4upload.com/iizmy1ub5kvr

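On the instruction set point, a quick way to see what your chip actually advertises (Linux only, just a sketch; the extension names are the standard /proc/cpuinfo flag names):

[code]
# Sketch (Linux only): list which SIMD extensions the CPU advertises in
# /proc/cpuinfo. A Nehalem-era i7 should show sse4_2 but not avx/avx2/fma.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for ext in ("sse4_2", "avx", "avx2", "fma", "avx512f"):
    print(f"{ext:8s} {'yes' if ext in flags else 'no'}")
[/code]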

>Most things can't even pass 20% CPU usage.

Because you're only using single-core shit, and that core is being utilized nearly 100%. 100/4 = 25.
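
The arithmetic behind that, as a quick sketch (the i7-920 is 4 cores / 8 threads with HT, so the exact figure depends on whether the reading counts logical cores):

[code]
# One fully loaded core out of N logical cores shows up as 100/N percent
# overall, which is why a single-threaded bottleneck can look like a low total.
for logical_cores in (4, 8):   # i7-920: 4 physical cores, 8 threads with HT
    print(f"{logical_cores} logical cores -> one maxed core reads as "
          f"{100 / logical_cores:.1f}% total")
[/code]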

>Did we already pass the golden age?
We passed the golden age of computation. Golden age of storage and security vulnerabilities is upon us.

Using an 8370 @ 4.7ghz and two 1080s.

Runs Fallout 4 at 60~ fps locked at 5870x1080/3840x2160.

Don't see any reason to upgrade if it can do that with a 60gb unoptimized, uncompressed texture pack.

It'll probably be TES6/F5 that gets me to build another system and that'll be 10+ years now, so fuck it.

>have still never met a game or program that could even get close to taxing it
You're either really dumb, a liar, a poorfag with a shitty GPU or don't play anything except the most casual, undemanding shit. Download the new Hitman game that came out yesterday, install a decent graphics card and then tell me again about how your CPU isn't holding you back.

Looks like some creature out of berserk.

I'm sure (((they))) would be pleased if I spent $60 on a 10hr title, $2000 on video cards, $1000 on mb/cpu, $300 on ram and $500 on a 4k monitor so I could run the same game at 93fps in 1440 vs 60 fps in 1080.

>More fps makes it funner!

RocketDock, oh man that's old. Nice thread OP, so far it's hilarious. I really hope you're pretending to be retarded.

Thuban > nehaLEL

The 'new mhz' quite literally are better because of IPC (instructions per cycle) improvements, which result from improvements to the pipeline/architecture.
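
Rough back-of-the-envelope version of that, just a sketch with made-up relative IPC numbers (not measured values), to show why a lower clock can still win:

[code]
# Effective single-thread throughput scales roughly with clock * IPC.
# The IPC figures below are illustrative placeholders, not benchmarks.
def relative_perf(clock_ghz, ipc):
    return clock_ghz * ipc

nehalem_920 = relative_perf(3.8, 1.0)   # i7-920 OC'd, IPC normalised to 1.0 (assumption)
broadwell_e = relative_perf(3.6, 1.5)   # 6800K all-core turbo, ~1.5x IPC (assumption)

print(f"i7-920 @ 3.8GHz : {nehalem_920:.2f}")
print(f"6800K  @ 3.6GHz : {broadwell_e:.2f}")   # higher despite the lower clock
[/code]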

This video demonstrates what one architecture generation can change at 8:58

youtu.be/XtftUHGfoNc

The speed of the processor can be completely unrelated to how the processor handles a task.

Jesus, what's wrong with you americans

ArmA 3 and DCS will kill that CPU