Is overclocking even necessary these days?

Attached: overclocking-community-AU-EN-981x653[1].jpg (981x653, 173K)

I got my i7 2600K at 5GHz.
That's a 1.6GHz increase over the base clock; it makes a huge difference.

it's free performance.

Realistically, no.
The computation speed will be bottlenecked by the max clock of your RAM and system bus. The extra GHz will only speed up those computations which can fit entirely into L1 cache.
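A rough way to sanity-check that claim yourself; a minimal Python sketch (buffer sizes are assumptions, tune them to your own cache sizes):

# Compare a cache-resident workload against a RAM-bound one.
# The small buffer should scale with core clock; the big one is
# pinned by memory bandwidth regardless of your OC.
import time
import numpy as np

def ns_per_element(n, iters):
    a = np.ones(n)                # float64 buffer, n * 8 bytes
    t0 = time.perf_counter()
    for _ in range(iters):
        a *= 1.0000001            # in-place streaming multiply
    return (time.perf_counter() - t0) / (n * iters) * 1e9

print("cache-resident:", ns_per_element(32_768, 20_000), "ns/elem")
print("RAM-bound:     ", ns_per_element(64_000_000, 20), "ns/elem")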

For gaming? Only if you're at 1080p

At 1440p you are GPU bound. The 2700X, 8700K, 9700K and 9900K all perform within 10fps of each other.

I don't think the cost of increased power consumption is worth it. If anything, seeing a CPU clock down to 800MHz when idle is more interesting, because it saves power when not utilized.

My 2600K ran at 5.0GHz at 1.4V for years.
But now it's degraded to the point where I need 1.4V just to run 4.1GHz.
Shit kills chips over time.

no, necessary was never the point of it

No

okay, i just went ahead and attempted my first oc

Ryzen 2600 at 4GHz at 1.3V

should i bother testing if this is stable at lower voltages, or is 1.3V completely safe?

>what's its power consumption
KYS

Attached: 1536672392827.png (638x359, 408K)

It's like a 30% performance increase, but you pay something like a $50 premium on the processor and a $100 premium on the cooler.

Depends, I guess, on whether $150 is worth 30% more CPU performance to you.

Not many games are CPU-bound, but if you play Blizzard shit or some obscure modded DayZ kind of shit, it might be worth it to you.

>necessary
No
>Beneficial
If you KNOW WHAT you're doing and WHY

you should undervolt to save the planet

Talos fag says overclocking doesn't belong on Jow Forums, it belongs on /v/.

1.3V is fine. Personally I wouldn't go above 1.325V, but that's just me.

my 1600 is actual hot garbage
can't hit 3.7GHz stable at under 1.35V, kill me now

It never really was to begin with. It was of some use if you wanted just that little bit more performance for some software or other, or wanted to stretch your budget as far as you possibly could by overclocking a slower CPU rather than just buying a faster but more expensive CPU, but I've certainly never felt it necessary.

people who are scared of overclocking are pretty funny

it's literally impossible to kill hardware in 2018 unless you literally solder shit onto it or flash it with a non-official hacked BIOS.

Immediately kill it? No. But you can still fuck its longevity.

Buying a CPU out of the box brand new just to overclock it? No, that's fucking stupid. Spend a little more and run it at base clock.

Buying a CPU with adequate clock speed, using it for 2-3 years till it starts to fall behind on the benchmark, and then riding that puppy into the ground by OC'ing it then? That's fine.

Overclocking was viable back then because processors were slow as fuck.

Processors today are fast; you can't go wrong with either company, and it should only be done when your PC is old and can't run most of your programs anymore.

There are no games or professional programs out there that require you to have 5+ jiggahurtz every second.

people who overclock are buying a new CPU within the first 4 years of their purchase so it doesn't even matter

Then I think they're better served by just buying a better CPU to begin with and leave it a little longer before upgrading again, but whatever. Their money.

1.3V is safe, but don't go much higher.
1.35V is the highest I would go.

It's also about wanting to feel special. E-peen and perhaps just wanting to get the most out of what they bought no matter what.

Intel doesn’t have this problem

Not really.
It was never NECESSARY, it's more of a fine-tuning thing.
But every year the manufacturers sell their hardware closer and closer to its actual limit, making the margin of possible overclock smaller and smaller.

Which is a good thing, you basically get your part pre-OC'd for you.

Oh, and by the way, Nvidia completely crippled OC on Turing and Pascal.
Looks like they will remove the option to OC entirely very soon.

Then stroke your e-peen all you like if you want to. But overclocking still has never been actually necessary.

The lower the voltage the better.
1.3V is OK, but if you can go lower, go lower; that's a lot less heat and general wear on the chip.

As people have said, not necessary unless you are stuck at 1080p and the chip has aged.

I have a 4770K and honestly, I just use Asus's lazy but simple one-click overclock feature. The best part is that it only clocks up to 4.4GHz when needed, then goes back down to 800MHz.

People who set a constant OC shorten their CPU's lifespan and waste energy.
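On Linux you can watch that behavior directly; a quick Python sketch, assuming the cpufreq sysfs interface is available:

# Poll the current core clock via Linux cpufreq sysfs
# (assumes the scaling driver exposes scaling_cur_freq).
import time

PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

for _ in range(10):
    with open(PATH) as f:
        khz = int(f.read())
    print(f"{khz / 1e6:.2f} GHz")   # value is reported in kHz
    time.sleep(1)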

One of the best processors ever made, but he is talking about modern processors. The 2600K is 7 1/2 years old, brother. Ancient in computer hardware terms.

Pascal was an OC beast, senpai.
My 1080 was able to hit 2,150MHz on water.
The Nvidia BIOS even has a 1,987MHz boost clock built in, if you can run the GPU cold enough (under 35C at load).
Turing has some issues, but based on the insane VRMs, Nvidia didn't gimp OC.

no

Overclocking doesn't do shit.

it's not about necessity
it's just for fun bruh

I like your style

Attached: risitasww3.gif (136x102, 516K)

>free
electricity isn't free

>My 1080 was able to hit 2,150MHz on water.
>The Nvidia BIOS even has a 1,987MHz boost clock built in
WOOOW, so you hit like an 8% boost with water? Sick gains.

The BIG problem with Pascal is that it is voltage locked at 1.1V. You can't overvolt it without a shunt mod.

I bet Pascals would go to 2,400MHz if they weren't hardware locked.

How much more does an extra 100W cost on a monthly basis, assuming your PC is doing 2-3 hours of resource-intensive tasks daily?

No because AMD can't do it

Google tells me the average electricity price in the US is 12 cents per kilowatt-hour.
100W for 3 hours is 0.3kWh. Over a month that's 9kWh. 9 × $0.12 = $1.08.
So yeah, it's roughly $13/year.
So I guess that's not much. I dunno how much extra the overclocking would consume (not as much as 100W, I imagine).
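Same arithmetic as a few lines of Python, if you want to plug in your own rate and hours (the numbers below are just the assumptions above):

# Extra electricity cost of 100W of overclock load:
# assumes $0.12/kWh and 3 hours of load per day.
extra_watts = 100
hours_per_day = 3
rate_per_kwh = 0.12

kwh_per_month = extra_watts / 1000 * hours_per_day * 30   # 9.0 kWh
cost_per_month = kwh_per_month * rate_per_kwh             # $1.08
print(f"${cost_per_month:.2f}/month, ${cost_per_month * 12:.2f}/year")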

It depends a lot on the CPU and how it's overclocked but generally we're talking about pennies. There are better ways to save money on electricity, like heating and lighting. Heated bathroom floor can double your electricity bill.

"Intel DO what AMDon't."
Fuck. Why does this sound so good? I'm just a poor little AMDrone and I really enjoy my 200GE + RX 570 8GB.

The memory controller runs at the same frequency as the CPU, so the memory controller is at 5GHz. L2 and L3 also run at the speed of the CPU.

The bottleneck would be the DDR3, but with 2,400MHz DDR3 it's not far off new systems.
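Back-of-the-envelope, assuming a dual-channel setup with a 64-bit bus per channel:

# Peak theoretical bandwidth for dual-channel DDR3-2400
# (assumed config: 2 channels, 8 bytes per transfer per channel).
transfers_per_s = 2400e6
bytes_per_transfer = 8
channels = 2
print(f"{transfers_per_s * bytes_per_transfer * channels / 1e9:.1f} GB/s peak")  # 38.4

For comparison, dual-channel DDR4-2666 works out the same way to about 42.7 GB/s, which is why it's not far off new systems.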

Overclocking is always worth it, especially with how simple it is to set and test everything now.

>Heated bathroom floor
Lol, not all of us are rich.
Old monitors can consume a lot of energy as well. I have an old 1080p 32-inch TV, and on the back it says it draws 108W. I guess most modern LED equivalents consume less than half that much.

I overclock my ECC RAM. Samsung B-die 2400 MHz.
Perfectly stable at 2933 MHz CL15, stock voltage.
mc0: 0 Uncorrected Errors with no DIMM info
mc0: 0 Corrected Errors with no DIMM info
mc0: csrow0: 0 Uncorrected Errors
mc0: csrow0: mc#0csrow#0channel#0: 0 Corrected Errors
mc0: csrow0: mc#0csrow#0channel#1: 0 Corrected Errors
mc0: csrow1: 0 Uncorrected Errors
mc0: csrow1: mc#0csrow#1channel#0: 0 Corrected Errors
mc0: csrow1: mc#0csrow#1channel#1: 0 Corrected Errors
edac-util: No errors to report.

after a few weeks of uptime.
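If you'd rather poll those counters yourself than call edac-util, the kernel exposes them in sysfs; a minimal Python sketch, assuming a Linux EDAC driver is loaded and mc0 is your memory controller (as in the output above):

# Read EDAC corrected/uncorrected error counters from sysfs.
from pathlib import Path

mc = Path("/sys/devices/system/edac/mc/mc0")
ce = int((mc / "ce_count").read_text())   # corrected errors
ue = int((mc / "ue_count").read_text())   # uncorrected errors
print(f"corrected: {ce}, uncorrected: {ue}")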

Attached: 1515611937146.jpg (720x707, 26K)

>is overclocking even necessary these days?
Wrong way to pose the question, I would argue.
Between stock and boost clocks, the current CPU gen from both sides is driven much closer to its physical limits than previous ones.
This means that while tech-savvy users have less to gain from tinkering, the average consumer with their OEM shitboxes and boxed coolers is in many cases getting less than advertised.

It's high time for a real architecture overhaul; we're clearly at the limits of what they are doing right now. (Though admittedly AMD just arrived there. Intel really hasn't moved in 5? years.)

The typical American spends thousands each year on climate control devices running full blast to keep their poorly insulated McHouses at roughly the same temperature throughout the seasons.

It's not that uncommon around here. Even the cheap single room apartments have it. It just consumes a shit ton of electricity.

if cheap-ass used xeons count as "these days" then yes, absolutely

FUCK THE PLANET, PEOPLE ARE DISGUSTING.

>unlocked cpu is free
>z motherboard is free
>aftermarket cooler to handle higher temp is free
>more power usage is free
>time spent overclocking is free
Nice definition of free, faggot.

>2018
still believing in global warming

It is warming for certain, but the reasons for it are questionable.

Yuropoor here. I find the ideal indoor temperature in winter to be roughly 18°C (64.4°F), but I know people who like their homes to be like saunas in the winter.

If it can't be overclocked by 20% you've been memed by the manufacturer.

1.3V is not much at all. I use 1.4675V to get my 2700X to 4.25GHz all-core. Anything past that and I would start to get very skeptical.