Is overclocking a waste of time, Jow Forums? Why waste your time getting a 10% performance boost?

youtu.be/ShCK2x7tDXk

Attached: maxresdefault (6).jpg (1280x720, 137K)

Yeah, it's a potential performance boost with an increased risk of incurring damage. High-effort sidegrade.

Absolutely. Your manufacturer already gives you the product in its perfect state. No need to mess with it like that and risk damaging something you paid hundreds of dollars for. This is the same reason I'm against delidding a CPU

Yes. It's just people interested in technology who want something to keep playing with rather than just using it. They'd be better off tinkering with literally anything else than wasting time overclocking

It's actually more like a 30% boost in CPU-intensive games, but ok.

On AMD it's not even a 10% boost, but that's because all their CPUs are already pushed to the limit and aren't as stable as Intel's, which sells to people who actually have jobs and need 100% reliability.

Tell that to the guy (not me ofc) that bought an 80 dollar chip with 4 cores at 3GHz and unlocked it to 6 cores at 4GHz, then bought two GPUs and did essentially the same thing while undervolting. Spending about 400 dollars to do the same as those spending thousands was extremely satisfying. Luck of the draw? Okay. Was it fun? Yes. Did I need to upgrade due to a different architecture eventually? I did. Did it make me a fanboi? Only in being able to get more for less.

I run budget builds. I don't need the best of the best to get a few extra frames in a game whose only draw is that everyone else is playing it and it taxes all resources. I will admit I don't have the money to buy top of the line every time. After growing up frankensteining my PCs, I've learned to make do with what I can afford to the best of my ability. It's a niche market, but we're still here.

Because I live in a country with high import taxes, I would have to spend 30 hours working at $10 an hour to jump to the next GPU tier (1070), while overclocking my 1060 takes at most an hour, and bricking a Pascal card from overclocking to play games once in a while is unheard of; at most it will shave a few months off its last leg of life, when it will probably be almost worthless anyway. Also, the perf gains are more like 30-35%, not 10%.
Of course if you make $100 per hour, live in the US, and have top-of-the-line hardware anyway, then it becomes a much less rational choice.
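The trade-off above boils down to hours of effort per percentage point of performance. A toy Python sketch, using the numbers quoted in this post (the 15% overclock figure is a conservative assumption, since other anons dispute the 30-35% claim):

```python
# Toy cost/benefit comparison: upgrading a GPU tier vs. overclocking.
# All numbers come from the post above or are labeled assumptions.

def hours_per_percent(hours_spent, perf_gain_pct):
    """Hours of effort per percentage point of performance gained."""
    return hours_spent / perf_gain_pct

# Upgrading 1060 -> 1070: ~30 hours of work at $10/h for roughly +30%.
upgrade = hours_per_percent(30, 30)

# Overclocking the 1060: ~1 hour of tweaking; assume a conservative +15%.
overclock = hours_per_percent(1, 15)

print(f"upgrade:   {upgrade:.3f} h per % gained")
print(f"overclock: {overclock:.3f} h per % gained")
```

Even with the pessimistic 15% figure, the overclock costs an order of magnitude less effort per percent gained, which is the whole argument of the post.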

>Tell that to the guy (not me ofc) that bought an 80 dollar chip with 4 cores at 3Ghz and unlocked it to 6 cores at 4Ghz, then bought two GPUs and did essentially the same thing while undervolting. Spending about 400 dollars to do the same as those spending thousands was extremely satisfying.
There is more to it than a difference in clock speeds between CPUs and GPUs, /v/irgin

Free performance.

Getting a 20% performance boost is pretty common.
Chips from the factory are rated to run with an ambient of 90F. Since only people who live in shithole countries with no AC experience such a retardedly high ambient temperature, you're gimping your performance for no reason.

Also, reducing noise levels alone is a good enough reason to get an aftermarket cooler, and that right there provides the thermal headroom required to cool an overclocked CPU.

Basically, if you don't overclock it's because you're fucking stupid.

Absolutely yes. Zen+'s XFR2 is paving the way toward killing manual OC. XFR was meh, but XFR2 handling a wider range of states instead of just two is pretty fucking good. It can only get better from here on out.

No shit. If you didn't get the "(not me ofc)" and understand it was a facetious quote from those that agree with overclocking in general (typically those that buy K chips and attempt to overclock but give up because they can't spend the time on it), while also realizing that I do indeed enjoy paying less for more, that time is money, and that at a certain point you're pissing into the wind but it's fun to do... oh god, I've gone cross-eyed.

Faggots who think you're overclocking in the sub-double-digit percentages, or who are incapable of understanding why those percentages are fun to achieve, do not get it. It is indeed like the emperor's new clothes, and when they "Crash N Burn" while hackin the Gibson because they don't have the wherewithal to do so, I don't snub them for their ineptitude. It's more of a personal achievement.

>overclocking a 1060 yields 1070 performance
>performance gains of 30-35% from overclocking a Pascal
Unless it's on LN2 and modified on a hardware level, pic related

Attached: images (7) (1).jpg (375x266, 14K)

I laugh at the retards who still think overclocking will shorten the lifespan of their hardware. My i5 3570k has been running at 4.6GHz on an abysmal 1.4v core for an average of 10 hours a day since its release back in 2013, cooled by a 600rpm 140mm fan on the cheapest air cooler I could get my hands on back then, a Cooler Master 212. Overclocking takes 1 hour tops to figure out the sweet spot. Free performance. If that's not for you, maybe you should stick to consoles for your gaymes

You've probably knocked off 15 years off its 30 year lifespan.

wrong
see
If you get an efficient cooler, your overclocked CPU can actually be running COOLER than at stock speeds on the stock cooler, thus INCREASING its lifespan

>2030
>people still say overclocking reduces the hardwares lifespan
>people still recommend prime 95
Meanwhile, my CPU is still alive 17 years later, and Prime95 passed but it crashed on the first actual game I started, so I had to reduce the 4.7GHz to 4.6GHz.

Whenever some faggot recommends these shitty artificial benchmarks I'm actually more infuriated than by the OC lifespan meme. It makes the NPC meme all the better.

XFR2 kinda makes the CPU clock like a modern GPU these days, doesn't it? Sounds pretty neat.

I never said it gets to the level of a 1070, but it's the next best thing for almost free. It's not like you could get the slight increase you get from overclocking by spending, say, 20 bucks.
And going all the way to the next model is not always worth it, otherwise everybody would use the best card available.
BTW, Pascal is voltage limited, not current limited, 90% of the time, so simple hardware mods like shorting shunt resistors do almost nothing for performance.

OC'ing a GTX 980 Ti from 1200MHz to 1600MHz
>10%

It is because high-end CPU SKUs at stock are already near their clockspeed and power/voltage ceiling. The lower-end stuff is locked down hard so there are no more "300A Celerons".

Overclocking only exists for bragging rights. It isn't worth the headaches and constant tweaking to keep it relatively stable.

Overclocking does reduce the lifespan of hardware. I have seen and heard of overclocked units dying well before their time. They weren't suicide runs either.

Motherboard is usually the culprit (extra stress on VRMs) and then memory (assuming it is overclocked as well). Overclocked CPUs typically start to require more and more volts to keep stable or reduce clockspeed.

Prime95 isn't even good at determining stability either. I have seen CPUs that get through Prime95 fine, but some pieces of code completely freeze/crap out on an overclocked CPU.

Yes, at this point it's basically a pissing contest.

Yes.

Not on my 2500k.

With my 4790k it wasn't a waste of time. The recommended voltage for chip longevity is a max of 1.3v for that chip. So I bumped the voltage to 1.275v and then started to bump the clock speeds up and test for stability until it became unstable and crashed.

4.8GHz was my max on air, and that was with all the power saving functions left enabled. So the chip, instead of turbo boosting to 4.4GHz, would "clock up" to 4.8GHz and then drop to almost nothing when not being stressed.

It took me less than 15 minutes to achieve and has been stable ever since. It greatly improved games in which single threaded performance was more important than how powerful your GPU was (such as ARMA). It was stupid how easily and how quickly it could be achieved.
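The procedure described (pick a safe voltage, raise the clock in steps, stress test, back off at instability) can be sketched as a loop. This is a toy model: the "chip" here is simulated with a made-up 4.8GHz stability ceiling standing in for real BIOS tweaking and stress testing:

```python
# Toy sketch of the "bump and test" overclocking loop described above.
# The stability ceiling is a made-up stand-in for real stress-test results.

SIMULATED_MAX_STABLE_GHZ = 4.8

def is_stable(clock_ghz):
    """Stand-in for a real stress test (e.g. an hour under full load)."""
    return clock_ghz <= SIMULATED_MAX_STABLE_GHZ

def find_max_stable(start_ghz, step_ghz=0.1):
    """Raise the clock one step at a time until the 'stress test' fails,
    then settle on the last stable step. Rounding avoids float drift."""
    clock = start_ghz
    while is_stable(round(clock + step_ghz, 1)):
        clock = round(clock + step_ghz, 1)
    return clock

print(find_max_stable(4.4))  # starting from the stock 4.4GHz turbo -> 4.8
```

The real version of this loop is exactly why the anon's 15-minute result is plausible: each iteration is just a BIOS change plus a quick stability check.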

I had a similar result with my AMD 8320E. The max boost clock for that was 4.0GHz. At 1.4v I was able to achieve a stable 4.6GHz before thermal throttling kicked in. Again, I benefited from the increase in single threaded performance. I actually got rid of that system and replaced it with the 4790k, because of Linux compatibility problems at the time. But it was another overclocking config that took me around 15 minutes to achieve max overclock and stability.

I've been overclocking since the very early 00s though so ¯\_(ツ)_/¯

It's more cost effective to buy a better CPU than to overclock a hunk of shit silicon.

>instead of turbo boosting to 4.4GHz would "clock up" to 4.8GHz and then drop to almost nothing when not being stressed.
Oh yeah. And that would turbo boost to 4.4GHz on only two cores. My overclock boosted to 4.8GHz across all four.

If you look at most "turbo boost" implementations, and whatever AMD's is called, not all cores get the boost. Only some of them. I had Opterons that boosted only some of the cores to a particular GHz, and then some more to a lower GHz, with the rest remaining at the stock boost.

Overclocking allows you to apply that boost clock across all cores, bypassing the limitations Intel or AMD have put into the chip.
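The difference can be modeled roughly. The clocks below are illustrative, loosely based on the 4790k figures in this thread (stock turbo hits the top bin on only two cores; the manual overclock lifts all four), and the throughput proxy naively assumes performance scales linearly with clock:

```python
# Rough model of per-core turbo vs. an all-core overclock.
# Clock tables are illustrative, not official boost specifications.

def total_throughput(per_core_ghz):
    """Naive throughput proxy: sum of per-core clocks (assumes linear scaling)."""
    return sum(per_core_ghz)

stock_turbo = [4.4, 4.4, 4.2, 4.2]   # top turbo bin on two cores only
all_core_oc = [4.8, 4.8, 4.8, 4.8]   # manual overclock applied to every core

print(round(total_throughput(stock_turbo), 1))   # 17.2
print(round(total_throughput(all_core_oc), 1))   # 19.2
```

Under these assumptions the all-core overclock wins on multithreaded load even though the single-core turbo gap (4.4 vs 4.8) looks modest.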

Clockspeed doesn't always scale linearly with performance. But that's impressive.

>it's more cost effective to buy buy a better cpu than to overclock a hunk of shit silicon.
Silicon is binned. Lower-priced chips are the more expensive chips that came off the line with broken parts inside.

Remember when Phenom II had the tri cores (Heka)? Those were quad cores (Deneb) with a dead core disabled. They could sell a tri core for more than they could a dual core with a perfectly functional core disabled just to make a dual core. Callisto was the dual core Phenom II, and those were Deneb quads with two dead cores. AMD also allowed the re-enabling of those problematic cores in case you thought you could try to get them to work again. I don't think many did.

The hex core Ryzens? Octa cores with two dead cores. And so it goes.

Intel does the exact same thing. They switch off functionality of damaged Xeons and turn them into i7s and i5s or whatever. They have very few unique wafer designs because it's not cost effective.

Silicon is silicon that's been engineered for a price point. It's like the automotive industry moving to global platforms (such as VW did with MQB and a couple others, across everything in the VAG range).

I always found it weird that Jow Forums doesn't discuss overclocking more. This thread has been eye-opening on how much Jow Forums doesn't know shit about overclocking.

This reminds me of the gumdrop circuits in casio calculators:

instructables.com/id/Upgrade-your-Casio-fx-82es-into-115es991es570es/

If you don't overclock, your Intel hardware will be even worse.
Yeah, I have a Phenom II X3 I unlocked into a 4. :)

Changing a setting in the BIOS takes 10 seconds and gets you 10% more performance for free...
why not?

mild overclocking is good, but when you start messing with the voltage, that's when it becomes a waste of time.

What the hell are you on about?

autism

Only soibois are scared of doing something wrong.

Not really. Intel and AMD have recommended voltage limits for chip longevity. Don't go over them, you'll be fine.

Mostly, the safe voltage relates to the process node of the chip. I still remember people killing 45nm chips by pumping voltages through them that were safe on 65nm chips. Q6600-to-Q8400 era.

The maximum for the Q6600 (65nm) was 1.5v.
The maximum for the Q8400 (45nm) was 1.362v.

I had both of the above and they were great chips. However, people were pumping 1.5v or more through Q8400s and then wondering why their chips didn't last very long.

Provided any voltage adjustments are kept below the maximums, your chip shouldn't see any more detrimental reduction in lifespan than if the voltage hadn't been touched. Intel used to carry recommended voltage numbers on their chips' ark pages, but removed them.

As the process node continued to shrink, the safe voltage has hovered around the same level.

The maximum recommended for the i7-960 (45nm) was 1.375v.
The maximum recommended for the i7-980 (32nm, the tock of the above) was 1.3v.
The maximum recommended for the I7-2600K (32nm) was 1.3v.

It was around the time of Sandy Bridge (2600k) that Intel stopped providing VID voltage range information with their chips on the ark, which was also when they implemented the K series of chips for overclocking, kind of leaving it up in the air for consumers to figure out for themselves what the maximum is. Consensus fell on 1.3v because of the 2600k and the physics surrounding decreasing die sizes and pumping more voltage through them. Without delidding, people still stick to an absolute max of 1.3-1.35v. If you delid and/or run water loops, you can afford to push more voltage because you have better heat dissipation.
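The figures quoted over the last few posts can be collected into a small lookup table. A hypothetical sanity-check helper; note these are the community numbers from this thread, not an official Intel source:

```python
# Maximum recommended vcore per chip, as quoted in the thread above.
# Community figures, not official Intel specifications.
MAX_SAFE_VCORE = {
    "Q6600":    1.5,    # 65nm
    "Q8400":    1.362,  # 45nm
    "i7-960":   1.375,  # 45nm
    "i7-980":   1.3,    # 32nm
    "i7-2600K": 1.3,    # 32nm; post-Sandy Bridge consensus is 1.3-1.35v
}

def vcore_is_safe(chip, vcore):
    """True if the proposed vcore stays at or below the quoted maximum."""
    return vcore <= MAX_SAFE_VCORE[chip]

# Pumping Q6600-era voltage through a 45nm Q8400, as described above:
print(vcore_is_safe("Q8400", 1.5))    # False: this is what killed chips
print(vcore_is_safe("Q8400", 1.3))    # True
```

The table makes the pattern in the posts visible at a glance: the ceiling drops with each process shrink, then settles around 1.3v from 32nm onward.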

Why not? If you find it boring, don't do it.

Anyone got any of the latest Intel chips that turbo boost to 5.0GHz? Do they boost across all cores? And what does the voltage spike to when the boost is active?

I'd be interested in seeing what Intel is doing with "non K" chips that are capable of those kinds of boosts.

Kinda. People who spend shit tons on better coolers and stuff because they overvolt so much for that extra 5% are pretty dumb. (Same idiots who buy a turbocharger and cat-back for a 4-cylinder Civic.)

>oy vey, don't overclock, goy, buy a new CPU
(((holzman)))

Attached: 1514655974388.png (611x756, 59K)

Overclocking is only a waste if you use some generic motherboard that can't handle overclocking. Most people buy a Z series chipset motherboard with good quality power delivery so it'd be a waste of money to NOT overclock. Also a lot of the unlocked CPUs are capable of running on any motherboard they fit in.

If overclocking is a waste of time, then why don't we see more 8700 CPUs placed in generic Chinese green-PCB motherboards?