ITT: great CPUs

Attached: L_00001441.jpg (655x587, 103K)

I finally replaced my Q6600 with a 9700K back in November.


Q6600 rig still worked, just wasn't as fast as I needed/wanted.

Currently sitting in a closet, might pull it out for a project someday.

>expected to see a 2500k
>it's an actual good processor instead
good thread.

That CPU will outlast any motherboard that you might find

It already outlasted 2 motherboards I ran it in.

Sadly, as the years went on, its peak OC started to fall; it used to be able to hit 3.7-3.8GHz, but by the end I had it down to 3.2GHz for stability.

Still a fair bit better than its stock 2.4GHz though.

YOU CAN'T HIDE, YOU CAN'T RUN

Attached: intel-core-i5-2500k-socket-1155-lga1155-quad-4-core-processor-cpu-electgen-1806-20-electgen@13.jpg (500x461, 36K)

And it can be OCed with tape, making it a good choice for mobos with no OC options.

Attached: poozen.jpg (700x460, 61K)

Have you considered that if your mobo doesn't have OC options maybe it's not well suited for OCing?

OC'ing is literally nothing more than running the chip at a higher clock, sometimes with a bit more voltage, and not even always that. All boards and CPUs should support overclocking. Non-overclockable processors are a scam.

I have a Q9550. I'm thinking of getting the "s" version, which draws even fewer watts. I'll give it 12 GB of DDR3 and a GTX 1050 Ti 4GB, then let it age like fine wine for the next 10 years.

A 3GHz Q6600 is nothing special; the voltage doesn't even need to be raised for that.

that sounds like a waste of money

Attached: Malay.jpg (579x635, 34K)

I wouldn't bother unless the s version is literally free.

There is simply no good argument to be made for throwing money at such an old platform unless you just have a fetish for old tech.

Was the decade of stagnation really that great for you guys?

FX processors weren't generally well regarded but they do hold the distinction of dominating the charts of world record overclocks, with an FX-8350 in first place at 8.794 GHz.

Attached: AMD-FX8350-FX-8350-Eight-Core-CPU-Processor-4-0G-8M-125W-Socket-AM3-CPU-FX.jpg_640x640.jpg (640x640, 76K)

Had this bad boy in my first PC in 2005. It was great to take a break from the slow-ass P4s at school.

Attached: 19-103-483-04.jpg (640x480, 24K)

nah, been suffering for multiple years now, but I honestly didn't want to "upgrade" to a fucking quad-core

Fuck you

Based

Attached: 2500k4.2.png (460x435, 21K)

is there any task or benchmark that actually benefits from just an insane frequency?

Also, the NetBurst (i.e. Pentium 4) based single-core Celeron D units, particularly the 352, 356 and 360, are easily the most overclockable Intel processors of all time, topping out at 8.532 GHz for the leading example. No other Intel architecture has even come close to 8 GHz so far.

Attached: intel_celeron_D_cpu_360_processor_3_45ghz_512kb_533mhz_socket_775.jpg (835x800, 484K)

In pure clockspeed, sure.

But that doesn't really mean much when a lower-clocked 9900K, for example, can still beat it in performance testing.

Attached: 2019-06-13 09_42_02.png (1259x102, 12K)

Perhaps, but at those clocks the processors would throw in the towel at the very first sign of an actual workload anyway. A record is considered validated once the person manages to open CPU-Z, take a screenshot and save a validation file; that's generally all the system is even capable of without crashing, and even that is while cooled to around -120°C with liquid nitrogen.

ITT: poorfag cope

Attached: s-l300.png (300x300, 162K)

It ALMOST seems like Intel made the swap easy on purpose.

good goy, always buy the latest and greatest for $$$.

>Q6600
I was going to get a Q6600 a few years back, but I ended up getting a Q6700 off eBay instead. It sold for only $5 more than a Q6600, and every Q6700 had guaranteed G0 stepping. Plus, it ran at 2.66GHz with its 10x multiplier out of the box!

Got it up to 3.51GHz before realizing I didn't want more than double the power consumption for a 30%-ish clock increase.
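
Rough numbers on why that tradeoff looks like that (purely illustrative; the voltage values below are hypothetical, since the post doesn't give them): dynamic CPU power scales roughly with frequency times voltage squared, and a 2.66 to 3.51GHz overclock usually needs a voltage bump on top of the clock increase.

P_{dyn} \propto C\,V^{2}f

\frac{3.51}{2.66}\times\left(\frac{1.45}{1.25}\right)^{2}\approx 1.32\times 1.35\approx 1.78

So a ~32% clock bump with a typical-looking voltage increase already lands near 1.8x the dynamic power, and the extra leakage at the higher voltage pushes it toward double.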

Still, a damn good CPU. Being a poorfag, I got mine in 2011, and I kind of envy the people who had the foresight and cash to plunk down for a Core 2 Quad when they were new.

Also, the K8-based Semprons were an incredible value back in the day. At the time, Intel was market-dominant even though their performance sucked, so software was written with those slower chips in mind and *literally any* program ran just fine on it. A great value; I used one I picked up for less than a hundred bucks in my first gaming PC.

I don't know if it was a "great" CPU even for its time, but there's a special place in my heart for the PowerPC G5 that IBM and Apple put together. I was kinda sad when Apple threw in the towel for PPC and switched all their gear to Intel a few years after it debuted, even though it was clearly the right decision since they couldn't even manage to get a G5 chip in any of their laptops.

Anyone with more tech knowledge care to weigh in on the PPC G5's merits and drawbacks?

coping with?

>and every Q6700 had guaranteed G0 stepping
You could've read the stepping off the CPU by looking at the sSPEC number on the photo anyway. B3 Q6600s have SL9UM printed on them, G0 Q6600s have SLACR printed on them.

The world record for SuperPi 32M on an FX-8370 is 10 minutes 27 seconds at 7.4GHz.

For a 9900K it's 4 minutes 3 seconds, also at 7.4GHz.

Just because the 8370 can potentially OC higher doesn't mean shit compared to modern CPUs.
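
Just converting the times quoted above into a ratio (nothing here beyond the numbers already given):

10\,\text{min}\,27\,\text{s}=627\,\text{s},\qquad 4\,\text{min}\,3\,\text{s}=243\,\text{s},\qquad \frac{627}{243}\approx 2.6

So at the exact same 7.4GHz, the 9900K finishes the single-threaded SuperPi run about 2.6 times faster, i.e. roughly 2.6x the work per clock.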

>25 dollars (intel and motorola were asking 200 dollars for their chips)
>As fast as the intel chips
>Accidentally created the personal computer

Attached: 6502.png (584x263, 156K)

The photo in the eBay listing was just a generic stock low-resolution photo of the box, probably taken off google image search.

I knew about what you're saying at the time, but all the sellers were big-time merchants with dozens of used CPUs in stock per listing. Didn't seem worth it to even ask, like they'd fucking know or care.

smoothest dual-core I've ever used

Attached: 1534144890183.png (469x480, 499K)

Just bought 3 SLACR Q6600s for 10€ apiece for some cheap builds.

Weren't the AMD FX cpus really good for video encoding? And I'm not just talking about the core count (that certainly helped).

I thought it was something about the high clocks + lack of branches in the code working out nicely for performance.

For their era, sure.

It was mainly the thread count though.

x264 encoding scaled decently up to 8 threads; beyond that it was shit.

H.265 encoding scales well up to 64 threads, thankfully.
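
A rough way to see why that kind of scaling wall shows up is Amdahl's law; the parallel fractions below are made-up illustrative values, not measured x264/x265 numbers:

S(n)=\frac{1}{(1-p)+p/n}

p=0.90:\quad S(8)\approx 4.7,\quad S(64)\approx 8.8
p=0.99:\quad S(8)\approx 7.5,\quad S(64)\approx 39

If only ~90% of the encode parallelizes, there's little to gain past 8 threads, while something close to 99% parallel keeps scaling out to 64, which matches the behaviour described above.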

Intel Pentium 4, autistic shits

fucking undervolt your present CPU and call it a day. jesus

>netburst
>great

christ
all
mighty

If it wasn't for Intel's Israeli division developing Core 2, the company would have gone under due to NetBurst.

Yeah, I mean, if you're running Folding@Home on a cluster of P4s you got for free in order to act as a space heater.

>bentium 4
>good
it's one of the worst intlel cpus of all time (it's only been surpassed recently by garbage like the 9900kys)

Attached: spurdo_ok_joy.jpg (1080x731, 77K)

My FX-8320E was great for video editing, even without GPU acceleration. It was smoother than the i7 3770 I had at work.

My modded X5460 still rocks together with 8GB of RAM

Attached: Screen_Shot_20190508_at_10.37.07.jpg (1648x930, 167K)

Back then, when a company wanted to show how fast their CPU was, they used the Pentium 4 as the punching bag, because it was quite an easy CPU to surpass.

It can't have been that much smoother.

Even OC'd to 4.8GHz, the 8320E barely keeps up with the 3770K in encoding performance.

On this graph, blue is the 8320 @ 4.8GHz and red is a 3770K at stock.


source
anandtech.com/bench/product/1403?vs=551

Attached: 2019-06-13 10_12_57.png (681x121, 16K)

Honestly the M0 2.4C was pretty good, an OC beast (easy 3.5GHz) but shit at stock.
Until K8 came along, anyway...

Video encoding is a pretty brutal test for the FX CPUs: they have shared FPUs and are pretty weak at floating point overall, and video encoding is basically an endless flood of fmuls.

That was the best-case scenario too; at stock clocks the 8320E falls FAR behind.

Attached: 2019-06-13 10_24_41.png (662x119, 16K)

>2500k

let's be honest, this was the real last great intel processor

Attached: 3570k.jpg (1595x1599, 678K)

In Vegas, preview playback of 1080p50 from my Panasonic cam (which uses 4 ref frames and is brutal on the CPU) was flawless on the FX at stock, even without GPU acceleration; in comparison, the i7 shit itself with or without its integrated graphics. Only when I got an R7 240 did playback get smoother on it.

>not the 2600K
Weak.

It wasn't great though; it was just a favorite of OEMs, which is why you have such nostalgic childhood memories of it.
Phenom literally obliterated it.

Attached: 5e5.png (501x506, 208K)

If all you're doing is editing without doing any of the heavy lifting encoding that would normally follow the editing process, I guess that's fine.

But you'd have to be retarded to give up the superior encoding performance for smoother timeline previews if that's the only machine you're working with.

Attached: Running_way_too_hot.mov.webm (690x1008, 2.81M)

I wish I had the extra threads today, but I wasn't willing to spend the extra money eight years ago.

I never had it, but I've been using them for budget builds. I used Phenom, and it spanked the Quad, but it was not as prevalent.

what do you lot think about the 5820k?

Encoding was fine from my experience, it didn't take much longer, and not having a miserable editing experience was worth it.

>it didn't take much longer
that's objectively not true, and can easily be disproved.

At stock (as you claim you were running), an 8320E would easily be 25-45% slower than a 3770K.

And the 3770K was a pretty decent OC'er as well; you could easily get another 15-25% out of that.


I guess if you're only encoding small things, or very rarely, it'd be fine. But if that's your job, and you need to wait for the encode to finish to make $$, then it's obviously not the CPU you'd want to be using.
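
Putting hypothetical midpoint figures on that (taking 35% and 20% as the middles of the ranges quoted above, and reading "35% slower" as the FX delivering ~65% of the Intel chip's throughput):

0.65\ \text{(8320E at stock)}\quad\text{vs}\quad 1.0\times 1.20=1.20\ \text{(3770K with a 20% OC)},\qquad \frac{1.20}{0.65}\approx 1.85

So under those assumptions the overclocked 3770K would chew through the same encode roughly 1.8-1.9x faster, i.e. the FX box sits on each job nearly twice as long.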

*cracks* yup, those were the days

Attached: 1555870935389.jpg (383x353, 106K)

>finish editing big project
>hit render/encode
>go home for the night
>send out finished work first thing next morning
Yeah, doesn't sound like a big problem to me.

That's what I'd call casual use.

If I'm leaving it going overnight, I'm batching several encodes and running them over the course of the night.

P4 was amazing. You kids/3rd worlders are confusing the celeron rejects that eventually became P4 when intel decided to not have a poverty line anymore.

The i7 was a non-K 3770. I don't know if Vegas works better with FX, but that was my experience.
A Pentium 4 with Hyper-Threading is more usable now than an Athlon 64 because of that extra thread.

The one that got me into computers

Attached: CPU_0805.jpg (850x1100, 314K)

FX is the best for editing rigs.


t. Vegas user

Attached: Untitled.png (1683x1021, 165K)

idk bro. Might make more sense to do the LGA 771 to 775 Xeon mod and get an 80W E5450.

Here's a snip from it.

Attached: watching the chips fall.png (509x397, 63K)

>The only thing that was absolutely clear then was that we were a very long way from the computing power necessary to accurately simulate the real physics of light, to say nothing of doing it in real time. The most popular technique at the time was ray tracing, which closely approximated a real simulation of light physics and could produce stunningly realistic results. Although very powerful, ray tracing was extremely computation-intensive, requiring many hours or even days to produce a compelling result.
>Years later, the expertise in graphics that resulted from those early academic years of discovery was distilled down to a bag of clever tricks that could produce realistic-looking 3D results while skipping the heavy computing task of actual light physics simulation. These “tricks” were more computation-friendly, and although they did not produce the same “realistic” results that better light simulation achieved, they were fast and inexpensive to produce.

Attached: raytracing07.jpg (850x1100, 379K)

I've had 3 of them
Athlon 64 3000+
Intel Q6600
Intel i5-2500K
Currently sitting on a really shitty 3570K and will probably be getting some 3xxx ryzen this Fall.

P4s in general (again excluding the remodel poverty Celeron ones) were great for games. AMD had a lot of problems when OC'ed. The only real reason to hate the P4 is it wasn't as good as a PII or PIII.

The 2500K aged like ass. I have a 3570K and it is barely fucking chugging along.
Now the 2600K, THAT was a good CPU. i7s actually aged well.

A 4C8T i7 and the FX-8350 with its 4 modules and two threads each are the bare minimum these days

Attached: Bulldozer%252032nm_thumb%255B1%255D.jpg (504x446, 80K)

Anything worthwhile is playable on it. I'll replace it with a 9590 before I use Windows 10.

Attached: 8350.jpg (500x500, 63K)

My Xeon E5450 can run Sekiro
Waiting for DDR5 and Cyberpunk 2077 to buy a new PC.

You're a retard. The 2500k was always about bang for buck. 2600k was like 350 CAD more at the time

Attached: 20190220_100152.jpg (2048x1152, 221K)

Mine runs at 47°C @ 4134MHz, what's your cooler?

Thicc ass chip

>intel
>great CPUs
nah this aint it chief

nice I had a Q6600
maybe I should have picked one without hyperthreading and put the extra money into a better GPU, but overall I think I benefited from being able to multitask better.

kys

Had a 2500K, but the CPU and/or board died. I'm now forced to use an old Core 2 Duo 'til I can get a new comp. Bitch lasted 8 years; I mean, that's fucking impressive compared to other chips I've had as far as usability goes.

>the celeron rejects that eventually became P4 when intel decided to not have a poverty line anymore
u wot m8

The last great CPU before NETBURST, and the design Intel went back to when NETBURST failed.

Attached: s-l300.jpg (300x300, 11K)

>The 2500k was always about bang for buck.
This. I bought my 2500K for 169€ eight years ago expecting to use it for four years, and I'm still using it. The value is SHOCKING.

Based fellow Vegas user.

P based, I remember the late P3 actually blowing the early P4 out of the water at equal clocks. The only thing the P4 had over it eventually was brute-forced clockspeed, and new instructions. Had a 1000EB myself; for some reason those came without a lid, just the bare die contacting the heatsink.

I would say this is the best chip Intel has made so far.
If you own one of these and are still running it, you will know what I am talking about.
If you run one of these, all the later developments are just so much ball fluffing for people who haven't a clue what they are talking about.
I don't know anyone who has ever owned this chip that has a bad word to say about it.

Attached: Intel-Core-i7-2600K-8M-3-4G-95W-Quad-Core-Processor-5GT-s-SR00C-LGA-1155.jpg_640x640.jpg (640x614, 125K)

Tualatin vs Willamette was embarrassing for Intel.

Attached: p4vsp32.png (510x319, 31K)

I bought it in 2011 and now I'm running VR with it. Best 189 I ever spent.

I ran a pair of these in slockets on an HP Visualize X-Class. Great times.

Dual P3 1.4S
2GB RAM
Ti4600
dual Voodoo 2s, because Glide
And a pair of 146GB 15K SCSI drives for good measure.

The chipset makes a difference. The 810 bottlenecks where the 820/840 does not.

Then there's the ability to run SMP...

P4 could handle games of the day fine... because Intel had such great market share it made no sense for developers to target the superior performance of an Athlon XP and later Athlon 64.

>The only real reason to hate the P4 is it wasn't as good as a PII or PIII.
Not being as good as your own company's previous generation of product is all the reason you need to dislike a series of CPUs, especially at the prices Intel was charging for the P4.

Really, that's the whole reason the P4 was "bad". Relative to AMD's offerings at the time, and Intel's previous offerings, it was inferior.

That's not a great CPU.
THIS is a great CPU.

Attached: KL_AMD_5x86.jpg (1194x1278, 291K)

Quake killed it.
Quake pretty much killed all the FPUlets.

>Dual P3 1.4S
Neat, until what year did you run them?