What makes a CPU become obsolete?

I've always wondered what makes a CPU become obsolete. I mean, what makes you want to change your CPU for a new one? Is it for more cores? For a higher clock rate? And how do you make your CPU "future-proof"? For example, if I want to buy a new CPU and have to choose between an i5 8400 and a Ryzen 5 1600, would I choose the R5 1600 because it has more threads (12 threads)?

Attached: cpu.jpg (700x394, 63K)

When sleep mode and associated shit stops working for that chipset in Windows 10, or when GPUs start to bottleneck.

When you need the better performance you dumb nigga

>not supported by latest Windows
>no meltdown/spectre/newvariant mitigations

Nothing. 2 cores is all you need

>tfw
it hurts

Your questions are all over the place.
You have no idea what you want and you have no idea what you're asking about.

This
Based and redpilled
/Thread

/thread

What's funny is that was the straw that got me to go from an X5650 to Threadripper, and now since 1809 my Threadripper system can't let the monitors go to sleep or it BSODs (but sleep mode works)

yeah I hate it when people ask like 5 questions in one post.

Funny enough, my 2600X/SHITUS mobo causes Windows to occasionally shut down the PC instead of rebooting. Or it wakes up from sleep with a "powerbutton" wakeup source after a minute or two.

Linux? No problemo. Does not give a fuck.

Also this shit: extremetech.com/computing/283114-new-utility-can-double-amd-threadripper-2990wx-performance

Normie OS my ass, can't even do the basics right.

When your motherboard dies. There aren't any good 1150 replacement options that aren't overpriced. I'd rather buy a new chip for a 50% performance boost than search for old tech.

Currently a non-obsolete CPU needs 4 cores and must be unaffected by Spectre and Meltdown. Basically Ryzen CPUs are the closest to this, so anything non-Ryzen is obsolete.

But no laptop uses Ryzen.
Besides, aren't Spectre/Meltdown super speculative and not easy to pull off?

spbp

the joke here is that for years, Intel's new CPUs did not offer better performance than the old ones

if you had, for example, an i5 2500K, you could have used it right up to this day

Attached: intel is finished.jpg (992x1043, 199K)

There are a bunch of Ryzen laptops, what are you talking about?
>aren't spectre/meltdown super speculative and not easy to pull off?
You have all the time in the world to pull it off. And no, it's not difficult at all.

>tfw still using a Core 2 Duo CPU. It's still comfy for shitposting at least.

>and must be unaffected by spectre and meltdown
So none? The first microarchitectures with hardware level mitigation will be Ice Lake and Zen 2. Everything up to this point has only had performance-crippling software band-aids applied.

Same here, not right now, but I have a Core 2 Duo machine that I still use on a regular basis. Nostalgia/shitpost machine.

>none
Which is exactly what that post states, but AMD is affected by significantly fewer bugs and lost almost no performance after the patches, whereas Intel's performance dropped by 50% in some scenarios.

In my case I'm poor. And mommy doesn't want to buy me a new PC.

When it draws too much power compared to new cpus with similar performance. Power efficiency is key when the system keeps running day and night.

>if you had, for example, an i5 2500K, you could have used it right up to this day
until Meltdown and Spectre

is the Pentium 4 affected by these meltdowns?
the fastest P4 dual-core chip (it is rare) could still be useful as a mediocre web browsing machine and is surely fine for everything else

web browsing is where the heaviest computational load on the CPU lies these days

Oof, I feel ya buddy. I was using a Pentium 4 630 up until about October 2015, when I started saving up my own money for a new PC. Hang in there lad

What pisses me off is the different sockets used for CPUs.
I have an 1150 socket, then suddenly 1151 came out, wtf is this shit?!

Meltdown affects all Intel CPUs since around the mid 90s, aside from Itanium and pre-2013 Atom (i.e. a bunch of enterprise servers and old netbooks).

Spectre affects all Intel desktop CPUs since the Core-i designation (therefore you'd still be safe even with a Core 2 Quad), all Xeons since Woodcrest in 2006 and seemingly all AMDs since the Athlon 64 in 2003.

Therefore the only way to be safe from both at a hardware level right now is to run an ooold AMD Athlon (non 64), an outdated Intel Atom netbook or an expensive Itanium server. Easier to simply wait for the new immune generation of CPUs or deal with the performance impact of software mitigation on a more modern CPU.
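
If you're on Linux and want to see exactly which of these your own chip has and whether the kernel is patching around them, recent kernels (4.15 and later) expose it under sysfs. A rough sketch in Python that just reads those files (nothing here is specific to any one distro):
import os
VULN_DIR = "/sys/devices/system/cpu/vulnerabilities"  # present on Linux 4.15+
if os.path.isdir(VULN_DIR):
    for name in sorted(os.listdir(VULN_DIR)):
        with open(os.path.join(VULN_DIR, name)) as f:
            # prints e.g. "meltdown: Mitigation: PTI" or "spectre_v2: Not affected"
            print(f"{name}: {f.read().strip()}")
else:
    print("kernel too old to report vulnerability status")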

I remember seeing a scan of a magazine page from like 1950 where they predicted that within a decade or two nuclear power would make electricity so cheap and abundant that it wouldn't make sense to bother metering it, you'd just pay some low flat rate and use whatever you needed.

fuck flying cars and jetpacks and shit, that's the world I want.

Nothing makes a chip obsolete
The chip operates within a given set of parameters
It is software that becomes obsolete.
A 286 CPU will still happily run everything you need it to run, provided you remain within the bounds of reason. For example, you don't need an i7 to control a model railway signalling system. You don't even need an i7 to control a satellite or spacecraft; you can still use a 286 or 386 for that.

Attached: 1515586842297.png (682x792, 339K)

Inability to handle a new instruction set or industry standard

Thanks random user.

If you're going to use something this weak, then use an ARM CPU, a Raspberry Pi 3B+, or a PC with the same CPU, since those were pretty much unaffected by Spectre and Meltdown.

Is it really?

Is an i3 6100 any better than an i5 4460?

sure, if you're a casual who doesn't actually do any processing.

>been using an X5650 for 7 years now
>6 cores, 12 threads, overclocked to 4.5GHz on air
I can see myself not needing to upgrade for at least 5 more years

>Nothing makes a chip obsolete, It is software that becomes obsolete.

Now that I think about it, what you said is right and makes sense; it's so obvious that I actually feel ashamed for asking this stupid question. I mean, I have an old IBM Aptiva Pentium 133 that works perfectly fine, and the only reason I don't use it is that the software that would run on that system is outdated... so it's not the CPU that became obsolete, it's the software that's outdated, and the new software requires a more powerful CPU. For example, if you look at some clocks you'll find that most of them use the NE555 timer IC, which was made in 1972; it didn't become obsolete because what it's used for and the "program" in it never changed (I hope I understood correctly what you were saying, I'm kind of retarded).

So... is the best way to future-proof a CPU that will run new software to choose one with more than 4 cores (because most software now only uses 4 cores), that can be overclocked, whose motherboard you can find easily, and that isn't Intel?

This. RIP Phenom II users

What the fuck is an instruction set?

Future software becomes less optimized and requires more processing power for the same task.

lurk moar

UMA DELÍCIA

Attached: 1502055755975.png (672x794, 343K)

>>>/reddit/

These issues are becoming more and more common on windows.

I swear I had fewer issues back when everyone used to complain about BSODs; my systems used to work fine.

Now I have daily new bugs with windows.

Why? You could buy a $50 Ryzen CPU that outperforms it, runs cooler and is way more efficient.

The absolute state of Jow Forums

Mainly power consumption and ST/MT performance. Clocks don't directly correlate with performance due to IPC/FPU variances across different architectures, even within the same manufacturer and release year. Another reason is that newer motherboards support new bells and whistles like booting and running directly from an NVMe M.2 SSD and higher-performance RAM.

Attached: aoeeqbw_460swp.jpg (10000x10000, 1.95M)
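
To put rough numbers on why clock speed alone tells you nothing: a crude single-thread model is performance ≈ IPC × clock, so a newer core with better IPC can beat an older one running at a higher clock. The IPC figures below are invented purely for illustration, not measurements of any real chip:
def relative_perf(ipc, clock_ghz):
    # crude model: work per second ~ instructions per cycle * cycles per second
    return ipc * clock_ghz
old = relative_perf(ipc=1.0, clock_ghz=4.0)  # hypothetical older core
new = relative_perf(ipc=1.5, clock_ghz=3.6)  # hypothetical newer core, lower clock
print(f"newer core is ~{(new / old - 1) * 100:.0f}% faster despite the lower clock")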

>what makes a cpu become obsolete?
bloated software

It depends solely on your use case. Sometimes it's about performance, sometimes it's other things.
At one point I replaced a Phenom II with an FX 6300 because the former's outdated instruction set was causing problems with virtual machines running newer Android versions, as they expected stuff like SSE4.1/4.2 and other extensions that Phenoms didn't have. It wasn't about performance at all; possibly there was even a raw performance loss.
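
For anyone who wants to check their own chip before hitting that wall: on Linux the supported extensions show up as flags in /proc/cpuinfo. A minimal sketch (the sse4_1/sse4_2/avx names are how the kernel spells them; which ones to check is just an example):
def cpu_flags():
    # parse the "flags" line of /proc/cpuinfo (Linux only)
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()
flags = cpu_flags()
for ext in ("sse4_1", "sse4_2", "avx", "avx2"):  # example set of extensions
    print(ext, "supported" if ext in flags else "MISSING")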

1809 is a disaster in general though, they had to pull it for what, like 2 months? Right after release, that's how broken it was. That's what you get when you just decide that you don't want to pay for QA of your product.

Bytecode programs, interpreted programs, managed code.

If you're just web browsing, programming, and doing spreadsheets, you can use a 10-year-old processor just fine. Any lack of support for these older chipsets on Winshit is artificial. All my hardware works flawlessly under Linux. You can even add "nopti" to your kernel boot options in GRUB to shut off kernel page table isolation. It makes Intel machines much, much faster at the cost of less security.

Attached: snapshot2.png (679x436, 45K)
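
If you do go the nopti route, it's worth verifying it actually took effect: the running kernel's command line is in /proc/cmdline, and the meltdown entry under the vulnerabilities sysfs dir reports whether PTI is still active. A quick sanity check, assuming the standard Linux paths:
with open("/proc/cmdline") as f:
    print("nopti passed:", "nopti" in f.read().split())
try:
    with open("/sys/devices/system/cpu/vulnerabilities/meltdown") as f:
        # "Mitigation: PTI" means page table isolation is still on
        print("meltdown status:", f.read().strip())
except FileNotFoundError:
    print("kernel too old to report meltdown status")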

>programming
That's wrong though. I mean, you technically can, but programming for Android without a functioning device emulator, for example, is a pain in the ass.

When my games start reaching 90% CPU usage and above, just by themselves, without any multitasking.

That is why I am finally going to put down my 3570K. He did well for 6 years but is really struggling in modern titles that need more threads.

Except I don't do programming for Android. That shit is for Pajeets. I write portable C code and I use Motif for anything GUI related. My programs will compile and run on any *nix platform with Motif 2.3 or later. Unix, C, and Motif are superior in nearly every way and have yet to be replaced.

Attached: 2005-11-11--Dilbert_Unix.jpg (600x197, 58K)

Easy list is as follows:

>Need/ want more performance.
>Chipset associated with the CPU does not support your upgrades or give the upgrades their full performance (more of an issue for Intel, where many of their CPUs are locked to a certain generation of chipsets; less of an issue for AMD, where a BIOS update is all that is needed if the CPU and motherboard share the same socket).
>CPU vulnerabilities as covered in recent cases (depends on whether you need your PC to be secure against an obscure group of hackers that may never consider you a target if you are a normal civilian; these affect Intel worse than AMD, but almost all CPUs including ARM are affected to some degree).
>Software and potential game support, not necessarily for performance but for supported features or functions that only newer CPUs have (this is very rare and unlikely to affect most people).

Those seem to be the majority of the ones that would be most likely.

Still sitting on a 3570K
This year should be good

install gentoo

i still use core2 duo u sissy little faget

how do I know if an Atom is safe? Is there anything more specific than pre-2013?

>CPU needs 4 cores and must be unaffected by Spectre and Meltdown

What are Spectre and Meltdown?

Attached: 1332980667051.png (148x214, 10K)