Was FX the worst CPU architecture of the 21st century?

Attached: amd-fx-zambezi-1.jpg (550x297, 53K)

Bigger bibeline, user

kaby lake was

No. There are probably several thousand you've never heard of because they weren't good enough to even tape out, several hundred more that were so shit nobody even tried to market them, and hundreds more that were such failures nobody ever talked about them.
The only ones you know of are the moderately commercially successful personal-computing ones. So we can go on about the gimped NetBurst Celerons, Atoms, whether Itanium was a failure of the architecture or the ecosystem, dig into VIA/Cyrix, PowerPC, and Faildozer, but none of them would be "the worst."

AMD was right though, moar coars is the future.

Attached: 1457294956960.jpg (801x1500, 177K)

netburst was worse

Still not as hot as a non-delidded *Lake.

No. It is obviously all of Intel's family.

Attached: 1525533793432.jpg (640x360, 25K)

Toothpaste is needed to keep the core warm and cozy.

That's just Jew Jizz™ inside

An 8-core Phenom II would have rocked.

Pretty good arch if you had multi-core-biased loads, aka real computers, not glorified Xboxes. Ironically it also powers the Xbone and POS4.

Attached: Ovens-in-the-Auschwitz-crematorium.jpg (600x450, 92K)

Underage newnigger b&

ITT: constipated amdpajeets

i would have put intel-wojack inside™ (kek)

LOL no. It was on the same level as Sandy for twice the power.
I encode all day and I literally would not take an FX system for myself if it was offered for free. (I'd take it to sell for $100 to some sucker, but I wouldn't use it myself.)

found the butthurt israelinigger

That's just a white guy who has his colons flooded with cheese.

>encode

Enjoy ur meltdown.

Their only mistake was assuming Moore's law was already dead and buried, but desu with multithreaded applications FX shines long past its heyday.
t. OCed FX-8150 fag
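
For illustration, the kind of load where an FX still shines: embarrassingly parallel integer work across all 8 threads, with no shared state. A throwaway C sketch (the thread count and work size are invented for the example, not tuned):

    /* Sketch: spread independent integer churn over 8 threads, the happy
     * case for an 8-thread FX. Compile: gcc -O2 -pthread par.c */
    #include <pthread.h>
    #include <stdio.h>
    #include <stdint.h>

    #define NTHREADS 8                      /* one per FX "core" */
    #define WORK     (100UL * 1000 * 1000)  /* arbitrary work size */

    static void *worker(void *arg)
    {
        uint64_t sum = 0;
        /* pure integer work, no shared state, so it scales near-linearly */
        for (uint64_t i = 1; i <= WORK; i++)
            sum += i ^ (i >> 3);
        *(uint64_t *)arg = sum;
        return NULL;
    }

    int main(void)
    {
        pthread_t t[NTHREADS];
        uint64_t out[NTHREADS];

        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, worker, &out[i]);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);

        for (int i = 0; i < NTHREADS; i++)
            printf("thread %d: %llu\n", i, (unsigned long long)out[i]);
        return 0;
    }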

>literal meltdown
That's why he avoids FX cpus, my fellow Jow Forumsamd user.

Encoding is one of the least affected cases.
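
Rough illustration of why: the Meltdown/KPTI fix taxes every kernel entry, and an encoder spends nearly all its time crunching in userspace. A hedged C sketch (loop counts arbitrary) timing a syscall-heavy loop against a compute-only loop; on a patched kernel only the first one eats the overhead:

    /* Sketch: KPTI costs you per kernel entry, not per arithmetic op.
     * Compile: gcc -O2 kpti.c   (Linux; raw syscall() defeats any
     * userspace caching of getpid) */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>
    #include <unistd.h>
    #include <sys/syscall.h>

    static double now(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    int main(void)
    {
        const long N = 1000000;
        volatile uint64_t acc = 0;
        double t0, t1;

        t0 = now();
        for (long i = 0; i < N; i++)
            syscall(SYS_getpid);          /* one kernel entry per iteration */
        t1 = now();
        printf("syscall loop: %.0f ns/iter\n", (t1 - t0) / N * 1e9);

        t0 = now();
        for (long i = 0; i < N; i++)
            acc += (uint64_t)i * i;       /* pure userspace, KPTI-immune */
        t1 = now();
        printf("compute loop: %.0f ns/iter\n", (t1 - t0) / N * 1e9);
        return 0;
    }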

You couldn't pay me to run cucktel trash since the P4 disaster.

No, it's not even that bad.
For x86 alone, that'd go to VIA/Cyrix/Centaur.

>Core2, Nehalem, Lynnfield, Sandy, Ivy, Haswell, Skylake, (i forget the names of the newer shit because I'm not in the market)
wat

That's like saying you wouldn't buy a Celeron 300A because the 300 was ass.

>missed out on the Conroe/Bloomfield/Sandy Bridge/Haswell eras
Intel had some pretty good shit going for a while there, now AMD has caught back up.

t. Pentium 166MMX, Pentium 233MMX, Athlon 500 ES, Duron 850, Duron 1200, AthlonXP 2000+, AthlonXP 2800+, Athlon64 3500, Core2 E6600, Core2 E8400, Core2 Q9550, Core i7 4820K, Ryzen 1700 owner.
Those are all my personal machines; that doesn't include family machines from before I got my own PC. For reference, the average PC was a P3 800 by the time I had a personal P166MMX.

Family PC had a Celeron 333 on a PCChips 440BX motherboard (can't remember the model number; it had AT power plugs, an AT keyboard plug, AT mounting holes, onboard SiS 6326 AGP 8MB with no AGP slot, Slot 1).
Overclocked to 450MHz with the B21 trick.
That was before I had my own machine.

That SiS 6326 was hell: no OpenGL at all and slow-as-arse Direct3D. Still, it played Half-Life and SW Episode 1 at an acceptable framerate (but not Max Payne).

I had a 16MB TNT. Smooth sailing.

>MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN, MELTDOWN,

>Was FX the worst CPU architecture of the 21st century?
Well, it is the last non-botnet CPU that can still run with a regular BIOS. OC'ed to 5GHz it's comparable to a Ryzen 5 1500X, which is still fucking decent. The multi-core performance especially is nothing to laugh at.

Didn't have an nVidia card until my own personal MX400 64MiB, which replaced my Voodoo3 3000 because I couldn't play GTAIII.
I was a full on 3Dfx fanboy back in the day.

The only plebs who would talk shit about FX are brainless shitlords who drink the intlel kool-aid.

[email protected] would probably still be better.
Dual Yorkfield Xeons would beat it at most everything and still be botnet free depending on the motherboard chipset.

FX and the associated APUs had terribad FPU performance, since each two-core module shared a single FPU.
But they had excellent integer math, even better than Intel's for the same generations.

Too bad most everything needs reasonable FP performance nowadays.

Which, to be frank, was Intel's problem with the P4.
Excellent integer math, but terrible FP compared to both the Athlon XP and the Athlon64.

AMD and Intel somehow managed to reverse positions entirely.
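
You can eyeball that int/FP split yourself with a crude single-core microbenchmark. A sketch, assuming nothing about the chip (iteration count arbitrary; each loop is a dependent chain so the compiler can't just delete it):

    /* Crude sketch: compare integer vs floating-point throughput on one core.
     * Dependent chains keep the compiler from collapsing the loops.
     * Compile: gcc -O2 intfp.c */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    static double now(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    int main(void)
    {
        const long N = 200000000;
        double t0;

        /* integer chain: each step depends on the previous one */
        uint64_t x = 1;
        t0 = now();
        for (long i = 0; i < N; i++)
            x = x * 2862933555777941757ULL + 3037000493ULL;  /* LCG step */
        printf("int: %6.0f M ops/s (x=%llu)\n",
               N / (now() - t0) / 1e6, (unsigned long long)x);

        /* FP chain: same idea with doubles */
        double y = 1.0;
        t0 = now();
        for (long i = 0; i < N; i++)
            y = y * 1.0000001 + 1e-9;
        printf("fp:  %6.0f M ops/s (y=%f)\n",
               N / (now() - t0) / 1e6, y);
        return 0;
    }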

Just adding extra cores to Phenom II wouldn't have helped much. Don't forget it really was just a die shrink of the first Phenom with DDR3 support. By the time the 6-core Phenom IIs came out, the architecture was really outdated, lacking lots of instruction extensions. They'd have needed to rework it into some kind of Phenom III; sure, it'd have been better than Bulldozer, but we don't know exactly how good it would have turned out.

Instruction extensions aren't a bother so much.
Pentium 4 had loads of them compared to lowly Pentium M, yet Pentium M beat P4 clock for clock in every benchmark.

Then Conroe was a desktop-focused Pentium M with all the fancy extensions from late NetBurst, and it wrecked AMD overnight.

>[email protected] would probably still be better.
An FX at 5GHz would be about twice as fast.

>Dual Yorkfield Xeons would beat it at most everything and still be botnet free depending on the motherboard chipset.

Yeah, no, it would be at about 80% of an FX at 5GHz.

I still love my FX-6300, so glad I got it over the i3. At 4.7GHz it still offers plenty of performance for me, outperforms any i3 that was available at the time, and should easily last me another couple of years.

It can match an Ivy Bridge i5 if Tom's Hardware is to be believed, despite being much cheaper.

Attached: Combined-Applications-Performance.png (450x2364, 44K)

x86 in general. It deserves to die; it has deserved to die since the '80s. Time for it to actually go.

>Instruction extensions aren't a bother so much
Maybe not to you. Until you notice the software that doesn't run because the CPU lacks them.
And I'm not even talking gayming, but developer tools like Android Studio, which at some point will sperg out on you because Phenom lacks SSSE3 and SSE4.1/4.2.
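
If you want to check what your own chip is missing before some tool dies with an illegal-instruction error, GCC/Clang expose the CPUID feature bits directly. Minimal sketch:

    /* Minimal sketch: report the extensions Phenom-era chips tend to lack.
     * __builtin_cpu_supports is a GCC/Clang builtin backed by CPUID and
     * needs compile-time constant feature names. Compile: gcc -O2 feat.c */
    #include <stdio.h>

    int main(void)
    {
        __builtin_cpu_init();   /* populate the feature cache */
        printf("ssse3   %s\n", __builtin_cpu_supports("ssse3")  ? "yes" : "NO");
        printf("sse4.1  %s\n", __builtin_cpu_supports("sse4.1") ? "yes" : "NO");
        printf("sse4.2  %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "NO");
        printf("avx     %s\n", __builtin_cpu_supports("avx")    ? "yes" : "NO");
        return 0;
    }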

lel, ARM was supposed to take over the server market according to some Jow Forums fags, and yet it is dying now.

CRAY just signed with AMD to build EPYC based super computers.

>CRAY just signed with AMD to build EPYC based super computers.

AMD CONFIRMED DEAD AND BANKRUPT

t. Jow Forums 2011, 2012, 2013, 2014, 2015, 2016, 2017 and 2018.

At 5GHz you'd be using almost 300W though.
For that you could have 4x Yorkfield @ 3GHz...
16 cores...

FX had some great integer performance but overall it was garbage.
nah, x86 just needs to be forcibly opened up.
Everyone is decoding every arch into internal black-box micro-arches now anyway; x86 is just a frontend, and amd64 is a pretty good one.
The only problem is that only Intel, AMD, and VIA can make CPUs with that frontend.

>Instruction extensions aren't a bother so much
Just because YOU never had any trouble with them doesn't mean it's not an issue.
Lacking instruction extensions has fucked people over for YEARS.

Sorry, I didn't mean it in that way, just that extra new instructions don't *necessarily* make a CPU faster.
I should have been clearer.

>At 5GHz you'd be using almost 300W though.
The FX-9590 is 220W with a 5GHz turbo and 4.7GHz on all cores; you'd reach around 245W max with 5GHz on all cores.
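
Back-of-envelope check on that number, using the usual dynamic-power rule P ~ f*V^2 (the voltages below are guesses for illustration, not measured FX-9590 figures):

    /* Sketch: scale the FX-9590's 220W all-core figure from 4.7 to 5.0 GHz.
     * Dynamic power roughly follows P ~ f * V^2; voltages are assumed.
     * Compile: gcc -O2 power.c */
    #include <stdio.h>

    int main(void)
    {
        double p_base = 220.0;         /* W at 4.7 GHz all-core, per above */
        double f0 = 4.7, f1 = 5.0;     /* GHz */
        double v0 = 1.50, v1 = 1.55;   /* assumed vcore bump for the extra clock */

        double p_est = p_base * (f1 / f0) * (v1 * v1) / (v0 * v0);
        printf("estimated all-core power at %.1f GHz: %.0f W\n", f1, p_est);
        /* prints ~250 W, the same ballpark as the 245 W claim */
        return 0;
    }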

That doesn't include the rest of the system (excluding disks).
A 4-socket blade with 4x Yorkfield@3GHz uses less than 300W for the entire blade.

Do you think Cray is an acronym or something? Stupid fanboy niggers.

How much less though?

Perhaps you can refrain from going full autist for just a few seconds.

Their logo uses capital letters at least.

well perhaps we can discuss this in 2 years if and when intel patches their cpus again

I'm currently running an FX-8350. Barely OC'd and it runs like a champ. All the other people in my office are using Intel chips from the same era and are constantly complaining about slowdowns.

No, anything ARM was and still is.

No, that was Presshot

Yeah, my 40% overclocked 8320 (on air) has always surprised me with its performance.
But then again, the reason I like it so much is that I got it for less than $170. If it had been priced like a performance part, which is what AMD wanted ($300-400), it would be fucking TRAAASH, which is the baseline I think people are working with in their heads.

...why? Is there an actual flaw that you know about that others don't, or do you just not like RISC? Because I always thought using RISC on mobile was fucking genius.

I have a Ryzen and I'm very happy with it, but if I had to choose between a Bulldozer (and its refreshes) and a Sandy Bridge (and its refreshes), I'd go for the latter any day.

Well, I mean that's obvious. Everyone knows that Intel hit it out of the park with Sandy Bridge; that's part of the reason so many hate Bulldozer: it was shoddy compared to the competition.
But when we're talking about "worst architecture", we're comparing ALL processor generations. Granted, Bulldozer still doesn't look good, but still.

1.) very low ST IPC
2.) abhorrent performance/watt
3.) modern x86 is pretty much RISC wrapped in a CISC translator
4.) 99% of native software consists of "apps"

The only reason we see ARM on phones is cost. SD 820 is like $20 for OEMs.

All of that is correct except for
>abhorrent p/w
I mean, there's an argument to be made that slower chips waste energy because they take longer to finish the work and idle less (the term is "race to idle"), but it's a fact that modern ARM cores use like 10% of the power of a Celeron and get basically the same shit done.
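
Here's that race-to-idle argument in numbers, as a toy C example (all wattages and timings invented): the faster, hungrier core can still win on energy because it sleeps sooner:

    /* Toy race-to-idle comparison: energy = sum of power * time per phase.
     * All figures are invented for illustration. Compile: gcc -O2 race.c */
    #include <stdio.h>

    int main(void)
    {
        double window = 4.0;   /* s: total time window we account over */
        double idle = 0.5;     /* W: assumed idle power for both cores */

        /* fast core: 10 W for 1 s of work, then idles */
        double fast = 10.0 * 1.0 + idle * (window - 1.0);

        /* slow core: 4 W, needs 3.5 s for the same work, then idles */
        double slow = 4.0 * 3.5 + idle * (window - 3.5);

        printf("fast core: %.2f J over %.0f s\n", fast, window);  /* 11.50 J */
        printf("slow core: %.2f J over %.0f s\n", slow, window);  /* 14.25 J */
        return 0;
    }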

>hurr durr muh hot
>poozen needs liquid nitrogen to match *Lake air clocks

Right, but I'm not talking about celery. High-end x86 like Epyc or Xeon D runs circles around ARM in terms of performance/watt.

But again, the cost of putting such highly-binned, efficient x86 silicon in phones would be batshit insane.

I have a used FX 8320 which I got for 70 euro from a guy in Romania. I'll upgrade to a Threadripper later; for now it's rock solid and fast enough, and whatever bogs it down can and should be run on a GPU. It's faster and cooler than the 6-core Phenom I owned previously. Great chip, super.

Well, fucking exactly. A Keurig is fucking trash compared to a French press or a really expensive espresso machine, but literally no one has access to those. ARM chips are an extremely effective (in both cost and application) use of modern technology.

these

You can still mine crypto on an FX CPU when a comparable Intel CPU barely can. AMD has always been more forward-thinking than Intel.