Was AMD FX Good or Bad?

I have never used an FX CPU in my life. I've heard nothing but bad things about the FX chips.

Here's what I've heard:
>high power consumption
>DIRE performance
>will turn your room into a sauna
>Certain Phenom IIs in some benchmarks beat certain FX chips

I'm a hardcore AMD fanboy now that Ryzen is out but I'm having a hard time finding any redeemable qualities with Ryzen's predecessor. Was it just a colossal failure? Were they that pathetic? I was living under a rock during the time FX was the newest from AMD and was on Ivy Bridge.

Attached: image.jpg (1038x1000, 140K)

Other urls found in this thread:

en.wikipedia.org/wiki/Athlon_64#Athlon_64_FX_models
youtube.com/watch?v=m_xB00PSfYM

nope, they were not

en.wikipedia.org/wiki/Athlon_64#Athlon_64_FX_models

were

Yes, AMD's consumer CPUs after the Phenom II were atrocious, both FX and the A Series.

I had an 8350. Not very good at most games relying on single-core performance, but it was my computer, so I liked it.

FX8320 here
Upgrading to Ryzen felt like changing seasons. It was hot as balls and all I could ever think was "why didn't I go intel" but then R5 1600 arrived on my doorstep and it's only a distant memory now

Bulldozer was made for a very limited scope and excelled in that scope: highly threaded, integer-heavy computation with little branch prediction, using as little die area as possible.
Unfortunately for AMD, that situation does not correspond to real life.

Power consumption vs performance was quite high outside of its "groove", and the GloFo 32nm process was particularly atrocious, which did not help.
Nah, it wouldn't turn a room into a sauna; even the topmost SKUs would use half of what a top-end GPU would.
Yes, Phenom II did beat Bulldozer chips at the same frequency. This is not surprising given Phenom II cores were 3 ALU / 3 AGU designs vs the 2 ALU / 2 AGU Bulldozer cores. The fact that the performance gap at the same frequency is as small as it is, is actually remarkable.
Float performance was particularly bad, as each module of two cores had only a single shared FPU with an SMT-style implementation. Effectively, an 8350 had 8 integer cores and 4 floating-point units that could each handle a second thread at a fraction of the performance.
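
To see the shared-FPU effect yourself, here's a minimal scaling probe (a sketch, not a rigorous benchmark: the thread counts, loop sizes, and FP kernel are illustrative assumptions, and real scaling depends heavily on the instruction mix - wide 256-bit AVX work shows the sharing most clearly since it occupies both of a module FPU's pipes):

[code]
// Minimal scaling probe (illustrative, not a rigorous benchmark):
// run N copies of an FP-heavy loop and time the whole batch. On a
// 4-module FX chip, going from 4 to 8 threads should add much less
// FP throughput than an integer loop would, because module siblings
// contend for the single shared FPU.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

static void fp_work() {
    volatile double x = 1.0;
    for (long i = 0; i < 200000000L; ++i)
        x = x * 1.0000001 + 0.0000001;  // dependent FP mul+add chain
}

static double run(int threads) {
    auto t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (int i = 0; i < threads; ++i) pool.emplace_back(fp_work);
    for (auto& t : pool) t.join();
    return std::chrono::duration<double>(
        std::chrono::steady_clock::now() - t0).count();
}

int main() {
    for (int n : {1, 4, 8})
        std::printf("%d threads: %.2fs\n", n, run(n));
    return 0;
}
[/code]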

i still have an 8350

i still have a 390x

i still wake up thinking about the day i accidentally ordered an AMD mobo when building my first computer

FX's high power consumption is partly because AMD tends to leave a lot of voltage overhead.
if you undervolt, it will consume way less.
my phenom ii 1090t runs at 1.375V stock. i undervolted it to 1.225V, that's a lot. not FX, but you get the gist.

I had an FX-6300, stock clocks at 3.6, but I had it overclocked to 4.9 at near-stock voltage on a Hyper T4 and it never went above 50C. It wasn't great at games, but it did the job, was cheap, and did a good job working those six cores.

The FX-8350 was and is an incredible cpu. $300 5 years ago and it can still max modern vidya

AMD FX is where that "make more cores" AMD joke came from. You got many cores with bad per-core performance. It was a huge failure in the marketplace because a lot of software and almost all games were optimized for one, or at best two, cores.

Things have changed in this regard. Nobody is laughing at Ryzen 7 1700 saying "muh cores" and bashing it for having slightly lower IPC than Intel because almost no newer software runs on a single thread. Those who are still using FX CPUs are probably getting more out of them now than when they were released.

lmao @ your life

I love mine, for what I use it for, it's great for the little it cost.

Attached: vwmpn.png (412x351, 114K)

Based UD3 board holding the clocks at 1.362-something volts with medium LLC.

Attached: Cinebench.png (1920x1200, 174K)

> Was it just a colossal failure?
At launch, yes. It was a cheap architecture made to cut costs while still delivering more performance. Just like Ryzen, heh.
7 years ago single-threaded performance still mattered; nowadays FX-8xxx chips are better than Core i5s of the same age.

Did another bench with one core per module, the IPC is really good if your board supports one core per CU mode, helps with old games I have like Halo CE

Basically this. If you like editing video as a hobby, the FX 8350 is the best budget workstation ever; there just aren't many of us, so most people end up benefiting more from Sandy Bridge's much better IPC.

Attached: Cinebench One Core per CU.png (1920x1200, 1.21M)

I had an FX-6300. It was shit

>Certain Phenom IIs in some benchmarks beat certain FX chips
That's mainly the FX-8150 being shit, the 8300 series is pretty good.

I have a 390X and I regret nothing

I have an FX-4300 and don't regret it.
>housefire
No, 95 watt max TDP, which you'll only hit running burn-in tests for hours.
It doesn't even do that badly in games.

Even the 125W 8-cores aren't housefires. Intel's socket 1366 CPUs were 130W and consumed a lot of power, and no one ever complained, because Intel can do nothing wrong.

>Those who are still using FX CPUs are probably getting more out of them now than when they were released.
that was the whole point.

vishera went for something that would last, but amd had far less sway with devs than they believed and MOAR COARS didn't become the norm until a couple years ago.

it's the fact that those 8-cores shot up into the 250W region when OC'd that got them their housefire status.

cause stock clocks are worthless on the chip desu, OC puts it within the range of worth having, but again, the power consumption skyrockets as does the heat

>250w
What the hell? I am overclocked to 4.4 on the stock cooler with 1.380V to get it prime stable and it's topping out at 60C.

8320 @ 4.5GHz here
Seriously, I was more disappointed years ago when I bought this space heater than I am now.
Watching only a few cores being utilized is a thing of the past.
And it was cheaper than an i3 back then.
So no regrets.

used to have a Vishera 8-core Opteron i picked up for 40€ on ebay. overclocked to 4.3GHz it was fine for a 7970, though the upgrade to an X5650 was pretty massive

I'm running an 8350 right now and I can still max whatever game I throw at it. No hiccups, and it runs cool too. Never have I been sitting next to my computer thinking 'man, this thing is really warm', simply because it isn't.

i have an FX-6300 OC'd to 4GHz and rarely have any issues. Intel chips can be just as bad, depending on what you hear. it's all opinion.

This. Intel will release glorious cool-running chips like the 2600K, and hot-ass motherfuckers like the Devil's Canyon 4790K.

The FX processors are really good, despite being slower than the Sandy Bridge i7s. The FX 8350 can run BF1 @ 60fps whereas the i5 7500 will give you frame drops and stutters, and it's a way more expensive CPU!

They were great for the price and still can be. I have my FX-8320 in my home server and it runs great.

>home server
That's a cool thing about FX boards and chips, and probably why a lot of them are gonna end up as servers: you can build them cheaply and they have SATA 6Gb/s, as well as eSATA on the back if the board has it.

Attached: AB50837_3.jpg (1000x600, 163K)

So who else is fx9590 master race?

Attached: 1439453496474.jpg (220x230, 8K)

I'm still using the based 8350, and thinking about picking up an 8370 because it's a 125W part at stock but the same binning as an FX-9000, so I can undervolt or overclock and get the most out of it.

youtube.com/watch?v=m_xB00PSfYM

High power consumption and weak IPC, but they clocked well, provided you could cool them, and they were cheap.
Used an 8350 until Ryzen came out, and I would've kept using it if it weren't for my shitty MSI board dying

>shitty MSI board
I know that feel. so glad I got rid of my 990FXA GAMING; the board itself was cool but the VRM was hitting 111C, so I had to sell it and get a UD3.

Attached: AB50837_2.jpg (1000x635, 395K)

It was always bad. By the time the FX8150 was out for $200 or whatever, you could get a Phenom II X4 955 for like $130, which was a better CPU with better IPC and better overclocks. By the time the FX8350 came out it was basically on par with those X4 955s.

Nobody should have ever bought them, and I'm sure everyone that did was excusing their mistake with "hurrr I'll upgrade on this socket later", which is almost never fucking worth it.

>MSI board dying
lol FX killed so many MSI boards.
it deserves the name dragonslayer, considering they've adopted that dragon logo on pretty much every MSI unit now

>Put an LN2 switch on a board with 6+2 phase and no LLC settings whatsoever
This board makes no sense, but it did work and I liked the BIOS. It was also refurbished, which was kinda scary.

Attached: 2017-09-26-496.jpg (4000x3000, 2.22M)

>LN2
no pix?
how high did you get?

I have an older 8350 PC which I plan to use as a home server. It was really good for the price at the time and it's still fairly capable. I wouldn't build a system around it, but if you already have it or can get it very cheap then it's great.

Also, on that note, my RAM was busted; any suggestions for getting cheap DDR3 in burgerland? The modules seem really overpriced.

I never used LN2, but that bastard had a dip switch for whatever reason. that board needed so much voltage to OC; 1.44 wasn't even stable at 4.5GHz because it droops so much, and it claims to support a 9590 with that VRM. I thought I had a bad CPU until I got the Gigabyte board with LLC that holds a stable voltage that can be relatively low.

>phenom II x4 955 for like $130
lol more like $80, and clock for clock it was better, but good luck getting 4.5GHz on a Phenom II, whereas the 8350 hit that easily.

it was a good buy for running multithreaded programs on the cheap. then there's the fact that intel boards are stupidly overpriced

6+2 is scary when OCing; need at least a high-quality 8+2 to feel safe imo.
>no fan on the VRMs either
>no LLC
i doubt that, but you're just wasting power without it desu
you can put your own cooling on the VRMs you know.. you can change the thermal paste if it came with a heatsink, you can put a fan on it, or you could put a fan on the back of the board too..

that's normal
there's tonnes of misinformation about the voltage settings
but vishera is claimed by AMD to be safe up to 1.55V (provided you can keep it cool)

I know 1.55 is normal for FX, but it's stupid for MSI to basically put the 990FX northbridge on a board with the 6+2 phase of a 970 board and sell it as something that can OC, that shit is gonna die.

FX's biggest problem was how dependent its performance was on the speed of the northbridge. You could easily get a 20% fps increase in games by overclocking the northbridge alone. Ryzen is bound to RAM speed instead. Why they're doing this is beyond me.

lol tbf if you have no use for it you should try n see just how much shit it can take before blowing up

keep a fan blowing on the VRM heatsink and on the back of it.
the cooler they are, the less chance of failure, and the more power can be passed through before failing.

i've had a 4-pin melt on me before my 4+1 VRMs did, because of proper cooling.

OCing the NB was only really important because it directly correlates with how fast you can get your memory to run at higher clocks. problem is, the NB on the chip was weak as fuck and never really clocked high

I hit 4.2GHz core and 2900MHz northbridge on a $600 build I did for my gf. It still plays games today with an HD5870, like 7 years later. 4.2GHz wasn't stressed; I'm sure I could have gotten 4.3.

>HD5870
>couldn't get a better GPU
>$300
>a year later the 6870 came out for $200 and went for $150 by the end of that year
fuck i hate the current market so much. price gouging RAM, price gouging GPUs, even CPUs are stupidly overpriced.

reeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee

2900NB is insane, do you have it running that high all the time? lol

thinkin bout it now
>$150 gpu
>$100 quad core phenom
>$50 cooler
>$50 non-LED/sidepanelporn case
>$50 8gb 2133 cl10 RAM
>OC cpu to 4ghz
you'll never get the same price:performance that we had in 2010-2013

True story:
I had a Phenom II X4 955 system. Kept all hardware the same, merely swapped in an FX 8300 (3.2 GHz vs 3.3 GHz, and 4 more cores). Games all ran worse under FX: sluggish, jerky play. The CPU patches did nothing to fix it. The games were all older single-core titles. Swapped back in the Phenom II and everything ran smooth as a baby's ass. But it's not all bad; the FX does make a nice server chip. That's what the FX 8300 is now: my server CPU.

I have a 7650K. It was cheaper than a comparable CPU+GPU combo in my country, and by undervolting it from AMD's retarded default voltages it can run pretty cool while still OC'ing, [email protected]; I didn't push it further since my old-style stock cooler goes off like a jet engine, and even at stock it was still fast enough for me.
Nowadays it's still cheaper than a comparable CPU+GPU combo thanks to the mining craze; however, any Raven Ridge beats it by far for a few more bucks.
I wish FX-83xx CPUs would drop to C2D prices; it would be nice to get one of the top Piledriver SKUs to play with

spoiled brat. With your HD5870.
t. HD5670let.

For their price, of course they were.
For a consumer/gamer CPU it was more than enough.

Yeah, the 83XX was a dumb idea. I used a 6300 and it overclocked fine to 4.7GHz, giving good single-core performance and scaling well with multicore.

I still have that box with an RX 480 in it behind the TV and occasionally use it for games. It still does things like Witcher 3, Fallout 4 or Doom fine at the highest graphics settings and 1080p without dropping below 60FPS when locked at 60FPS.

So in short, AMD innovated by making CPUs with moar cores but no one wanted to optimise their shit for moar cores.

>innovated
is a bit of a stretch. they hoped they'd get the ball rolling in that direction, though devs and intel didn't feel the need to follow until housefire skymeme and zen came out
>why bother with moar cores when we can still sell quad cores for expensive as fuck and sell hyperthreaded pseudo hexa cores for twice the price!

but in general AMD is usually the one releasing substantial innovations technologically speaking

More cores does not equal more faster. The FX-bulldozer chips excelled at a workload most people simply didn't use (heavily multithreaded).

>getting the ball rolling with moar coars
Ye, that's what I meant by innovating. And yeah it seems like AMD usually is the one to innovate. They currently have the world's best APUs.

Currently running 8320 with 280x. The current state of prices has seriously turned me off of pc gaming. If I do build another, it will be my last.

lol? they knew single-core performance was nearing its peak and decided to opt for high clocks (provided you can power it, and iirc vishera STILL holds the title of highest-clocked CPU ever) with MOAR COARS as a means of future-proofing, as they were going through a massive corporate restructuring at the time iirc

guess what? they were right. high-clocked vishera was competitive with sandy bridge chips at the same pricepoint, albeit hot and power-hungry (but so were OC'd SB, which admittedly had a lead), but now that things are utilizing MOAR COARS these days, the chips are really starting to shine and are even putting OC'd SB i5s to shame.

i say that to say this though: faildozer was a massive flop, and some of the more decently binned thubans (the 6-core phenom IIs) shit all over it. but alas, thuban lacks certain instruction sets that leave it hopeless today. even so, thuban was way wayyy ahead of its time.

you're not missing out on anything, games these days are garbage. my next build will be around emulation, but by the looks of things i'll just get a switch and not bother with PC as console games are also looking rather shitty too. sad times >: (

Ye I've heard that some of the FX chips are now beating Sandy Bridge CPUs. It's such a shame that wasn't the case from the beginning.

Thing is, Intel's socket 1366 CPUs eat FX for breakfast. Even the FX9590 can't stand up to an overclocked Westmere-EP or Gulftown. Intel can do a whole lot wrong, but that wasn't one of them.
t. planning on finally upgrading from 1366 to Ryzen at some point

FX8350, fulfills my needs for programming and some gaming
Still using it btw

But upgrading to Ryzen within a year

Well it is now, bby.

Attached: ayymdeee.png (777x818, 54K)

>muh superiority
Hold your horses cunt. If it took over half a decade for the full potential of the FX chips to be noticed, it was a failure. Sandy Bridge was great from day one.

Just admit it, FX was shit. Sandy Bridge/Ivy Bridge are the clear victors. This is coming from an AMD fanboy.

My fx6300 was fine 2014-2017
Now have r5 1600

My 8350 now runs a Windows DHCP server and a couple of VMs

Does the cpu perform? I'm thinking of using an 8320 for a 4HDD NAS system.

FX-8350 owner here.
it's bad.

Just ordered this. hopefully it will solve my stock cooler noise problems; even that thing is pretty impressive.

Attached: Hyper T4.png (935x264, 42K)

FX6300, boost off, APM off
4.4 GHz (BCLK/FSB 260 MHz × 17 multiplier = 4420 MHz), NB 2.6 GHz, HT 2.4 GHz, mem 2400 MHz - the best all around.

Have you OC'd your RAM and your NB? If not, plebs like u don't deserve such CPUs

The initial release, Bulldozer, was fucking terrible. It sucked crazy hard.
The iterations that followed were merely mediocre. Piledriver/Vishera fixed many of the problems, but you can't fix that much, so it became an okay processor in the end, especially for multithreaded workloads and the price.

Ryzen though is actually worth buying which is nice.

>hurr after OC its good
i don't have a motherboard good enough for OC. also, if it needs an OC to be good, then it's by definition BAD

Not him but I think it's good at 4GHz stock, even a crappy board should do 4.2GHz since that's the turbo speed of an 8350 anyway.

8350 with a nice OC was unbeatable for the price, but only if you weren't doing gaming. It could get just below 800 in Cinebench R15 and scaled fairly well with workloads that could use it. 6300 was the same story, but only if you were an absolute poorfag and could get it on sale for like 120 USD with a combo deal or something. The multithread performance of a heavily overclocked 6300 was about 20% or more above base clocks.

The reason the Bulldozer arch killed AMD was how hilariously awful the server processors based on it were. In the consumer space they were at least competitive at various pricepoints, but in the high-margin enterprise segments they ran hot, had expensive sockets, were unstable, had awful scaling, and their single-thread performance was an absolute fucking joke. They were all but unusable and AMD wanted to charge you out the ass for them, and most people would have experienced a performance loss upgrading from older Xeons.

Idiot, u don't actually need to OC the CPU.
U get way more performance if u kill the memory throughput bottleneck; u get more performance from faster RAM than from OCing the CPU to 5 meme GHz.
Ur one of those who never read about the FX CPUs and claim they are shit

They are shit though. They're so unbelievably bad that I as an AMD fanboy didn't touch them.

Ye, it's nice to see AMD back. FX was worse than AIDS.

They can get u fluid gameplay in BF1 whereas an i5 7200 can't, despite the price and age gap. Not so shit if u ask me

FX CPUs have FPU units shared between pairs of cores. Intel's HT was a good idea for maximum theoretical performance but was shit for actual multithreaded programs; the Bulldozer architecture was even worse than HT because it wasted die space on duplicated integer blocks - so FPU performance for each thread under full load would be half that of a single working thread.

Not only that, but no programs were ready for that bullshit. The Windows scheduler update took several years to arrive.
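
For what it's worth, you could work around the scheduler yourself with thread affinity. Here's a minimal sketch (Linux and glibc assumed; the module-to-CPU numbering is an assumption matching how Linux enumerates FX cores, with module siblings adjacent) that pins one worker per module so no two threads share an FPU - roughly what the Windows scheduler patch later did automatically:

[code]
// Minimal sketch: pin one FP-heavy worker per Bulldozer module so no
// two threads share an FPU. Assumes Linux/glibc, where FX module
// siblings are adjacent logical CPUs (0-1, 2-3, ...), so the even
// CPUs give "one core per CU". Build with: g++ -pthread affinity.cpp
#include <pthread.h>
#include <sched.h>
#include <cstdio>
#include <thread>
#include <vector>

static void pin_to_cpu(std::thread& t, int cpu) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);
    // pthread_setaffinity_np is a glibc extension; returns 0 on success.
    if (pthread_setaffinity_np(t.native_handle(), sizeof(set), &set) != 0)
        std::fprintf(stderr, "failed to pin thread to CPU %d\n", cpu);
}

int main() {
    const int modules = 4;  // FX-83xx: 4 modules / 8 integer cores
    std::vector<std::thread> workers;
    for (int m = 0; m < modules; ++m) {
        workers.emplace_back([] {
            // FP work: each pinned thread now owns a whole FPU.
            volatile double x = 1.0;
            for (long i = 0; i < 100000000L; ++i)
                x = x * 1.0000001 + 0.0000001;
        });
        pin_to_cpu(workers.back(), 2 * m);  // CPUs 0, 2, 4, 6
    }
    for (auto& t : workers) t.join();
    std::puts("done");
    return 0;
}
[/code]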

you speak like a retard and argue like a retard
i wasn't sure if you were being ironic when making these posts
BF1 can use DX12, so it's an exception. BF1 runs like absolute shit if you use DX11 with an FX-8350. I speak from experience. The CPU is absolute shit for gaming because most games will only use 4 cores or so, and single-threaded performance on the FX lineup is absolute garbage.

it would get you by without complaining but it was pretty mediocre

I think the biggest mistake they made was marketing it as an i7 killer instead of a budget powerhouse that, for less than an i5, whoops it in overall compute power, because it does.

They certainly weren't great, but they also weren't THAT bad. At release the 8350 cost less than a 2500K, with multithreaded performance on par with a 3770K. Even with the higher power consumption, that was still a pretty good deal if you didn't need single-threaded performance. The main problem was that when they were released, the rest of the ecosystem hadn't quite caught up with parallelization yet, so the better multithreaded performance wasn't as big an advantage as it could have been.

i5 7200? Do you mean i5-7200U? The i5-7200U is a laptop CPU.

You're just a retarded pleb. the FX are superb CPUs for the money; they can still play all the older single- or dual-core games, and they're actually better than 4c/no-HT shit in new multithreaded games, and that's not only because of DX12 but because they actually use 4+ cores, and you need CPU resources in MP games, which the 8 cores provide

>i5 7500 will give you frame drops and stutters
I'm having a hard time believing that a shitty FX-8350 will run BF1 smoothly but an i5-7500 won't. I think you're full of shit.

>FX CPUs can play older single and dual core games
Wow, really? Who knew that?

>I bought my old HD 6850 in 2010 for only 140 bucks.
Nvidia's pseudo-monopoly ruined the market. (the Fury and Vega delays almost killed AMD)
I miss the days when you could easily find a high-end GPU for 220-300 bucks and the last-gen one for 130-180 bucks.

Even at 4GHz its lackluster single-core performance hurts it. This is my 2500K running at stock 3.3GHz, a 700MHz lower clockspeed than the 8350, and yet its single core is nearly 1.5x faster. Obviously multithreaded the 8350 wins, but a 2600K will more or less run rings around it with its 4 extra threads (even if those aren't quite the same as 4 full cores). The combined/shared FPU is what hurt it the most.

Attached: cpuz2500kstock.png (809x403, 49K)

U will be surprised if u play BF1 MP on both of them

The FX-8350 can run BF1 well only if you activate DX12. Without DX12 it's shit that can't even reach 60 fps.

Nice may may