What actually went wrong? On paper they seem so powerful.

Attached: 220px-AMD_FX_Logo.png (220x186, 37K)

They were also strong on select benchmarks, and so was Kaveri, and after a few years people just stopped believing them.

They're powerful as fuck when 8 threads are leveraged, and people act like the single-core performance is the worst thing ever, but my stock 4GHz 8350's single core is still better than a Lynnfield i7, which people would think is decent for budget gaming because Intel made it.

Attached: Why the fuck everything single threaded.png (800x522, 265K)

Because it's only good if you use all the cores fully, and most workloads don't do that.

The single-core performance was pretty bad though. The FX-9590's single core was around laptop C2D T9900 level.

The core count is somewhat fake: each pair of cores in a module shares a single floating-point unit, so for many workloads the 8 cores perform more like 4. The architecture also has very poor memory performance due to a weak, lower-clocked memory controller, and the L3 cache shared between the cores is slow as well.
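
If you want to see that for yourself rather than argue about it, here's a minimal sketch in C, assuming a POSIX box with pthreads and gcc/clang (build with gcc -O2 -pthread fx_fp_scaling.c; the filename and iteration count are just placeholders). It runs the same floating-point loop on 1, 4, and 8 threads and prints total throughput; on a module design with one FPU per core pair, the jump from 4 to 8 threads should land well short of 2x for this kind of FP-heavy work.

/* fx_fp_scaling.c -- rough check of how FP work scales with thread count.
 * A sketch, not a proper benchmark: pin threads and repeat runs if you
 * want numbers worth quoting. */
#include <pthread.h>
#include <stdio.h>
#include <time.h>

#define ITERS 200000000UL   /* iterations of FP work per thread */

static void *fp_work(void *arg)
{
    double x = 1.0001, acc = 0.0;
    for (unsigned long i = 0; i < ITERS; i++) {
        x = x * 0.9999999 + 1e-7;   /* serial FP dependency chain keeps the FPU busy */
        acc += x;
    }
    *(double *)arg = acc;           /* store the result so the loop isn't optimized away */
    return NULL;
}

/* Run the workload on n threads and return elapsed wall-clock seconds. */
static double run(int n)
{
    pthread_t tid[8];
    double sink[8];
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < n; i++)
        pthread_create(&tid[i], NULL, fp_work, &sink[i]);
    for (int i = 0; i < n; i++)
        pthread_join(tid[i], NULL);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    int counts[] = { 1, 4, 8 };
    for (int i = 0; i < 3; i++) {
        double secs = run(counts[i]);
        printf("%d threads: %.2f s, ~%.0f M FP ops/s total\n",
               counts[i], secs, counts[i] * (double)ITERS / secs / 1e6);
    }
    return 0;
}

Note that an Intel quad with SMT shows a similar 4-to-8-thread flattening for different reasons, so treat the output as a rough illustration, not proof of anything.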

AMD underestimated the importance of the FPU

it's a shame

Nothing, they were good, still are
t.uses an 8350 and used to have a 6100

>No improvement to single-core performance over previous architecture
>Highly temperature sensitive, high power consumption
>Only shines in strongly multi-threaded tasks
>Shitty stock cooling (barely keeps up at stock clocks, jet-engine noisy)
>First generation was overpriced on top of all the other issues

AMD were overly optimistic in regard to the adoption of multi-threaded code, and by the time more games/apps could FINALLY leverage 8 cores, everyone had Intel CPUs.
First generation was awful.
The FX 8320/8350 were just... OK, while the FX 6300 is arguably the only objectively good CPU in the whole lineup (it was perfectly usable with its stock cooling and typically beat Intel's similarly priced offering).
By the time they changed the stock cooling to Wraith for the 8-cores, it was already too late, and even the FX 8300 just wasn't competitive enough. AMD have nobody but themselves to blame.

AMD did nothing wrong

Attached: UPGRADE2010.png (836x768, 17K)

I had that FX-8310 for 3 years. It wasn't all bad. Gaming performance was slightly above okay, I just had lots of random crashes. Pretty good for day-to-day use. But now I have a Ryzen 2700X and it isn't even in the same ballpark.

>8350 single core is still better than a lynnfield i7 CPU
But that doesn't make sense, a Bulldozer has worse single core than a Phenom II and Piledriver is just marginally better. A Lynnfield beats a Phenom with ease, so your statement doesn't hold up.

Yeah, they sort of had an '80s/'90s mindset from when many CPUs didn't even have an FPU yet.

Modern games don't use the FPU.
The FPU meme comes from Quake using it for a dirty hack, but now there are nice new instructions for the same thing.

Nothing.
Still on my FX-8370, never felt like my CPU is slowing me down; everything I do on my PC works great.

>everything i do on my pc
And what exactly is that?

Software wasn't multicore-ready. That's obvious when looking at how well they aged.
They still performed well a few years ago because, even though single-core performance was lackluster, programs took more advantage of multiple cores than they did at launch.

Literally what

>needs watercooling to not get random crashes

en.wikipedia.org/wiki/Fast_inverse_square_root
That's the reason for the FPU hype.

PFRSQRT deprecated this.

also GPUs
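
For anyone who hasn't seen it, this is the dirty hack in question, side by side with the instruction that made it pointless. Rough C sketch: the magic-constant routine is the well-known Quake III version described at the Wikipedia link above, and _mm_rsqrt_ss is the SSE intrinsic for RSQRTSS, the mainstream successor to 3DNow!'s PFRSQRT. Build with any x86 gcc/clang, e.g. gcc -O2 rsqrt.c (the filename is just an example).

#include <stdint.h>
#include <string.h>
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics; present on every x86 CPU from the last ~20 years */

/* The Quake III "fast inverse square root": integer bit trick plus one
 * Newton-Raphson step. memcpy instead of pointer casts keeps it within
 * strict-aliasing rules. */
static float q_rsqrt(float number)
{
    float x2 = number * 0.5f, y = number;
    uint32_t i;
    memcpy(&i, &y, sizeof i);          /* reinterpret the float bits as an integer */
    i = 0x5f3759df - (i >> 1);         /* the famous magic-constant first guess */
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - x2 * y * y);       /* one Newton iteration to refine the guess */
    return y;
}

/* The modern way: one hardware instruction (RSQRTSS), roughly the same
 * accuracy class, no bit hacks needed. */
static float sse_rsqrt(float number)
{
    return _mm_cvtss_f32(_mm_rsqrt_ss(_mm_set_ss(number)));
}

int main(void)
{
    float x = 2.0f;
    printf("q_rsqrt(%.1f)   = %f\n", x, q_rsqrt(x));
    printf("sse_rsqrt(%.1f) = %f\n", x, sse_rsqrt(x));
    printf("1/sqrt(2)       = %f\n", 0.70710678f);
    return 0;
}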

I feel bad for you, son

Westmere-EP/Gulftown/Lynnfield/Nehalem was just so immeasurably better even at smaller core counts (4v4, 4/8v6, 6/12v8)

I'd rather have my FX8350 than a quad core Bloomfield with a slightly higher 130w tdp, but those 6 core 1366 xeons do wreck my shit and still only use 130w.

Attached: 1546052593322.jpg (2576x1932, 2.04M)

>he trusts the very fucked up CPU-Z benchmark that believes the E8400 is the end-all CPU

Nothing. It's a comfy family of processors, especially in the winter.

>ryzen 2700x
is it noticeably faster than an FX chip?

OC your FX-8370 to 4.6GHz if you can, and bump up the north bridge frequency. If you can do it, push it to 4.8GHz with a good cooler. You're welcome.

Not him, but I just upgraded from an OC'd fx 6300 to the 2700x and it is honestly night and day in terms of games.

Nothing, it was a working, cheap CPU series.

>tfw 8350 paired with a used 7970 for max warmth

4.6GHz on the 8350 needs about 1.5375V, or 1.4750V with good LLC. What I'd do on any board is set the vcore to 1.45V and see how high the multiplier will go without failing Prime. Having a nice board with good LLC helps, because you can have a lower idle voltage and it will supply more vcore as it droops under load.

Attached: IMG_0360.jpg (1600x1542, 429K)

He's not lying, you dingbat. Bulldozer was slower than Phenom II clock for clock. Phenom II was just slightly faster than the Core 2 Penryn refresh clock for clock. Piledriver improved single-threaded performance but was still slower than Phenom II clock for clock. The only reason Piledriver was recommended over Phenom II was that it could overclock past that deficit, which let it match around a first-generation Nehalem 920-930. A 2012 Piledriver at a moderate 4GHz matched roughly a 2.6-2.8GHz Nehalem from 2009 in single-threaded performance.

Bulldozer was an absolute failure and deserves its Faildozer nickname. Worse than AMD's own previous generation, barely matched first-generation mainstream Nehalem Core i SKUs, and was an absolute raging housefire. And this is coming from someone who ran an 8350 for five months.

And today the multi-core performance of something like an 8350 matches a stock, locked i5 from Haswell, when all 8 threads are used. Utterly pathetic.

I know the IPC of FX is absolutely terrible, that's the main flaw of the damn thing, but the CPU at stock voltage runs at 4GHz, and people act as if the thing is unusable. I'll bet that Haswell i5 cost more than the FX-8350; a locked 4690 is way over $200 and the FX-8350 was a $160 CPU. That's the little thing people leave out: the old-ass 8350 was competing with Skylake i5s while costing less.

Is this thing not impressive when it comes to price-to-performance? The FX-8350 at launch was $220 at most, then it dropped to $195 even in early 2013 when it was still really new.

Attached: image_id_1580162.png (890x944, 219K)

It had some cool things going for it, but with Sandy Bridge coming out, along with a decrease in IPC compared to the Phenom II in the first version, I remember it being really underwhelming when it first came out. I'm using an 8320 at 4.3GHz on a cheap 4+1 phase motherboard and it gets pretty warm. Not a bad processor, just bad for its time, since it had the IPC of a Q6600.

pic related

Attached: poo fx.jpg (1278x721, 228K)

The 8320E was a great chip. Or maybe I got lucky. I got mine to 4.6GHz @ 1.404V across all cores. IntelBurnTest would try to set the chip on fire, which it does to every chip. ASUS RealBench thought it was perfectly stable and it ran like a dream for the time I had it.

The 4790K I replaced it with clocked even better. 4.8GHz @ 1.275v across all cores. My mum still has that computer at her house as her home office computer and it's still running that overclock years later.

Piledriver was like 20% faster than BD clock for clock what are you talking about?

based 32nm SOI

didn't they get sued because it turns out the 8 cores aren't actually 8 cores?

It's a class action lawsuit, or "let's see if we get lucky and get the company to settle out of court for a big payoff". Very unlikely it'll pan out because, while you can argue the performance implications of the design, there's still 8 functional cores in there. From a technical standpoint the case is lost already.

>tfw owned an FX-9370
Got it for $220 brand new. Had it overclocked to 5.5GHz on a 990FX sabertooth board. Popped the VRM section when I got greedy and pushed for 5.6GHz.

We already talked about this. Software at the time was single-threaded. Use something modern on both of those platforms and get back to me.

They're useless chips, designed to be cheaply designed by computers and cheaply manufactured to sell to ignorant buyers. Useless back when they were relevant, because nothing used all the cores, buying a cheap 4+1 or even 6+1 motherboard was risky at best unless you were intentionally gimping yourself with the non-8-core chips, and the massive power usage meant you were going to pay the difference between this and a comparable Intel chip to your power utility company over time. Still useless nowadays, because you can buy way cheaper, way more powerful used Xeons if you aren't just buying a new Ryzen chip. There's never been any reason to invest a single cent in FX unless you're a braindead fanboy, and I genuinely find anyone who uses one as stupid as someone who would proudly flaunt the fact that they still run a GTX 480.
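
For what it's worth, the "you pay the difference to the power company" bit is easy to put numbers on. A quick C sketch: the 125 W and 84 W are the advertised TDPs of an FX-8350 and a Haswell i5, and everything else (hours at load, electricity price, years of use) is a made-up assumption you should swap for your own. TDP isn't measured wall draw, so treat the output as a ballpark only.

#include <stdio.h>

int main(void)
{
    /* Advertised TDPs: FX-8350 = 125 W, Haswell i5 = 84 W.
     * Everything below is an assumption -- adjust to taste. */
    double watt_diff     = 125.0 - 84.0;  /* extra draw under load, W */
    double hours_per_day = 4.0;           /* assumed hours at load per day */
    double price_per_kwh = 0.15;          /* assumed electricity price, $/kWh */
    double years         = 5.0;

    double kwh  = watt_diff / 1000.0 * hours_per_day * 365.0 * years;
    double cost = kwh * price_per_kwh;

    printf("Extra energy over %.0f years: %.0f kWh (~$%.0f at $%.2f/kWh)\n",
           years, kwh, cost, price_per_kwh);
    return 0;
}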

They are better than i5-3xxx at working tasks

The 95W FX-8xxx were also good. I had an FX-8320E and it worked OK even with the stock non-heatpipe cooler. After I got a better cooler I OC'd it to 4GHz and used it for a few more years; 3 months ago I got a Ryzen 2600. It's not a huge difference in games, but when you look at the power draw, yeah, it was damn worth the upgrade.
I can't even hear my build now; with the FX and a 280X it sounded like a jet taking off.

All I know is my 8350 was blown the fuck away when I upgraded to an i7 8700.
Gayming performance was immensely better with Intel. AMD is garbage.

>I bought an i7 from 2017 that shits All over an AMD chip from 2012 that everyone knew was shit from the start
Yea no fucking shit.

>upgraded to an i7 8700
>upgraded
lol

Ryzen 5 2600X is a housefire and my 8700 shits on it in games

Same for the 2700X.

>2600X
>Housefire
Lol. Just fyi, X series AMD chips have a 20°C offset.
>But muh games
Enjoy your 5fps then I suppose. Oh and your paste TIM. And already outdated socket.

lol my 8700 will last me for another 3-4 years, who gives a piss about sockets? Buy a good thing once and use it for over half a decade.

I picked up a 4690K for $230 back then, new from Newegg. You could have gotten a 4570 for $170-$180 back then; the locked 4690 was around $200-$210.

The 8350 was NEVER a good value at anything unless you got it for under $150. Nothing then, and still hardly anything now, really uses more than 4 threads; 6-8 tops, and the stuff that can use 6-8 doesn't really max it out. It was garbage, plain and simple. Only true AMD fans, or people not wanting to admit they bought shit, bought it.

I always regretted buying mine; that's why I replaced it with that 4690K. It was a pure night-and-day experience.

I had the FX-4100/6300/9370 back when I was an idiot. I popped my motherboard's VRMs and ended up getting a 4690K. Was great. Although the sheer amount of heat FX was capable of pumping out got me into custom liquid loops, and that was a really fun experience, so I can't say I completely regret it.

>I was an idiot. I popped my motherboards VRMs
To be fair, even though I think FX CPUs were aite, AMD should've never allowed 4-phase boards to be used with the 8-cores. They are 8 physical cores each drawing their own vcore; a 4-phase VRM can't handle it at stock, let alone an OC. You need an 8-phase.

Attached: IMG_0252.jpg (4032x3024, 1.75M)

>Need an 8 phase
Lol, I had the Asus Sabertooth 990FX. I had 5.5GHz @ 1.57V on the FX-9370 with custom liquid cooling running an ethylene glycol mixture, and I was pushing for 5.6GHz but having trouble staying stable under Prime95. I set my voltage to 1.61V after the PC crashed but forgot to re-enable LLC.

Motherboard voltage was set to auto and tried to boost to 2.7V. It managed to pass minutes of Prime95 before the magic smoke came out.

My Kaveri is overclocked to 4.3GHz and it runs like shit in VR.

>2.7v
Shit, family. I've seen Buildzoid crank an AM3 Sempron to 1.95V and it died, and FX is only rated for 2.1V on LN2.

FFXV is multi-threaded and it still runs worse on an 8350 compared to a 2500K at the same clock speed.

Yes, 2.7V for about 5 seconds before it went bang. Small FFT Prime95 ain't no joke. I remember it well. I ended up claiming my warranty at Micro Center for the motherboard (claimed manufacturer defect; the bro employee at the shop was also a big dick clocker and just side-eyed me). Figured I might as well return the chip too.

Ended up leaving with an i5-4690K, a decent Asus Z97 board, and a 500GB SSD for the price of my 9370 and overkill board. The i5 performed well but was really boring to OC.

>Meanwhile there were 3+1 boards with support for the 95W FX octacores (e.g. FX 8300)

Kek, MSI 970 Krait was a mistake.

youtube.com/watch?v=8zTzpYjQ2MM

This happened.

Attached: MUH GAYMEN.png (1911x447, 56K)

I made that T9900 CPU-Z score, and owned the FX, so yeah, I believe it.

>Is this thing not impressive when it comes to price to performance
Not when considering whole-system price. I remember that the motherboard, PSU, and cooling were the bulk of the price, and at best it would perform like a locked i5 from a used Lenovo.

>cooling
>expensive
I'm on a $30 air cooler and it does 4.5GHz and hits 59°C under Prime small FFTs while performing like a locked i7-4770. It's extremely powerful for what it would've cost compared to a non-K 4770 rig. My PSU is a cheap $70 single-rail thing and my board was the most expensive part at about $130, but the 8350 didn't cost a lot compared to a locked i7.

Attached: IMG_0084.jpg (4032x3024, 2.28M)

They falsely assumed software developers would fully use multiple cores, just like Intel assumed the compiler would fully optimize Itanic.

>4.5ghz
It's not performing like a locked i7-4770.

Yeah it is, I get 730 on Cinebench R15 and a 4770 would get 734

Attached: 4.5GHz 1866 bench.png (418x720, 50K)

Intel assumed they could force everyone into switching to 64-bit and giving up all their perfectly functional 32-bit software.

Comfy, but wouldn't that case exhaust be better at the top of the case? That way it'd be helping the CPU cooler move air out.

Now do single core, and then show me all the software you use that uses all those fake cores outside of Cinebench's MC benchmark.

I did do the single core, it was kinda shit at 109. :(

But dem 8 threads tho.

Attached: Multicore.png (422x460, 43K)

FX IS ETERNAL

I use an FX-8350 as my home server CPU and it's fantastic for that purpose. Underrated CPU, people think it's trash because it sucks for gaming.

>cpu fan pulls heat off gpu and pushes through heatsink fins
user you may have the dumb.

That literally says a 3770, not a 4770, is better than your chip. Are you genuinely retarded?

A fan up there would be nice, but my 300R has an open top, so the warm air rises out of it.

Gonna have to check those trips of truth. Pair a used 7970 with it and it can still game on high settings at over 30 FPS.

It really is the poor man's Intel Xeon, basically a server CPU brought to the desktop, much like a Sandy-E CPU, great for rendering and light home server use.

It only goes on that way.

>50% lower score than 4770
>on par
Pick one and only one

>Multicore doesn't matter
Pottery.

Don't forget to OC the north bridge; you can play BF1 MP at ~60 Eternals per second.

>But dem 8 threads tho
Those 8 threads were virtually unusable for most of its life. I remember having issues with KVM, FreeNAS, and whatever else I tried to use to put those threads to work.
Maybe it's different today, but these days I won't even turn on my FX and Phenom II systems.

>mfw I use an R7 1700 just for the SMT
Your chip is just shit and highly dated.

Attached: 1528785213742s.jpg (250x237, 6K)

FX multi-core never mattered, especially during the time period when the CPUs were released.

>It only goes on that way.
Incorrect. You unscrew the fan, reverse it, screw it back on, and flip the entire thing 180 degrees. This is assuming you somehow bought a heatsink that can't be mounted sideways, which I've literally never heard of.

This. Intel knew software utilized just a couple of cores at most, and the vast majority just a single core. That's why we saw so many dual- and quad-core Intels with high clock rates.

I do use a 225 NB when OC'ing to get that memory controller going faster.

It's really a shame things didn't support moar-cores CPUs back then, because the overall compute power is what makes an 8350 shine compared to a weak-as-fuck 2500K. Even a 4.8GHz 2500K is still slower than my stock 8350 overall, but it gamed better.

AMD is the same company that brought 64-bit to the desktop even though the mobos maxed out at 2GB of RAM. They brought moar cores to the mainstream; Ryzen is also moar cores with good IPC, and now Intel is cramming in as many cores as they can with a high-ass TDP, but it's okay because it's Intel.

I shit you not, it's not like the 212 Evo where it can be rotated; it uses a stock-cooler-style mounting bracket with a lever.

Software has only recently started to benefit from multi-core, and FX is still shit in the software that finally does.

Then follow my steps regarding the fan. At the very least it will be pulling and the intake will be further from the gpu.

nigga wat?

Mate, the FX is better at (or at least it can run) BF1 MP, in contrast to a much newer and way more expensive i5-7500.
And it's not like the FX couldn't provide more than 60 FPS in the 4-core days.

Attached: 1547980005550.png (645x729, 105K)

>It's really a shame things didn't support moar cores CPUs back then because the overall compute power is what
Well, it's not like more cores would equal more performance automatically. There's only so much you can do. I forget who I'm quoting, but a certain computer scientist claimed it's akin to trying to have a baby faster by impregnating multiple women. It just won't work.
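
The limit he's gesturing at is Amdahl's law: if only a fraction p of a program can run in parallel, n cores can never speed it up by more than 1 / ((1 - p) + p/n). A tiny C sketch with made-up example fractions (illustrative numbers, not measurements of any real game):

#include <stdio.h>

/* Amdahl's law: best-case speedup on n cores when only fraction p of the
 * work is parallelizable. The serial part (1 - p) never gets faster. */
static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    double p[] = { 0.25, 0.50, 0.75, 0.95 };   /* illustrative parallel fractions */
    printf("parallel share   4 cores   8 cores\n");
    for (int i = 0; i < 4; i++)
        printf("      %3.0f%%       %5.2fx    %5.2fx\n",
               p[i] * 100.0, amdahl(p[i], 4), amdahl(p[i], 8));
    return 0;
}

A program that's 50% parallel gets about 1.6x from 4 cores and only about 1.8x from 8, which is roughly why doubling the core count never doubled frame rates.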

>BF1
Oh I accept that I'm wrong then. FX is the superior choice.

They're a good option now if your board supports upgrading to one. An OC'd 8350 will beat an OC'd 1090T, for example. My board is a 990FX v1.0 chipset, so it can handle higher OCs. The Piledriver chips have extra instruction sets and features compared to the Phenom IIs. You can actually get on YouTube and watch benchmarks of it from different sources. GamersNexus also tested the Phenom 1090T/1100 and the FX-8350 shat all over it. Then again, this was done in 2018 and the FX series has had time to mature, with more stuff being suited for multi-core use and all. IMO if you're broke as fuck, have a compatible board, and you're still on Phenom II, popping in an FX-83xx series chip is not a bad option, especially if you can give it a mild overclock.

They probably didn't bench ARMA and DCS.

What's wrong with what he said?