Bulldozer shall return

God wills it

Attached: FX.jpg (600x509, 46K)

Most slept on CPU lineup of the decade desu

Attached: IMG_0360.jpg (1600x1542, 422K)

Too early for its time. Piledriver was pretty nice though. I had a Piledriver PC from 2012 until recently as a backup machine, and it didn't get any slower since software (mostly games) got better multi-core support, so in the end more recent games actually ran faster than games from the time of the CPU's release.
Rather funny. AMD was actually ahead of the curve hardware-wise, just not software-wise. Now Zen fixed that.

I like how AMD hasn't changed the overall designs of their chips since like forever and if you grind down the faces you can't really tell apart a Ryzen from an Athlon 64 right away.

Yeah, it's funny how the core race caused by AMD's new architecture will actually make their older arch better as software uses more and more threads

I'm still running on FX-8370
Still works fine and gaming performance is actually improving as devs learn multithreading. There's still a lot of indie crap that doesn't scale for shit, but overall I'm still in favour of keeping it.
Maybe the 7nm Zen 2 will make me reconsider.

AMD is always pushing hardware ahead. Your average PC had no use for the x64 extension in 2003, but they brought it to the desktop anyway, and now everyone has more than 4 gigs of RAM.
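Quick sanity check on that 4-gig figure, a minimal sketch in Python (the ceiling is just the 32-bit address space, which is what x86-64 lifted):

# A 32-bit pointer can address at most 2**32 bytes.
addressable = 2**32
print(addressable / 2**30, "GiB")  # 4.0 GiB -- why going past 4 gigs of RAM needed the x64 extension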

There's also no reason not to upgrade if you have the money. I went from an FX-6350 to a 2600X and got around a 25 FPS increase in most games on average. Plus it runs real quiet and cool; games only use 50% of the available threads and it doesn't heat up at all. Feels literally like magic after the overclocked, power-hungry 6350 I had.

don't forget frametimes
fps isn't everything, smoothness and latency are

I don't game much anymore
I have an RX 580 and play at 1080p, so the bottleneck isn't dramatic either
I agree that the frame times could use an improvement, but I'm holding out for 7nm, which looks like an impressive jump

Same, though mine is an FX-6300. I might switch when the Ryzen 3000 APUs come out. I don't need a discrete GPU, and even the current APUs are really good, even for gaming. I really hope the 3000 APUs will beat that

IT NEVER DIES

Attached: ite.jpg (167x147, 33K)

>even latest mobos still come with the chip
IT'S NOT EVEN ITS FINAL FORM

what is this chip

same here but on an overclocked 6300. as long as current consoles exist, games will work fine. i will probably wait for socket am5 to upgrade. by then ddr5 will be there along with pcie4

Upgraded recently
~120% better performance in ST
Chrome pages load 70% faster

Attached: 12321414123515.png (2018x862, 242K)

could have gone ryzen
but nope, had to go security flaws

using a based FX6300

Attached: self intoxicated, twisted detonator.png (1131x1006, 314K)

It's an I/O controller chip that's been on tons of mobos since the old days.

Checked, nice fuckin chip, I need 1.44v to get 4.5GHz AVX stable and it will throttle, so I settled with 4.4.

Attached: NB OC.png (1128x577, 245K)

The shared FPU and low general IPC is quite bad.

The IPC is terrible, but throw a rendering workload on it and it's incredible, and it cost less than an i5-3570K while gaming decently.

Attached: 4.5GHz 1866 bench.png (418x720, 50K)

Yeah but I was also cooling the socket.

Attached: anti-housefire fan resized.jpg (997x748, 485K)

My VRM is super chunky and it never gets hot. My chip will hit 65C and throttle with the multiplier OC, but with the NB OC it won't throttle at 4.5 and hits 68C.

that was my CPU until i bought a new PC. Got a 1700x and love it.

lol, the fucking FX chips were not good for an ITX build unless you basically took off all the panels of your case, the fuckers were blazing hot! I don't miss that.

To give more context pushing 4.7ghz meant turning my chip into a 200w monster and my mobo was never rated to handle it - I had previously killed a 970 board via overclocking.

That's an R4.

>FX is great guys

Attached: 2018-10-13_144739.png (516x646, 36K)

it's only kind of toasty

Attached: 1517538163695.png (752x528, 41K)

You on crack?

> as long as current consoles exist, games will work fine.

Keep telling yourself that

FX at stock is not hot at all, but you go past 1.44v on that bastard and a Hyper 212 isn't even enough

1.38v and whatever the fuck multiplier you can get stable is the best for air.

You gotta get that based 990FX-UD3 that runs crazy voltages and doesn't even die.


>Extremely expensive highest-end 6-core made years later is beating a not-even-highest-end FX

650cb or whatever the FX gets stock is still respectable, and it was great in 2012

The MSI boards seem to not give a fuck when OC'd and won't throttle at 65C like my Gigabyte does

Attached: Stock FX-8350 speccy and benchmarks.png (1498x750, 354K)

>FX at stock is not hot at all

LIES
my UNDERVOLTED FX 6350 was a damn furnace
that was the prime reason I had to upgrade
was literally forced to, couldn't stand it making my room hotter all the time
there is something broken with FX power states or something

>and it was great in 2012
let's be honest, it was never good or great

Attached: 2018-10-13_144651.png (498x543, 29K)

>tfw a core 2 duo is better clock for clock than Vishera
Damn user, what cooling was it on though? my stock cooler 8350 was hitting 63C.

Attached: Why the fuck everything single threaded.png (800x522, 265K)

I regret getting the i5-4670 over the 8350 to this date.

next update will be ryzen 7nm

Some online stores are still selling 8xxx FX models. Shit's never old. I'd go for it if it had PCI-e 3.0

Haggis man was right.

Attached: Gamegpu Prey cpu test.png (819x532, 49K)

Still happy with this baby. Don't game much anymore, only older titles (1996-2k4). Got an FX-8300 in my main server though that does quite well. Shit, if your computer still does fine, hang on to it. Let others waste the money; you sit back and watch it pile up in the bank. At first $100 ain't much, but over time $100 per month x 5 yrs = $6K before interest. Double that to $200 per month plus tax refund(s) and you could have a nice big pile in 5 years.

Attached: Speccy.jpg (894x588, 262K)
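The math in that post checks out; a quick sketch with the poster's own numbers:

# Bank the upgrade money instead: $100/month over 5 years, then doubled.
monthly, years = 100, 5
saved = monthly * 12 * years
print(saved)       # 6000 -- the "6K before interest"
print(saved * 2)   # 12000 -- at $200/month, tax refunds not counted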

zalman cnps10x performa
I use it now on this 8700k
waiting for liquid metal in mail to delid

Attached: 00046c888367306c194f650bbae335e528a57b2b2e90f74f679ebc776c5d7a28.jpg (800x600, 57K)

>intel
>upgrade

Attached: homr hapi.jpg (426x436, 36K)

yeah, had to swallow the Jew pill
but am alright now

>Got an FX-8300 in my main server
That's actually a really good use for an FX processor, the AM3+ platform has great io.

That's weird it would be spitting out heat like crazy, cause I don't even feel it going from a 95w Q6600 CPU to 125w or more when it's overclocked.

I even had shittier cooling than my Hyper T4 and never had problems.

Attached: 2017-09-26-496.jpg (4000x3000, 2.22M)

as weird as it was
I was living with it for 6 years
and it was enough

I was still running an fx 8350 and 480 until just recently when I gifted it to my dad so he can play World of Warships.
A little hot, but it still runs well for being six years old.

I feel like Bulldozer has a soldered IHS, may want to double check.

It is soldered, the glorious bastard.

>had to
>implying
Enjoy that circumcised dick you're sucking on.

used to have an 8-core bulldozer opteron that i got for 30€ when the fx 8350 was still 100€+, decent chip

That's actually pretty good for the money, and the tdp is really low on those with the default settings which is good for server use.

>stuck with 8350 for 4 years
>tfw the ryzen redemption arc

Attached: cry.jpg (186x356, 10K)

I'm not in bulldozer gang anymore
I've been embraced by our jewish overlords
they say I must delid

Attached: 2018-10-02_174059.png (990x853, 173K)

I got an 8350 for $99 and paired it with a Noctua and got it to 4.7ghz stable. Best CPU I ever owned and about to upgrade to a Ryzen

If you got good silicon and cooled it well you could pull off some serious shit

>dual channel unknown
It's 2018+10 months and 13 days fix this shit.

>he fell for the dozer meme
>he fell for the jewtel meme
>he fell for the ryzen meme
lmaoing at ur lives

Attached: Capture.png (279x59, 2K)

>not VIA

u dun goofed

Super I/O: fan control + temp sensors,
also a bunch of extra GPIO that motherboard manufacturers can use.
Some control RGB too.

Also if you want to learn more - github.com/herocodemaster/it87
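If the it87 kernel module (like the fork linked above) is loaded, the chip's sensors show up through the standard Linux hwmon sysfs interface. A minimal sketch in Python; which hwmon node and which temp/fan inputs exist depends on the board:

import glob, os

# Walk /sys/class/hwmon and dump readings from any IT87-family node
# (the driver registers chip names like it8728 or it8620).
for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
    with open(os.path.join(hwmon, "name")) as f:
        name = f.read().strip()
    if not name.startswith("it8"):
        continue
    for path in sorted(glob.glob(os.path.join(hwmon, "temp*_input"))):
        with open(path) as f:
            print(name, os.path.basename(path), int(f.read()) / 1000, "C")  # millidegrees -> C
    for path in sorted(glob.glob(os.path.join(hwmon, "fan*_input"))):
        with open(path) as f:
            print(name, os.path.basename(path), f.read().strip(), "RPM")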

Why?
Ryzen generation is much better
Stop being this autistic

It's a shame Steamroller never got a full FX chip release.

Only CPU without a "management" engine and the only CPU that can still be run with a legacy BIOS.

An FX-8370 overclocked to 5GHz is about as fast as a Ryzen 5 1600 in multi-threaded workloads

It's only bad due to the crappy FPU.
Integer performance was quite good, better clock for clock than Intel.
At least with my own testing of a 5400K, Core2 E3300 and a Core i7 4820K.

But Ryzen is basically Bulldozer Reloaded, that is, modules made out of a few relatively small cores.
Except they did away with the awful crucial-component sharing.

>legacy
That's what's so fascinating about the FX series: they're so old school yet modern enough. The chipset only has native USB 2.0, there are even Windows XP drivers for the 9-series chipset, and HD 7000/some R9 200 cards support XP. It just existed in a weird time, 2011, when XP was still relevant.

youtube.com/watch?v=ZsIMd8TUUGg

If each core had its own FPU, FX would be a motherfucker. My single-core score of 109 times 8 cores would be 872, but the IPC gets about 20% worse when the FPU is loaded by two cores.
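Back-of-the-envelope on that scaling claim, using the numbers from the post above (the 20% figure is the poster's estimate, not a measured constant):

# Ideal scaling vs. the shared-FPU penalty described above.
single_core = 109      # single-core score from the post
cores = 8
fpu_penalty = 0.20     # ~20% loss when both cores of a module load the FPU

ideal = single_core * cores
shared = ideal * (1 - fpu_penalty)
print(ideal, round(shared))   # 872 vs ~698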

>but you throw a rendering workload on it, and it's incredible

LMAO. You call 15% less performance than a real quad core without SMT from the same generation "incredible"? I would also like to note:

>15% less performance than a 4 thread processor
>literally a ghz faster clocked.

This fucking chip would only even MATCH the i5 with a 5.5GHz OC on an Intel-brand refrigerated cooling setup.

Oh yeah, and on top of all that, they're the same price.

Bulldozer was good in some Linux tasks, mostly crypto, but that was it. It was garbage at everything else and a retardedly flawed architecture.

Sort of in the way that GCN is also flawed. IDK if you can remember, but back in 2011, just one year before Bulldozer, AMD's new GPU architecture was a fair jump at the time. But Nvidia blew it out of the park with 28nm and had to create a new tier of flagship, the Titan, because their main chip (what would previously have been the 580's successor) blew AMD out of the water so hard that at that price point they would have been losing money. Imagine if next year the 3080 Ti is 4x the performance of Navi, so Nvidia rebrands the 3060 into the 3080 Ti and moves the bigger chips up a brand tier.


AMD was shit until Zen; they strayed way too far from their original successes. Phenom II was literally on the heels of Nehalem, with better multi-core and worse single-core due to clocks (sound familiar???), and AMD, being retards, decided that the better way to compete with Sandy Bridge next generation (on a newer node and a redesigned arch) was to also redesign their arch, but instead of mimicking the single-thread performance of Sandy Bridge, they changed the core completely into something radically new. Though they had a good process that clocked well, Bulldozer just didn't have enough IPC to make up for any amount of cores. And it was too power hungry for mobile.

Attached: amd.jpg (800x1000, 183K)

Ryzen's design is entirely based around being as cheap to manufacture as possible when scaling from laptops to monster server deployments. MCM hits all kinds of software and latency hurdles, but it makes physically bolting shit together extremely easy.

Even now the FPU is Ryzen's weak point - Intel has 2x 256-bit AVX units where AMD only has 2x 128-bit units.
For modern software using AVX2, Intel literally has a 2x throughput advantage (since AMD needs to combine the two 128-bit units to do one 256-bit AVX2 op, where Intel can push 2 AVX2 ops, provided the software can feed that).

Zen2 arch isn't really known yet, so hopefully we'll see changes there.
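The 2x figure falls straight out of those unit widths; a rough peak-throughput sketch assuming each unit retires one FMA (2 FLOPs) per cycle on 64-bit doubles:

# Peak FLOPs per cycle per core = units * (vector width / element width) * 2 (FMA).
def peak_flops_per_cycle(units, width_bits, element_bits=64):
    return units * (width_bits // element_bits) * 2

intel_avx2 = peak_flops_per_cycle(units=2, width_bits=256)  # 2x 256-bit FMA -> 16
zen1       = peak_flops_per_cycle(units=2, width_bits=128)  # 2x 128-bit     ->  8
print(intel_avx2, zen1, intel_avx2 / zen1)                  # 16 8 2.0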

It seems AMD keeps struggling with the same elements across the past few architectures: the FPU and the memory controller. In both Phenoms and Bulldozers you literally have to overclock the on-die NB by like 25-30% to get the most out of the memory controller.

Also, I would like you to imagine a world where AMD took Thuban, die shrunk it to 28nm, and threw on SMT. You would have had chips running just below Sandy Bridge IPC, with 6 cores / 12 threads, at the same power and same price as the 3570K. Sounds pretty familiar, huh? AMD could have actually had chips that were better multi-core value than Intel, but instead they shit in their hands and clapped

This. If they could get Thuban to run at 4.2GHz without 1.5v, people would've loved it, because that's 6 real cores performing like an FX-8000 CPU at 4.2.

AMD chose 2x 128-bit this time for power reasons.
Intel's cores are so power hungry due to the 256-bit AVX, and it's why they have an AVX offset to lower the clock speed in AVX workloads (where AMD doesn't need this, nor did Intel prior to Haswell and AVX2)

I imagine there was some reason they chose not to do that. The K10 architecture was just old and outdated, and it really was just a rather slight improvement over K8, more like K8++ than K10, so really, really old. They must've felt it was a dead end at that point.

>they could get thuban to run at 4.2GHz

Dude the 28nm they used for bulldozer was the only reason AMD stayed alive back then. It clocked so well that AMD literally only stayed alive from people buying 7970s and 8350s and overclocking the shit out of them. A die shrunk thuban on 28 could have easily been overclocked to 5.2 like bulldozer but with its better ipc it could have taken the single thread crown from ivy bridge.

Keller and co had K9 planned out.
Interns couldn't make it work so they scrapped it.

One of the features was pretty much intel's TSX.

>muh avx

NYET COMRADE. AVX is intel trying to hold off the rapid advancement of gpus.

8320e and an rx470 8gb. Plays fucking everything at 1080p just fine.
I'll upgrade in 2-3 years

Had mine since release. Never had a problem. Don't know why it gets so much hate

Attached: Capture.png (1031x617, 44K)

>8700k
>1080p60
Fucking why would you meme yourself so hard? The only remaining justification for buying Intel is high refresh rate gaymes and you're obviously not into that.

this is 100% correct.

If you are at a refresh rate of 60hz there is no difference
if you are at a resolution lower than 4k there is no difference
if you care about doing anything other than gaming at all, intel is worse
if you care about security at all, intel is worse
if you care about future-proofing / upgrade path, intel is worse

Even gaming there isn't much of a real difference anyway. Intel just fucking blows.

>Modules

>bingbus

really wish AMD used LGA on their consumer CPUs

I don't doubt they could put more powerful graphics chips on the next APU lineup, but for people like me who have already pretty much maxed out the raw performance of the iGPU, I'd be way more interested to see if they could integrate any dedicated video memory in the next series. If the next iGPU is 10% more powerful but they include some high-bandwidth memory, that'll net far more than 10% extra performance in games. DDR4 is just too much of a bottleneck to push it much further

Attached: HWiNFO64_2018-10-13_22-37-13.png (374x156, 11K)
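Rough numbers behind that bottleneck claim; the DDR4 speed and the GDDR5 comparison point are assumptions for illustration, not from the post:

# Peak bandwidth = channels * (bus width in bytes) * transfer rate.
def peak_gb_s(channels, bus_bits, mt_per_s):
    return channels * (bus_bits / 8) * mt_per_s / 1000  # MT/s * bytes = MB/s, /1000 = GB/s

print(peak_gb_s(2, 64, 3200))   # ~51.2 GB/s, dual-channel DDR4-3200 shared with the CPU
print(peak_gb_s(1, 128, 7000))  # ~112 GB/s, a typical 128-bit 7 Gbps GDDR5 card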

No user, you don't.

Imagine a hypothetical APU that has 1-2gb of HBM as L4.

pretty sure I do

Pretty sure you're retarded user, but that's okay - it lets the rest of us know how wrong you are.

Is this a good FX?

amazon.com.mx/AMD-FX-Series-X8-8350-AM3/dp/B009O7YUF6/ref=pd_bxgy_147_img_2?_encoding=UTF8&pd_rd_i=B009O7YUF6&pd_rd_r=4944acf5-cf2f-11e8-9a8b-017d633f01ef&pd_rd_w=R0HQb&pd_rd_wg=OZX0g&pf_rd_i=desktop-dp-sims&pf_rd_m=AVDBXBAVVSXLQ&pf_rd_p=6d0fcdc4-2f82-4771-b5d6-b7ab7de9f68c&pf_rd_r=QYYKD72WXXYKH16MM0NT&pf_rd_s=desktop-dp-sims&pf_rd_t=40701&psc=1&refRID=QYYKD72WXXYKH16MM0NT

There's an offer right now, which gets you the processor and an Asus M5A78L-M Plus mobo all for $160 USD. I wouldn't mind getting a Ryzen, but right now I only have DDR3 sticks at my disposal and DDR4 is quite expensive.

Attached: 1501811821029.png (354x439, 321K)

I would literally nut myself. Benchmarks have shown that if the stock 2400G had 2GB of high-bandwidth memory, it would easily beat the 1030, and overclocked it might even get close to beating the 1050.

But I fear such a thing would also be far more expensive, though certainly still a better value than getting a separate GPU

>2018
>thinking about buying FX

you must be joking

Attached: 1231244515.jpg (1440x793, 339K)

It still is good if you're in the situation he described where he has DDR3 laying around.

Like I said, I wouldn't mind a Ryzen, but upgrading everything along with the RAM would be too expensive for me.

I own a GTX 750 Ti, saw some benchmarks in games like BF1 and other recent titles, and the combo performs quite well, which is why I'm asking if it would be a good purchase.

In your case it actually is a really good purchase because you're not gonna bottleneck that card at all.