Bulldozer hate thread

>8 "cores"

Attached: 220px-AMD_FX_Logo.png (220x186, 37K)

>4 cores
>performs worse than a dual core
>8 cores
>performs worse than a quad core
>IPC worse than Phenom

Attached: 1482574053190.jpg (267x323, 7K)

>AMD was so dogshit that AMD fanboys were forced to buy Sandy Bridge only to start shilling for AMD when Ryzen came out
lol

literally aging like fine wine

now those 2500k upgrade threads make sense

I am making a 32 core system with these bad boys. I am expecting a warm winter.

Attached: 220px-Opteron_logo.png (220x186, 33K)

There's not a single CPU on the market right now that performs worst than an 8xxx. And the 4790k still rapes the fuck out of FXs.

Sounds cozy.

Why would someone do this?

>worst than
Go back to India.

It actually does

Everyone who owned the 2500k/2600k eventually upgraded to Haswell, Devil's Canyon and Skylake

The ones who stayed at Sandy Bridge are most likely AMDrones who can't walk the walk with Faildozer and Faildriver

You are seeing the same thing right now with Nvidia: only the most hardcore of AMD fanboys are buying Vega

It will perform better than a modern 8 core 16 thread CPU on multithreaded tasks. Obviously it will output more heat but I won't be compiling 24/7.

>2500k/2600k eventually upgraded to Haswell, Devil's Canyon and Skylake
literally why would you upgrade a quad-core CPU with high clocks and decent IPC to another quad-core CPU with high clocks and decent IPC? 8th gen is the first generation worth switching to.

why have two heaters in the winter? because it gets fucking cold dipshit

Now that the market has matured around my cozy FX-8320E, it feels fit for purpose.
Intlel shills on suicide watch after their master's incompetence: failing to deliver 10nm, having to outsource desktop chips, and getting their hardware security vulnerabilities exposed as the sauce behind their "muh 10% IPC gainz are superior"

PS Faggot OP is a colossal faggot

I bought one of those in 2012 to replace my Core 2 Duo system from 2007. I RMA'd it in a couple of days when I discovered that my overclocked Core 2 Duo system was faster than it for single-threaded tasks and tasks that only used a couple of threads (which is like 99% of the work that I do)

nice try, pajeet.

>buying stuff before researching
Or more likely:
>lying on the internet to make a point about obsolete processors

Not him, but an overclocked Wolfdale can achieve Bulldozer-tier single-core performance.

I'm willing to admit my FX 8350 is the shittiest thing since a loaf of bread sliced lengthwise or circumcision, but it's served me well up until recently. I'm planning to retire the old thing soon in favor of a Zen 2 CPU when those come out.

idk man, my FX6300 is very comfy

I buy AMD because I hate Intel and Nvidia. If another company had a better alternative I'd ditch AMD in a heartbeat.

AMD dropped the ball hard on that generation.
Their phenoms were awesome before.
I made a Nehalem build back then while my friend made a Bulldozer one. We both still own the same PCs, but the difference in performance is retarded. The worst part is that he now hates AMD with all his guts, since the reason he went with the FX-8350 was that he thought saving 120 bucks was a good deal at the end of the day, which only made him regret his decision even more.
He wanted to make a new build and I told him Ryzen was nice overall, but he said he's never giving them money again. He became bitter as fuck at that company lol.

I stayed at SB because I don't need more.
It's now been sitting at 4 GHz with the lowest stepping level (800MHz I think) disabled with a CM whatever evo+ cooler for over three years. No issues.

it's not, stop lying to yourself

based nazi

Bruh, I still use my 2500k today because I need nothing better. Clocked at 4.6GHz, it doesn't bottleneck my GTX 1060, which is also overclocked. Plus DDR3 RAM is dirt cheap: I got 32GB of 1600MHz RAM for only $90, while 32GB of DDR4 would cost me $270

vishera master race

Attached: Screenshot from 2018-10-31 08-02-30.png (734x496, 95K)

I'm just saying Ryzen is more housefire than Coffee Lake if you compare them both overclocked and idle

I know that. That is why I wondered why he bought a CPU before checking benchmarks and reviews from overclockers, which led me to doubt he actually bought a Bulldozer CPU and to suspect he's just making shit up.

Sure, so is my 3GHz Pentium D.

2x more cores and similar temps, I wouldn't call that a housefire.

I got into PC building for the first time around Skylake, bought the 6600K thinking how powerful and future proof it is, only for Ryzen to drop some six months later and Coffee Lake shortly afterwards. Fuck me.

Richland is Piledriver-derived, but my A10-6600 is still fine. Can't justify throwing out my CPU, motherboard and RAM for Ryzen yet.

Still using my 8350
Haven't played a game that struggled; emulation also runs well.
Where exactly do you have problems with an 8350?

how much is your electricity bill?

>6600K performs worse than an i3 8350K
AMD surely put Intel in check

So does a mate of mine. He threw a massive overkill Vega 56 at it, though.

Lies, unless you are playing on medium and don't have a current-gen GPU.

Muh nigger. Whack it up to 4.7 niggahurts, pack in a 290(X), say "to hell with the power bill", and gaymen are still silky smooth.

>Haven't played a game that struggled
Most games won't struggle if you aren't running on high/ultra or something like that, but FPS will surely be lower than on current CPUs.
youtu.be/3ZR-3r-RTfU
>emulation also runs well
Bullshit, emulation requires good IPC and is pretty shit in general on FXs. Unless you are not aiming for accuracy and are just emulating stuff from gen 5 and before.

Why are you so antisemitic? What did Jews ever do to you?

feels good man

Attached: Capture.png (804x452, 37K)

Ah, I don't knock 'dozer. I have a shitty A6 APU desktop for me mum and another one in a shit Dell 1366x768 meme laptop. Absolutely unkillable, but terrible performance even when they came out.

it's not so much that an 8350 struggles, more that it underperforms for what you pay and the energy it requires. it'll be harder to achieve a locked 60fps with an 8350 than with, say, an i7 2600k

>that Deus Ex
shit is stuttering like there's no tomorrow

i don't recall the fuckboy wars being this bad around SB launch
>own 2500k
>buy haswell
>being such a goodest goy
face it, even a /v/ tard does not need more than a 2500k right now.
I dumped mine on a family member, who plays only the shittiest Blizzard games out there at >60fps no problem.
high DDR4 prices make it even more unattractive to upgrade now, might as well wait for Poozen 2

I have a FX-6300 which is perfectly adequate for my needs, and was cheap.
I have no reason to hate it.

Attached: 6756876.png (221x98, 26K)

nice meme bro, can I save it?

Jew out and steal it anyway.

> AMDrones who can't walk the walk with Faildozer and Faildriver
Interesting. I did "upgrade" from an FX-8320 to an i5, only to upgrade again to an i7-2600K because the cheapest i5 couldn't cut it. I wanted to boot from NVMe, which was impossible (or so I thought) with BIOS.

I still have an 8150......

I actually put together a desktop PC for my (currently ex) girlfriend's brother with a fucking Bulldozer.
Didn't know any better back then; my Athlon was great back in the day, so I thought AMD was still best for gaymes.

He asked me what to upgrade on his PC because he couldn't play triple-A gaymes at MAX settings, just before I broke up with my GF.
Oh well, hopefully she'll find another tech illiterate like me to fuck up her brother's PC. xDD

God. I wish I was dead.

It's easier to say which one isn't a Jew.

Stop, it already hurts enough being stuck with 4c/4t and unable to justify an upgrade.

>Most games won't struggle if you aren't running on high/ultra or something like that
That's the OPPOSITE of how it works. CPU load stays pretty much constant regardless of what settings you run the game at. Most games will run pretty much identically on ultra settings with any relatively modern 4c/8t or better CPU, because the GPU will be the bottleneck unless you have a 1070 or better.
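A crude way to see that argument, as a toy model in Python (every millisecond number below is made up for illustration, not measured): per-frame cost is roughly the larger of the CPU's and the GPU's frame time, so raising graphics settings mostly moves the GPU number and only hurts once it passes the CPU number.

```python
# Toy bottleneck model: effective FPS is capped by whichever of the CPU or
# the GPU needs more time per frame. All numbers are illustrative.
def effective_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 14.0  # pretend the CPU needs ~14 ms of game logic per frame (~71 FPS cap)

for preset, gpu_ms in [("medium", 8.0), ("high", 12.0), ("ultra", 20.0)]:
    bound = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{preset:>6}: {effective_fps(cpu_ms, gpu_ms):5.1f} FPS ({bound}-bound)")
```

In this sketch "medium" and "high" give identical FPS because the CPU is the ceiling; only on "ultra" does the GPU take over.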

>he don't talk good muh favorite company therefore he jew!

Attached: 1525082832211.png (645x773, 87K)

I'm perfectly happy with my 8350 to be quite honest. It overclocked like a beast and now that more and more software is getting "real" multi-core support it is pulling its own weight even better. Even in video games, I have an oculus rift connected to this system and it works just fine.

That moment when you get fucked by AMD Radeon.
I really like your friend.
I hate AMD to the bone.

Sure, because shit like "view distance", "character density", general physics related options and some other post processing doesn't exist, right? What you said is only true for resolution, since that's totally dependent on GPU power, but ultra presets will 100% impact weaker CPUs. And I'm not even mentioning frame times.

>Sure, because shit like "view distance", "character density", general physics related options and some other post processing doesn't exist, right?
They do, but the impact is minuscule compared to the performance hit the GPU takes.

I use it at stock speed
I haven't played a single game that lagged on me. But I'm not aiming for 60 fps so there's that.
Regarding emulation, every single core in RetroArch works fine. Cemu works well with the games I play, and so does Citra.
Only RPCS3 has some issues, but even high-end CPUs struggle with it

>I use it at stock speed
Impressive. My mate has his 8370 (I think?) at something like 4.4.

The 8350 and the other Piledriver 8000s were/are comfy as fuck.
Was it the best-performing processor? No. But you didn't buy it for that.
I bought the 8350 because I knew it would provide ENOUGH performance for 5+ years without breaking the bank.
Today, you can still run the 8350 and get adequate gaming AND application performance, whereas the equivalently priced Intel processors, which were superior then, are practically unusable now.
AMD saw that the future was multithreading and increased core counts, and they made a CPU to match it. Sure, it had compromises (they weren't "full" cores), but it created an amazing price-to-longevity ratio.
And people who bought the 8350 for that price-to-longevity have had their investment pay off.
Sure, it's probably bottlenecking medium-tier GPUs now, but it's fine and dandy if you're doing 1080p.
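For a rough feel of that "more slow cores vs fewer fast cores" trade-off, here's a sketch using Amdahl's law; the 1.25x per-core advantage for the quad and the parallel fractions are assumptions for illustration, not benchmark numbers.

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the fraction
# of the workload that actually parallelizes across n cores.
def throughput(per_core_speed: float, cores: int, p: float) -> float:
    speedup = 1.0 / ((1.0 - p) + p / cores)
    return per_core_speed * speedup

# Illustrative only: a quad with 25% faster cores vs an 8-core at baseline speed.
for p in (0.3, 0.6, 0.9):
    quad = throughput(1.25, 4, p)
    octa = throughput(1.00, 8, p)
    print(f"p={p:.1f}: 4 fast cores -> {quad:.2f}x, 8 slow cores -> {octa:.2f}x")
```

Only once most of the work parallelizes (around p=0.9 here) do the extra slow cores pull ahead, which is basically the bet 8350 buyers were making.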

I ended up selling my old 8350 PC for the price I paid for it (thanks GPU prices! Thanks Fortnite!) last year to pay rent in between jobs. It still had hella life in it.

Watch the video I posted and see for yourself then; the FX is on average 30fps behind and stuttering like hell. GPU isn't everything.

What games do you play in Citra? Because Zelda BotW runs like shit even with Cemuhook. And Citra can run on a Haswell i3 just fine; it's not demanding, just broken.

don't
they are shit

>whereas the equivalently priced Intel processors, which were superior then, are practically unusable now.
lol no. Sandy Bridge and Haswell still outperform Bullshit to this date no matter what shills told you. 2500k is still decent for gaming and 4790k is still a beast at basically anything.

>I haven't played a single game that lagged on me. But I'm not aiming for 60 fps so there's that.
So it's actually a piece of shit? lmao

Going to build a media center around an FX-4300
How retarded am I?

Depends on how much you pay for the 4300; it's certainly capable enough to be a media center.

45 bucks

Good news, you're only general populace tier retarded.

Very, even worse if you are going to transcode in real time. Just get an Athlon 200GE or a 2200G instead.

>the 200GE is about the same in quad-core integer as the FX-4300
I didn't actually expect that.

It's better afaik. Even the worse Pentium is better than this piece of shit.

*worst

I'm still using the 8350 and still searching for any reason to upgrade

I'm still using the Sempron and still searching for any reason to upgrade

>4790k
These are two completely different tiers of CPU. I was talking price-for-price.
> Sandy Bridge and Haswell still outperform Bullshit to this date no matter what shills told you
For single-core. The 8350 wins out in multicore.
> 2500k is still decent for gaming
And pucker your butthole and hope no background tasks interfere. It was nice not having to close all other apps to play a game.
Also, the FX 8350 shows its superiority when you use DX12 or Vulkan.
The FX 8350 and 2500K are about equal gaming-wise using DX11.
Especially look at shit like BF1, where with DX12 the 6300 competes with the 2500k (and the 6300 was nearly half the price).
I admit -- BF1 is an extreme example, as it is particularly good at utilizing cores. But it demonstrates that the 8350 is taking the cake in modern games and will continue to do so.

I figure you have two more years to hold onto the 8350 unless you buy a >60Hz or >1080p monitor.

>fx8350
>4core
>Intel needed the 4770k to beat it on kernel compile time, one year later, at 50% more price

Attached: 1419035651602.jpg (591x590, 67K)

Zackly. I can game whilst running a days-long data-crunching task for work, and running VMs. My 8350 handles it fine. I reckon I've got a few years more before I'll need a Ryzen or a Threadripper.

The 8350 was $20 cheaper than the 2500k. Sure, $20 might not sound like a lot, but get $20 cheaper RAM, a $20 cheaper PSU, a $20 cheaper GPU, and a $20 cheaper mobo along with it, and your PC costs $100 less.
That easily makes the difference between "in my budget" and not.
So when I was deliberating, the 8350 was one of several compromises between price and performance I made.
So it was the 2500k with superior single-core performance, or the 8350 with superior multicore performance.
So when the 8350 went on sale (and the Intel didn't), I snagged it. $50 cheaper than the 2500k at the end of the day.
And seeing as I'm still getting 1080p@60fps on high-to-ultra settings in all but the most recent of games after one GPU upgrade, I feel pretty good about my purchase.

>Get 2200G
Good, but it is more costly. Of course it is better to get a higher-end CPU, but I don't wanna pay gaming-rig prices for an HTPC, eh?

They are cheap enough for me, mate. This is basically a 70s car being bought for vanity.
Also dirt cheap DDR3 RAM, tons of it.

I used to have a 6100, can I join the party?
I ran it for like 4 years at 4.2GHz on the stock cooler

>all the fucking bulldozelets in here

Get on my level bitches, ride the 9590 piledriver.

epic waste of money.
I plan on buying one someday just for the kicks.

>mfw bought vishera
if only i knew how it would be

Attached: 667787717545609707.png (500x500, 59K)

phenom II x4 965 BE here

All FX chips are dogshit. I got an FX-6100 FOR FREE and it performs worse than my 4-core, so I kept this one. AMD was so fucking pathetic, but the Intel shit wasn't worth upgrading to either.
Probably gonna go Ryzen with the 3700X.

Didn't some guy sue AMD for falsely advertising 8 cores when there were really only 4? What happened to that lawsuit?

Reminder that a retard from Jow Forums named Tony Dickey bought not one but TWO FX 9590s, then tried to sue AMD for false advertising, claiming that the module did not contain real cores.
Reminder that this case was dismissed.
Reminder that a NEET from Jow Forums got laughed out of court.
Reminder that Tony Dickey is the retard who made all the threads about Bulldozer not having real cores.
Reminder that the module design does in fact contain two integer cores; even if they suck, they're still cores.
Reminder that NEETs get blown the fuck out when they try to bring internet shitposting into real life.

Attached: BulldozerHotChips_August24_8pmET_NDA-6.jpg (1500x844, 240K)
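
If you're curious what your own kernel reports, here's a minimal sketch (Linux only, reads /proc/cpuinfo; how the kernel groups a Bulldozer module's two integer cores varies by kernel version, so the counts show the kernel's view, not a ruling on the lawsuit):

```python
# Count logical processors vs distinct (package, core id) pairs as reported
# by the Linux kernel in /proc/cpuinfo.
import os

def core_topology(path="/proc/cpuinfo"):
    logical = 0
    cores = set()
    pkg = "0"  # default when "physical id" is absent (e.g. some VMs)
    with open(path) as f:
        for line in f:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if key == "processor":
                logical += 1
            elif key == "physical id":
                pkg = value
            elif key == "core id":
                cores.add((pkg, value))
    return logical, len(cores)

if __name__ == "__main__":
    logical, physical = core_topology()
    print(f"logical CPUs: {logical} (os.cpu_count() says {os.cpu_count()})")
    print(f"distinct core ids: {physical}")
```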

based vishera poster

𓂸 𓂸 𓂸 𓂸 𓂸 sage

Overclocked FX-8350 keeps my room warm during the winter, with my window air conditioner still installed but not running.

This to be honest

My FX6300 served me well for maybe 4 years, then I sold it for 60% of what I paid for it. Bulldozer was based

I'm still using my FX 6350 @ 4.7GHz. Will retire it to my bedroom PC once I upgrade to Ryzen