What was worse?

Bulldozer was like NetBurst all over again.

Attached: image.jpg (1082x811, 138K)


Both were terrible; that's why smart people judge things product by product instead of being loyal to some retarded brand.

I always buy Intel. Only Intel.

Good goy

You're a good go- I mean a smart guy!

good goy

Attached: image.jpg (256x256, 30K)

Oy veh gud goy

Attached: image.jpg (719x720, 86K)

Netburst was overall useless.
Bulldozer didn't perform well in classic workloads but shone in parallel tasks and heavy I/O operations.
There's a reason it was picked as the CPU for GPGPU supercomputers.
BD powered some of the top entries on the green supercomputer lists.

Pentium 4 was a worse choice in every case you could imagine. Bulldozer was great for multicore tasks on a low budget. If you needed video editing, compiling, or 3D design, and you didn't have money for a Xeon or i7, then the FX CPUs were fantastic. The problem is they marketed it as a gaming CPU; many dumbfucks fell for the high clocks and the multicore meme, realized most games run like shit, and have whined about the Bulldozer CPUs ever since.
This. Opterons also run in the Titan supercomputer - fastest in the US and 5th in the world. However, I think they will be upgrading it to a Xeon-based rig.

this
but i'd say FX was worse relative to the other CPUs of the time

yeah, buddy.
youtube.com/watch?v=eu8Sekdb-IE

Attached: 1524352272829.jpg (321x362, 35K)

I watched that video a few weeks ago. Those benchmarks were probably rigged. TekSyndicate was most likely paid by AMD. There's no way the FX-8350 is that good.

I was watching part 2 to that video yesterday in fact, and it's most certainly rigged. In Arma 2, the FX-8350 was beating out the i5-3570K which makes absolutely no sense. Arma 2 is a game that relies on strong single core performance, which the FX-8350 does not have. TekSyndicate either made an error during testing or they were sponsored by AMD.

youtube.com/watch?v=4et7kDGSRfc

>probably
>most likely
>most certainly
there's one thing you didn't include in your evidence: CPU saturation.
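For anyone wondering what "CPU saturation" means here: when a game pegs all four cores of an i5, the extra integer cores on an FX can start to matter. A minimal sketch of how you could check it yourself during a benchmark run - this assumes Linux's `/proc/stat`, and the helper names are my own, not from any tool mentioned in the thread:

```python
import time

def read_core_ticks(stat_text):
    """Parse /proc/stat text into {core_name: (busy_ticks, total_ticks)}."""
    cores = {}
    for line in stat_text.splitlines():
        # per-core lines look like "cpu0 ...", skip the aggregate "cpu " line
        if line.startswith("cpu") and line[3:4].isdigit():
            name, *fields = line.split()
            ticks = list(map(int, fields))
            idle = ticks[3] + ticks[4]  # idle + iowait
            total = sum(ticks)
            cores[name] = (total - idle, total)
    return cores

def per_core_utilization(interval=1.0):
    """Sample /proc/stat twice and return % busy per core (Linux only)."""
    with open("/proc/stat") as f:
        before = read_core_ticks(f.read())
    time.sleep(interval)
    with open("/proc/stat") as f:
        after = read_core_ticks(f.read())
    util = {}
    for core in before:
        busy = after[core][0] - before[core][0]
        total = after[core][1] - before[core][1]
        util[core] = 100.0 * busy / total if total else 0.0
    return util
```

If every core of the quad sits near 100% while the 8-thread chip still has headroom, saturation - not rigging - explains the result.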

>Tek Syndicate
I like how you guys mention that nigger when he's saying good things about an AMD product, but every other time he's shit and Wendell is god.
Really nice mental gymnastics right there.
Also, the FX was so shit even the fucking Nehalems beat it.
Think about it: an X58 CPU beating something newer, and you expect anyone to believe a 350 beat an overclocked 3570K?

P4 - all versions.
Narrow decode width. Long latency of commonly used instructions. Replay storms. Small caches.

>350
8350

Bulldozer because AMD didn't pay off OEMs like Intel did with P4

Intel was benchmarking ahead of Bulldozer because of their usual dirty tricks: cherry-picked samples, the Intel compiler, and payola. But in hindsight we can also see that Intel cut corners in their design to get that IPC, resulting in breathtaking security flaws that can only be attributed to extreme incompetence or utter malice.

Wouldn't that make Pentium 4 worse since Intel had to cheat for it to be more successful?

During the Pentium 4 days, as you said, Intel paid OEMs not to use AMD CPUs, and they also rigged benchmarks to make their CPUs look faster when in reality they weren't.

>single thread braindead

No, P4 and Bulldozer were both turds, but Intel salvaged their situation by completely fucking over AMD through bribery. The fines they received were nothing compared to what they gained from their kikery.

AMD, on the other hand, nearly bankrupted itself with Bulldozer's shitty half-cores while Intel ruled the market unopposed for years.

P4 ran into both process and architectural (pipeline) problems; Bulldozer at least avoided that, and AMD managed to salvage the design with some clever power work in later chips like Carrizo. Bulldozer did the pipelining way better, but they set themselves on fire with the cache latency and with CMT, which introduced even more latency by stalling both cores in a module whenever the shared FPU was occupied. Netburst had all of these problems and was just a space heater.
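The CMT stall described above is in principle measurable: two FP-heavy threads forced into one module share a single FPU, so they scale much worse than two integer threads. A rough sketch of the measurement, process-based to sidestep Python's GIL - function names are mine, and a real test would also need CPU affinity (e.g. `taskset`) to pin both workers into one module, which is omitted here:

```python
import time
from multiprocessing import Pool

def fp_kernel(n):
    """Deliberately FPU-bound: a long chain of dependent double multiplies."""
    x = 1.0000001
    acc = 1.0
    for _ in range(n):
        acc *= x
        if acc > 1e100:  # keep the value finite
            acc = 1.0
    return acc

def parallel_penalty(n=2_000_000, workers=2):
    """Ratio of wall time for `workers` parallel copies vs. one copy.
    ~1.0 means ideal scaling; if the workers share an FPU (one FX
    module, say) the ratio drifts toward ~2.0."""
    t0 = time.perf_counter()
    fp_kernel(n)
    single = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(fp_kernel, [n] * workers)
    return (time.perf_counter() - t0) / single
```

On most modern chips `parallel_penalty()` stays near 1.0 (plus pool overhead); the ~2.0 figure would only show up on an actual shared-FPU design.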

>Titan
They were locked in by contracts signed during the Netburst era. The people on the project had wanted Xeons instead.

Gotta love the Intel damage control group being employed so hard on Jow Forums after the failure of their new shit trying to show off.

Well, I could never achieve stable and cool 30%+ overclocks with my P4 HT chips like I could with the FX-6300 machine. Got it around 2013 for 130€ with the motherboard, and it was still in use till last year, running all those years at 4.7GHz. Before decommissioning it, it was paired with an RX 480 and ran every modern game just fine at 1080/60 on "high/ultra" graphics.
OS performance was no different than my current Ryzen or 8700K machines.

There was a reason NetBurst was discounted forever.

Attached: 1472488016741.jpg (450x600, 66K)

P4 was way worse than BD; at least BD was competitive in its price bracket in heavily threaded workloads, even at its worst.
P4 was never competitive at its pricing, and it wasn't competitive at anything once Prescott landed.

>source: your ass
Titan's compute power comes mainly from the Nvidia GPUs. The Opterons were there to divide tasks between those GPUs, so they needed as many PCIe lanes per motherboard as cheaply as possible, and the Opterons were good for exactly that. Wasting Xeons on that task is retarded.

Of course most of the compute is done on the Kepler cards.
I certainly didn't remember the quote right, but it seems that decision was still locked in ahead of time.
>Incidentally, the Opteron processors used in the system are dual-chip CPUs based on the Bulldozer microarchitecture. We asked Sumit Gupta, General Manager for Tesla Accelerated Computing at Nvidia, why those CPU were chosen for this project, given the Xeon's current dominance in the HPC space. Gupta offered an interesting insight into the decision. He told us the contracts for Titan were signed between two and three years ago, and "back then, Bulldozer looked pretty darn good."
Source: techreport.com/news/23808/nvidia-kepler-powers-oak-ridge-supercomputing-titan

Good goy

Attached: 1523053926779.png (450x488, 269K)

>Bulldozer
i still use it and it's perfect
over the same period I've used Bulldozer, Inteltards have paid five times over for regular upgrades

...for gaymen

You are a buffoon. If you bought a Sandy Bridge over a Bulldozer, it easily would have lasted you more than 5 years.

i5-2500K and i7-2600K are still good to this day. FX wasn't even good the day it launched.

Keep pretending FX wasn't bad, buffoon.

>being this cucked.

Intel are famous for dirty tricks and payola. Face it, Bulldozer wasn't half as bad as shills made it out to be; it's the guts of both the Xbox and PS4.

fx was a turd when it was released but with hindsight and more heavily threaded workloads it doesn't look quite as awful as it did. netburst looked bad to begin with and only got worse in comparison to k7 and especially k8.

Bulldozer still good, all your intel shet is shet now
you pay for shet cose you geneticaly retarded
you will not survive

I am not cucked you buffoon. You are cucked.

Bulldozer was WORSE than it was made out to be. It was supposed to shit on Sandy Bridge and failed miserably. AMD would have been better off improving Phenom II's architecture and releasing Phenom III. FX nearly killed the company.

>being so cucked that you think Faildozer was good and is still good

The only good thing about Netburst, which everyone seems to overlook, is its FPU. Sadly, that's all it has going for it.

Netburst was worse.
Because Intel had a shit product, but still charged premium prices for it.

Hell yeah, I fucking hated my FX-8350 in my desktop, but when I made a new build a few years ago and migrated the AMD to my media server it became a champ. ZFS I/O operations and Plex transcoding love to pile on threads, and the 8350 is down for that. It'll probably still have plenty of life left when I make a new desktop build and migrate the old components downstream again.

True. At least AMD knew they had an inferior product and priced it accordingly.

both were really awful. bulldozer and its derivatives had a really long life though.

no one's mentioning this in the thread, but I'm so thankful they didn't both happen AT THE SAME TIME, like every flatmeme bloated eyesore OS today

>implying flat UIs are eyesores

Bulldozer is actually best for DDR3

My 8320E still runs fine and was cheap as hell. No complaint. Runs most games great at 1080p.

Was like 60 bucks at microcenter and came with a motherboard 2 years ago.

>tfw people finally use your updated version
feels good

What is that even supposed to mean?

Can confirm. My 8320E + RX 470 8GB runs most everything on high settings just fine.

The FX processors had to be sold for dirt cheap to be viable. It sounds like you got a pretty good deal.

Athlons and Phenoms were way better than the Pentium 4.
My Nehalem and its successors like Westmere were way better than Bulldozer and its successor Piledriver.
What this user said is it, and the thread should have ended there.
Don't be loyal to brands, because they will never be loyal to you; buy the best you can with your money and stop being a nigger.

Attached: 1518297763641.jpg (2048x1536, 725K)

>phenoms were way better than the pentium 4.
Were Phenoms ever compared to the Pentium 4? The original Phenom came out during the Intel Core 2 era.

>Bulldozer was like NetBurst all over again.
except the pricing was reasonable

Netburst was absolutely terrible. FX had some niche in multi threaded integer workloads. Still terrible as well, but not as terrible as P4.

>FX nearly killed the company.

No it didn't. Dirty jew tricks did. :^)

Rory Read was Jewish?

if you only have DDR3, Bulldozer is actually the best CPU for you

AMD Fanboy here.

I loved my FX 8350, which I ran at 4.6GHz for almost 5 years until I upgraded to a Ryzen 1700.

But I have to admit that the FX chips were actually really fucking terrible. They were okay as a development machine (running two VMs + compiling), but their single-thread performance was just absolute garbage.

>but their single thread performance was just absolute garbage.

Yes, I built a computer with an FX 8350 in late 2012 to see what the then-new AMD processor was like. My Core 2 Duo system from 2007 beat it in some of the tasks I tested it with. I ended up RMA'ing it within about a week.

P3 was even worse. They fucked up so hard trying to win the GHz pissing contest at the time that they had to recall chips.
itprotoday.com/windows-8/cpu-embarrassment-intel-recalls-pentium-iii-113ghz

I got convinced by pajeets on Jow Forums to buy an FX 8320 back in 2013

Thanks to you faggots I'm never buying another AMD cpu ever again

Attached: 1468809006326.jpg (399x388, 43K)

That's the problem: you should have used a version with fewer cores and a higher clock. An FX-6350 easily clocked to 4.5GHz+ on air cooling and beat i5s in single core while still beating any Intel quad core in multithreading.

When you buy intel in times like these you literally listen to Jow Forums again

Yeah, what the fuck was up with that? I remember seeing the 6350 having the best single thread performance of all the FX chips, how was it better than the 8350?

Sorry. I was on mostly at night. AMD manchildren ruled the days.

I bought p4 prescott in 2004
8350 in 2012

just

Was this with the Coppermine-Ts or with the Tualatins?

Same bus, fewer cores = better throughput per core
Smaller die, fewer cores = less power/better overclock

It's really logical if you think about it: the 6000 series hit a sweet spot between multithreading and single-core performance. Most people bitching about the FX series only used or heard about the 8000 series. Also, using fewer cores for a multithreaded workload made the cache problem much smaller on those chips.
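The "fewer active cores = more throughput per core" claim is easy to check empirically. A sketch, integer-heavy since each FX module has two full integer cores - the function names are my own, and absolute numbers depend entirely on the machine:

```python
import time
from concurrent.futures import ProcessPoolExecutor

def int_kernel(n):
    """Integer-bound loop: multiply-add with a 32-bit mask."""
    acc = 0
    for i in range(n):
        acc = (acc + i * 2654435761) & 0xFFFFFFFF
    return acc

def per_worker_rate(n=1_000_000, max_workers=4):
    """Map worker count -> iterations/second achieved by each worker.
    Shared cache, a shared front end, or turbo that drops under load
    all show up as a falling per-worker rate as the count rises."""
    rates = {}
    for w in range(1, max_workers + 1):
        t0 = time.perf_counter()
        with ProcessPoolExecutor(max_workers=w) as ex:
            list(ex.map(int_kernel, [n] * w))
        rates[w] = n / (time.perf_counter() - t0)
    return rates
```

A flat curve from `per_worker_rate()` means the cores scale cleanly; a steep drop at higher counts is the cache/contention problem the post is talking about.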

I don't know what netburst is.

Bulldozer is the AMD fail where people got all hyped and it was garbage.

Indeed, thank god Piledriver fixed it

Well. If buying something else is the fix you can just get Intel.

>buy this more expensive thing that barely does better for your workloads instead of being smart about your purchase and not buying something right away without research
No; holding off and waiting for several reviews instead of buying outright is how you buy shit.

It makes me wonder: are you THAT stupid?
That fag from Tek Syndicate is just for the camera. He has no clue about tech and benchmarking.
Those tests were done by Wendell.
>a 8350 beat a 3570K overclocked
I'll say it again: CPU saturation.

Piledriver wasn't much better.

Sandy Bridge and Ivy Bridge are better though.