Which retard designed this?

Attached: download.jpg (243x208, 15K)


ur mom

Poos

Some dude but he left AMD in 2012 IIRC

Intel "leaked" the design as a joke to AMD's staff. They couldn't believe it when AMD actually went for it and stuck with it for a decade.

In a business sector filled with autism, that's not fucking surprising.

Attached: UPGRADE2010.png (836x768, 17K)

It was an effort to make up for the huge process manufacturing advantage Intel had achieved. They couldn't compete on per-core performance, so they had to go at it another way and push core count. Unfortunately it just didn't work out.

Papermaster, if I remember right. Originally from around 2004 or 2005; the whitepaper for it is super old. They apparently had some early variant test chips around 2009 with wider INT cores, supposedly 3 ALUs / 3 AGUs wide. Allegedly that was too power hungry, despite performing better in serial workloads, so they ended up with the 2 ALU / 2 AGU core, sacrificing performance for lower transistor count, lower die area, and better clocks.

Many small cores is still a viable strategy for tons of workloads. ARM, however, dominates that small niche; x86 never had a real chance there.

No but seriously, WHAT WERE THEY THINKING

>which retard designed this?
Not an idiot. An idiot would save the thumbnail of a photo instead of the image.

>we want to increase total throughput per chip by using many smaller cores and clocking them as high as possible inside the power envelope

That's what they were thinking. It's pretty self-evident.
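The many-small-cores bet is just arithmetic over a power budget. A toy sketch of why it looks attractive for throughput and bad for single-thread (every number here is invented for illustration, not a real Bulldozer or Intel figure):

```python
# Toy model: aggregate vs. single-thread performance under a power envelope.
# All core counts, clocks, and IPC values are made up for illustration.

def chip_throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    """Total instructions per nanosecond across the whole chip."""
    return cores * clock_ghz * ipc

# Fewer wide cores vs. more narrow, higher-clocked cores in the same budget.
fat    = chip_throughput(cores=4, clock_ghz=3.3, ipc=1.5)
narrow = chip_throughput(cores=8, clock_ghz=4.0, ipc=0.8)

print(fat, narrow)           # the narrow design wins on total throughput
print(3.3 * 1.5, 4.0 * 0.8)  # but one thread only ever sees clock * ipc,
                             # and there the fat core wins
```

Under these made-up numbers the narrow chip wins on aggregate work but loses every single-threaded benchmark, which is roughly the FX story.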

It was me, sorry anons.

...

go drink Jim Keller's piss, maybe you'll become smarter.

It's not as bad as it looks, though. It's just that whoever designed it must have been in a 20-year coma beforehand, because the architecture was built on the obsolete assumption that integer operations matter far more than FP operations.
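That INT-over-FP assumption is baked right into the module layout: each Bulldozer module pairs two integer cores with a single shared FP unit. A rough back-of-the-envelope model of what the sharing costs as code gets more FP-heavy (the 1-op/cycle pipe widths are a simplification, not real port counts):

```python
# Toy model of one Bulldozer module: two dedicated INT pipes plus one shared
# FP pipe, each retiring 1 op/cycle. Real port counts and scheduling are far
# more complicated; this only illustrates the contention effect.

def module_rate(fp_fraction: float) -> float:
    """Total ops/cycle the two 'cores' of a module sustain together,
    given the fraction of the instruction mix that needs the shared FPU."""
    if fp_fraction <= 0.0:
        return 2.0                    # pure INT: both pipes run freely
    # A sustained rate R must satisfy R * fp_fraction <= 1 (one FP pipe).
    return min(2.0, 1.0 / fp_fraction)

print(module_rate(0.0))    # 2.0   -> integer code pays no sharing penalty
print(module_rate(0.25))   # 2.0   -> light FP still fits down the one pipe
print(module_rate(0.75))   # ~1.33 -> FP-heavy code chokes on the shared FPU
```

In this sketch the penalty only appears once both cores genuinely contend for the FPU, which matches the anecdotes in the thread: encoding-style mixed loads scaled, FP-heavy games did not.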

> some retarded CEO looks into spending right before or after the Phenom II launch and decides that R&D costs too much
> signs the new roadmap: "lol let's cut costs by not developing new arches, GPU will save us"
> investors are OK with that because EBITDA is up
> board of directors is not OK with that because their CEO is retarded
> time passes, new CEO is found
> AMD has no uArch and a very limited budget, because lol no money and no CPU engineers
> quickly, develop something scalable for pennies
> they repurpose old AM2 chipsets/sockets with the HT bus again and call them AM3+/9xx
> Bulldozer is rolled out; it's very rushed but better than nothing
Aftermath:
People laughed, but after the second stepping (Piledriver) and some price slashing it found its niche. Getting something in between an i5 and an i7 for $100 in 2013 was great. Unfortunately, its reputation was already spoiled, and probably due to the high TDP it didn't fare well in servers.

>right before or after Phenom II
I'd say around the time of first Phenom launch because Phenom II was just a fuckin' node shrink after all

I wish we had seen Excavator AM3+ CPUs

Attached: pep.jpg (1200x800, 218K)

I only really started building when Ivy bridge was out. At the time it was either
>i5-3750k for $220
or
>AMD FX-4100 for $80

Being a poor student, I chose the cheaper option because "lol they're both quad core, why would I spend more for a blue badge instead of red?" I overclocked the piss out of the 4100 and eventually moved on to an FX-6300. Then an FX-9590, because I'm retarded. I OC'd the 9590 to 5.6GHz and eventually blew up the VRMs on my 990FX Sabertooth board.

Yes, when I think about it, Phenom I had some innovations, like four cores on a single die, but it was poisoned by the TLB bug, so everybody forgot about it and looked towards Core 2 or the next AMD CPUs. Phenom II was great but didn't add much except OC potential and DDR3 support. Besides, it competed with the soon-to-be-obsolete Core 2.

Yep, the move to a 6300 was OK because it's a decent CPU compared to the 4100, but the 9590 was retarded.

well at the time I had an FX4100 and the board was a bottom of the barrel 970 board with no decent VRM section. The FX6300 was my only choice. When I finally got a really good 990FX board, I was looking at the 8 core FX chips. The FX-8350 at Microcenter was $180 while the 9590 was on sale for $220. I figured why the fuck not spend $40 more for a guaranteed 5.0GHz?

I regret my choice. Gaming was shit-tier. But my God, if you wanted to see a Blu-ray rip encode in record time while also raising the room temp by 15+ degrees, you should have seen my FX build.
>FX-9590 @ 5.6GHz
>2x R9-290X with modded BIOS @ 1410MHz (no voltage limiter)

I was able to almost overload a custom liquid loop with 2x 480mm radiators.

>FX-9590 @ 5.6GHz
At least you didn't need extra heating in winter

The same retard that thought converting server hardware into consumer hardware was a good idea.

>Bulldozer
Literally inferior to Phenom, a fucking joke.

Found the FXtard. Only a retard would buy a CPU that was worse than everything available at launch, and even worse than a dual core, in the multithread era.

Who buys AMD anymore?
Why?

Intel fired the Pentium 4 designers. AMD hired them.

AMD fired the Vega engineers. Intel hired them.

Looking forward to buying an R5 2600
Owning a PII 965BE right now

True this. Phenom II: real cores, better in single-thread tasks (aka a lot of games and applications). Ran everything with no issues at all. Single-thread or multi, it didn't bother it a bit.

FX: "fake cores", and single-thread sucked. The only reason it shined was those extra "fake" cores being put to use in tasks such as media encoding, and later in some games which were made to use more than one core.

If they'd only kept the Phenom II but shrunk it down and cranked up the clocks, they'd have had another winner. The top end was a 6-core @ 3.2GHz, but the TDP was high. More reasonable was the X4 955 @ 3.2GHz with a 95W TDP.

posting from a phenom right now. A processor from 2009 that runs half the shit it does is PHENOMenal. tehehehehe

Me too. Still runs fine today.

Attached: Speccy.jpg (894x588, 239K)

More like half a decade goy

Quad-core FX chips were literally worse than the later Phenoms in every aspect. I wonder why they didn't try to scale the Phenom architecture to more cores instead of launching this abortion.

you will NEVER know that 2500k feel

What exactly was wrong with the Bulldozer architecture, Jow Forumsoys?

This wasn't even the fastest Phenom X4 lmao

Attached: Screenshot_20190131-214520.png (720x1280, 198K)

Are they not? Also, I heard the only reason AyyyMD bought ATI was so they could create "fusion" APUs in which all major FP calculations would be outsourced to the GPU. Why would they get this "idea" if FP calculations were not important for a CPU?

True story:

Same system, swapped a 955 out for an FX 8300. Both had the same TDP (95W). The FX had 4 more cores and a 100MHz clock-speed bump over the 955 (3.2 vs. 3.3GHz). So in single-core tasks you'd expect almost the same performance.

Well, games sucked. Fired up a lot of classics: UT, Q3A, UT2004, etc. All sucked on the FX. Tried all the hacks/patches, etc. Nothing worked. Video encoding was faster, though, but with 4 more cores you'd expect that.

Dropping the 955 back in saw all games playing fine, smooth as a baby's ass. Video encoding speed was reduced, but again, nothing unexpected. Frankly, the reduction in speed was no deal breaker.

FX is now my server cpu. A task perfectly suited for it.

The frontend was good though; it's being used by Ryzen.

29°
HOW
Mine is always 40-50. 55 max

> which were made to use more than one core.
Hello, 2005 called.
Actually, I was forced to upgrade from a Phenom II X2 because it ran L.A. Noire poorly. Later on, every RAGE game ran like ass on a dual core. So saying "more than one core" after 2010 is redundant.

Mine is the 95W variant. Nothing is different between the 125W Black Edition and the 95W version other than TDP and the fact that the 95W version isn't made for overclocking out of the gate. I also use an internal exhaust blower fan to push the hot air my GPU generates out of the case.

To me, anyway, the 955 95W variant is the perfect trifecta: speed, cores, and TDP in one package. No compromises, no issues, it just works. The even CPU multiplier also means full use of your RAM bus speed, so no slowdowns there either (14 or any even multiplier = full CPU-to-RAM bandwidth; 13.5 or any half multiplier = you get almost the advertised bandwidth, but not all of it, so a slight performance hit). Put another way: DDR-400 with a 14x multiplier = full speed; a 13.5x multiplier = roughly DDR-375 effective speed.
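The even/odd multiplier effect described above falls out of integer memory dividers: the memory controller runs at the CPU clock divided by a whole number, so a half multiplier can land you just under the rated speed. A sketch of that arithmetic (the divider rule here is simplified from how K10 actually selects it, which is why it gives ~386 rather than the 375 quoted above):

```python
import math

REF_CLK_MHZ = 200.0  # AM2/AM3 reference clock

def effective_mem_clock(cpu_multiplier: float, rated_mem_mhz: float) -> float:
    """Achievable memory clock: the CPU clock divided by the smallest whole
    divider that keeps memory at or below its rated speed (simplified rule)."""
    cpu_mhz = cpu_multiplier * REF_CLK_MHZ
    divider = math.ceil(cpu_mhz / rated_mem_mhz)
    return cpu_mhz / divider

print(effective_mem_clock(14.0, 400.0))  # 400.0 -> even multiplier divides cleanly
print(effective_mem_clock(13.5, 400.0))  # ~385.7 -> half multiplier falls short
```

With a 14x multiplier the 2800MHz core clock divides by 7 to hit DDR-400 exactly; at 13.5x (2700MHz) no whole divider lands on 400, so memory runs slightly under its rated speed.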

underrated

>Paying for a quadcore
555 BE unlocked to 4 cores master race

>unlocking a defective core because you were too cheap to spend pocket change on a processor that passed QC

I would have left it at 5.2GHz tops, with strong cooling.

It was my first build :^)

rbt.asia/g/search/filename/upgrade2010/page/1/

seek help

being a pajeet must be sad

this:

>kaby lake i5 gets destroyed by bulldozer
nobody saw that coming in 2011