Is Bulldozer better than it seems?

I have a friend with an unspecified "AMD FX" PC which he claims to have built in early 2017.
He has a GTX 1070.
Now, what gets me is that, despite the fact that he swears he isn't overclocking, he gets insane framerates in video games. For example, at 1080p in fortnite at max settings, he gets about 180fps average, 220 max, and about 120 minimum. Mind you, all this with 16GB DDR3.
Meanwhile, my ryzen 7 2700x and 1080 ti get 220 max, about 150 average, and about 90 minimum at the same resolution with 3200mhz ddr4.
Now, my framerates in all games are consistent with other people's ryzen benchmarks, so I'm not concerned, but what the HELL is up with his Bulldozer?
Is he lying about his setup, or is the FX chip actually good?

Attached: 81uNEDESDYL._SL1500_.jpg (1278x1500, 262K)

most likely lying

Attached: poo fx.jpg (1278x721, 228K)

FPS in games is mostly driver related
when (You) buy a videocard, (You) aren't buying a piece of hardware, (You) are buying a subscription to driver updates - the videocard is just a protection key

Attached: UPGRADE2010.png (836x768, 17K)

my FX6300 is based

It's absolute garbage and not worth it. Ryzen slaughters it in every way. Save the extra few bucks up for a 2600.

FX cpus were a flop on release, pretty outdated now

They're good now for super budget servers but that's all. Ryzen surpasses it by leaps and bounds.

>Ryzen
2X SPEED FOR 2X PRICE
NOTHING GOOD

Zero chance, any FX chip would be bottlenecking that 1070 pretty badly. Hell, the FX-8350 was a pretty big bottleneck on the 970.

There wasn't "much" wrong with the chip design itself. It just came out at the wrong time.
Zen is pretty much identical to Bulldozer when you scale it over the years with natural performance growth, except multi-core is much more useful nowadays.

I mean, it was a decent *budget* chip around its release, but no, it wasn't good.

Attached: 03_ARM13.jpg (1024x766, 120K)

>They're good now for super budget servers but thats all.
No. They are highly inefficient; the power bill of an FX chip running 24/7 will eat up any price difference against a Ryzen 3 very quickly.

There is plenty wrong with the design of Bulldozer,
and it is a vastly different arch than Ryzen.
What the fuck are you smoking.
If by "came out at the wrong time" you mean it should have released in 2006-08, then
yeah, it would have been good if marketed correctly as a quad core with "hyperthreading"

Bulldozer got better over time as more games became multithreaded, but it's still nowhere near Ryzen or even Sandy Bridge CPUs.

Of course it's good if you've still got an 8350, wouldn't recommend getting one now but if you have an 8350 you can still play games.

Attached: One Core per CU.jpg (930x797, 171K)

this thread is full of marketers

It's more down to GPU than CPU 90% of the time, which means he is lying or runs at sub-par settings. There is no way a 1070 is getting more FPS than a 1080 Ti unless the 1080 Ti is on a dual core Pentium.

This

>Zen is pretty much identical to Bulldozer
Stop reading here.

Hence the quotation marks.
No, it's not vastly different; vastly different would be an entirely different architecture.
Bulldozer was two integer cores sharing one FPU per module, Ryzen is one core presenting two SMT (hyperthreaded) threads

That's your problem, you didn't read it all and didn't understand what it was about

>Of course it's good if you've still got an 8350
this is such bullshit, the FX-6350 was way better since it could actually get decent single core performance with overclocks compared to the 8350, and the additional two cores of the 8350 didn't do shit for games when they run at worse single core performance overall

the 6350s were binned worse and clocked worse, what are you on about?
all FX hit a wall around 4.5GHz besides the 9xxx housefire editions

FX hits an air-cooled wall around 1.38v; the temps jump up dramatically on mine just to get that extra 100MHz to 4.5 because it needs 1.44v.

meh, so much Gaslighting in this thread

FX has 2MB of L2 cache per module (shared by its two cores)
Ryzen has more L3 but only 512KB of L2 per core

maybe that bottlenecks the max fps, not a big deal to cause any drama, both CPUs are workhorses
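
If anyone wants to actually measure that instead of arguing, here's a rough pointer-chase toy in C (my own sketch, not from any real benchmark suite): it times a dependent random walk over growing working sets, and the size where the ns/access jumps is roughly where the data falls out of L2 on each chip.

/* chase.c - compile with: gcc -O2 chase.c -o chase */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define STEPS (1 << 24)   /* loads per measurement */

static double chase(size_t bytes) {
    size_t n = bytes / sizeof(size_t), i, pos = 0;
    size_t *next = malloc(n * sizeof(size_t));
    for (i = 0; i < n; i++) next[i] = i;
    /* Sattolo shuffle: one big cycle, so every load depends on the
       previous one and the prefetchers can't help */
    for (i = n - 1; i > 0; i--) {
        size_t j = rand() % i;
        size_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
    }
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (long s = 0; s < STEPS; s++) pos = next[pos];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    if (pos >= n) printf("unreachable\n");   /* stop the loop being optimized out */
    free(next);
    return ((t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec)) / STEPS;
}

int main(void) {
    for (size_t kb = 256; kb <= 8192; kb *= 2)
        printf("%6zu KB: %.2f ns/access\n", kb, chase(kb * 1024));
    return 0;
}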

Attached: oldman.jpg (250x250, 6K)

My 8320 @ 4.4GHz runs most modern games largely at a GPU bottleneck with a 980 Ti, but it's severely lacking in single thread vs my 1700X.

>Zen is pretty much identical to Bulldozer
Coffee Lake is pretty much identical to netburst

>fx series
>good for servers
$15 x5650s are leagues better

Finding good motherboards for FX is easier, and usually cheaper.

Bulldozer as an architecture was designed as a server CPU from the beginning and then brought to the desktop; 16MB of CPU cache on the FX-8000 series was no joke for a CPU that cost less than an i5

X5650 is better but back in 2011 that platform was really expensive and X58 is really expensive now with the boards.

Attached: Bulldozer%252032nm_thumb%255B1%255D.jpg (504x446, 74K)

I had an 8350 running at 4.7ghz on air and it killed absolutely anything I threw at it.

I "upgraded" to a 2700X and barely tell any difference.

Attached: 1533762184657.png (380x350, 295K)

You can't compare the cache implementations because of the immense differences in SMT implementation and basic architectures.

that's strange because I get about 150 min, 180 avg and 220 max on my Ryzen 2700X + GTX 1070, but I also run a lot of ACTIVE shit in the background

check if your RAM XMP profile is enabled, and also make sure your board and chipset support XFR 2.0 and turn it on

make sure your cooler is adequate and is properly transferring heat

as for the Bulldozer performance in modern games, I guess that has to do with the processor extensions added in the Bulldozer era of CPUs finally being utilized by modern programs, squeezing out a bit more performance than games could get at the time
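
If you want to check which of those extensions your chip actually exposes, GCC's __builtin_cpu_supports makes it a couple of lines. Quick sketch (the feature-name strings are the ones GCC documents; the AMD-only ones like xop/fma4 are the Bulldozer-era additions that Phenom II lacks):

/* cpufeat.c - compile with: gcc -O2 cpufeat.c -o cpufeat
 * Prints 1 if the running CPU reports the feature, 0 otherwise. */
#include <stdio.h>

int main(void) {
    __builtin_cpu_init();
    printf("sse4.2: %d\n", !!__builtin_cpu_supports("sse4.2"));
    printf("avx:    %d\n", !!__builtin_cpu_supports("avx"));
    printf("xop:    %d\n", !!__builtin_cpu_supports("xop"));   /* AMD-only, Bulldozer/Piledriver */
    printf("fma4:   %d\n", !!__builtin_cpu_supports("fma4"));  /* AMD-only, Bulldozer/Piledriver */
    printf("fma:    %d\n", !!__builtin_cpu_supports("fma"));   /* FMA3, Piledriver and later */
    return 0;
}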

You can clearly tell the difference in games like WoW.
On the other hand, my 8320 ran DOOM 4 at a GPU bottleneck while running at 1.4GHz. I fucked up some OC settings and didn't notice until I fired up ESO and had 14fps.

>I had an 8350 running at 4.7ghz
Shit dude your cooler must be the size of a football because my Hyper T4 can do 4.5 but that's right at the edge.

The FX chips were insane but no way they're outperforming a 2700x

NH-D14/D15 compete with AIO watercooling.
CM's Hyper line is pretty shit compared to high end air coolers.
I run my 8320 4.4GHz 1.37v on an NH-D9L.

Not to mention that the Zen 1 L3 cache itself has double the bandwidth and lower latency than the previous arch, plus a huge bump to L1 and L2 latency and bandwidth.

>or is the FX chip actually good?

oh ha ha ha ha ha oh wow, ha holy shit no

That, combined with the far more advanced prefetching, makes the cache one of Zen's most advanced features. I still can't quite comprehend the fact they built a mini neural network into the branch predictor.
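
For anyone who thinks "neural network in the predictor" sounds like pure marketing, the underlying idea is small enough to sketch in C. This is a toy of the textbook perceptron-predictor scheme, NOT AMD's actual implementation: one weight vector per branch, dot-product it with recent branch history, predict taken if the sum is non-negative, and nudge the weights when it's wrong.

/* perceptron.c - toy perceptron branch predictor, compile: gcc -O2 perceptron.c */
#include <stdio.h>
#include <stdlib.h>

#define HIST  16                          /* global history length */
#define TABLE 1024                        /* number of perceptrons */
#define THETA ((int)(1.93 * HIST + 14))   /* training threshold */

static int w[TABLE][HIST + 1];   /* weights; index 0 is the bias */
static int hist[HIST];           /* +1 = taken, -1 = not taken */

static int predict(unsigned pc, int *sum_out) {
    int *p = w[pc % TABLE], sum = p[0];
    for (int i = 0; i < HIST; i++) sum += p[i + 1] * hist[i];
    *sum_out = sum;
    return sum >= 0;
}

static void train(unsigned pc, int taken, int sum) {
    int *p = w[pc % TABLE], t = taken ? 1 : -1;
    if ((sum >= 0) != taken || abs(sum) <= THETA) {   /* mispredict or low confidence */
        p[0] += t;
        for (int i = 0; i < HIST; i++) p[i + 1] += t * hist[i];
    }
    for (int i = HIST - 1; i > 0; i--) hist[i] = hist[i - 1];
    hist[0] = t;
}

int main(void) {
    /* feed it a branch that's taken 3 out of every 4 times */
    int correct = 0, total = 100000;
    for (int n = 0; n < total; n++) {
        int taken = (n % 4) != 0, sum;
        correct += predict(0x400123, &sum) == taken;
        train(0x400123, taken, sum);
    }
    printf("accuracy: %.1f%%\n", 100.0 * correct / total);
    return 0;
}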

>He doesn't want the multi core performance of an i7 3770 for cheap.

The FX chips are better now than at release. New CPUs are just flaming garbage

>abortnite
kys, /v/ermin. back to the hole you crawled out of.

How dare you

Attached: Intel-Pentium-Processor-G4600.jpg (640x640, 98K)

FX are comfy as hell, always delivering just the needed amount of power to keep things smooth, even after 5 years thanks to the hardware optimizations.

Attached: 1541815305776.png (561x507, 377K)

Fuck amd

Attached: 20181110_222503.jpg (4032x3024, 3.17M)

R5 2400G is actually pretty slick, and it's unfortunate that RAM prices are as bad as they are.

I would if I could.

Attached: 1503800146174.jpg (781x1177, 98K)

He's talking about the botnets, high-TDP Intel chips, non-soldered heat spreaders.

FX is just a pure lineup of unlocked processors with no Jewish tricks.

>4 phase mobo without heatsink
>FX8300 in the socket
FX is pretty good, fuck that board.

It's nice of them to allow it but 12w chips do not belong on those super low end 760G / 4-phase 970 boards.

Attached: amd_fx8350_8_core_cpu_1504700744_6b25d1e40.jpg (826x818, 172K)

>FX8300
That's a 6300, which is fine, but I wouldn't try to OC it very far in that board without adequate cooling.

>12w chips
125w chips I mean.

It's a 6300, not an 8300. But yes, this mobo died after 2 years of running at a 4.3GHz OC

That's fucking impressive to run that long with no heatsinks.

Attached: Why the fuck everything single threaded.png (800x522, 265K)

Yeah, MSI aren't as bad as people meme about. It never really died, it just became unstable and jittery and I had to underclock it a little

>underclock it a little
Sounds like CPU degradation.

>been running my FX 6300 at 4.4 for two years on my shitty asus 760g mobo
>at fucking 1.4v because I was a brainlet that thought the cores were throttling because of lack of voltage
>it's been three years since then

i-it's going to be all right, right?

Attached: death.png (453x434, 22K)

just upgraded from an R7 260X to an RX 570 and now my FX 6300 is finally showing its age, it's been a comfy ride FXbros

I'm surprised his CPU would degrade, FX can take 1.5v+ on the core and be fine with that. Is it possible for a VRM to degrade? I've been dailying a 1.38v OC for almost 2 years and sometimes ran 1.44v when I'm feeling extreme, and it still holds the OC stable with me beating on the system with Prime95 and transcoding x264.

You'll be fine user, the FX chip is durable as fuck, and heaven forbid anything happen, gigabyte 970 boards are cheap.

Nice AMD GCN 1.0 card.

Attached: RAM and Board.jpg (4032x3024, 2.5M)

it's actually an HD 7790, lucky me that I went with this instead of a 750 Ti, the AMD FineWine really works

Attached: gpu.png (294x180, 14K)

It really was a FineWine series of cards. I just got a 7970 and it's a beast: 3GB of VRAM before the 780 Ti.

I miss the single slot profile of my HD7750 but playing at 1920x1200 is cool.

Attached: 20180905_205527.jpg (2576x1932, 2.41M)

This is my exact position. I am in possession of an unopened 8350 Black Edition and 16GB of 1866 RAM, do I make a build with it or not?

I did say super budget

Hell yeah dude, get a gigabyte 970 board and enjoy it. Set the vcore to 1.38125v to get that 4.2GHz OC and roll with it.

newegg.com/Product/Product.aspx?Item=9SIAD6H65C3643&Description=gigabyte 970 motherboard&cm_re=gigabyte_970_motherboard-_-13-128-602-_-Product

Attached: 2017-08-26-484.jpg (4000x2248, 1.25M)

This baby still does me fine; I don't play new games and I don't keep 100+ tabs open either.

Attached: Speccy.jpg (894x588, 262K)

>FX can take 1.5v+ on the core and be fine with that
It will degrade faster than normal at 1.5v
1.45v is what AMD rates it for with a 5-yr lifetime at stock frequencies.

Phenom II is a fucking mad CPU.
If you can run it over 4GHz, it's better than Bulldozer.

Isn't Phenom better than FX? Let's throw 3 Phenom cores against 3 FX cores:

Phenom would outperform them,
but Phenom almost always has fewer cores than FX

so FX would win in multicore software, but only marginally

1 Phenom core is almost equal to 2 FX cores
1 Ryzen core is 2 Phenom cores

clock speeds must be the same

Attached: 41pFlK2grvL._SY355_.jpg (355x355, 15K)

Shit, FX-9590 owners must be fucked because their CPU ships with a 1.5375v VCORE

>tfw 4.2GHz Thuban performs the same as 4.4 GHz bulldozer
The FX uses little voltage to get 4.4 but thuban technically does beat it, even if it needs 1.5v to do it.

One Phenom core is 5% better than an FX core clock for clock; put two cores in an FX module under load and the IPC will be about 25% less than a Phenom core, but there's 8 of them at a high frequency.
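
Back-of-envelope if you take those numbers at face value: a loaded FX core ≈ 0.75 of a Phenom core, so one module ≈ 1.5 Phenom cores and all 4 modules ≈ 6 Phenom cores' worth of throughput at the same clock, before you even count the higher clocks FX runs at.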

Attached: 1491349230554.jpg (653x726, 115K)

>32gb ram

why

>FX would win in multicore software, but only marginally
Phenom II has dedicated FPU per core, so a Phenom II x6 will beat an FX-8350 in floating point workloads.
FX really shines in integer workloads, and beats out Phenom II per core as well as multicore.

>FX module under load and the IPC will be about 25% less than a Phenom core
Only for FP/INT mixed workloads. For server workloads that are heavily INT, Bulldozer completely shreds Phenom II
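
Easy enough to check with a toy like this (my own sketch, treat the numbers loosely): an integer-only loop against an FP-heavy loop under OpenMP. If the shared FPU per module really is the limiter, the integer loop should scale to 8 threads noticeably better than the FP one on FX, while a Phenom II x6 scales both about the same.

/* scale.c - compile: gcc -O2 -fopenmp scale.c -o scale
 * Run with OMP_NUM_THREADS=1,2,4,8 and compare how each loop scales. */
#include <stdio.h>
#include <omp.h>

#define N 200000000L

int main(void) {
    long i;
    unsigned long long isum = 0;
    double fsum = 0.0, t;

    t = omp_get_wtime();
    #pragma omp parallel for reduction(+:isum)
    for (i = 0; i < N; i++)
        isum += (i * 2654435761UL) ^ (i >> 3);    /* integer-only work */
    printf("int: %.3fs (checksum %llu)\n", omp_get_wtime() - t, isum);

    t = omp_get_wtime();
    #pragma omp parallel for reduction(+:fsum)
    for (i = 0; i < N; i++)
        fsum += (double)i * 1.0000001 + 0.5;      /* FP-heavy work */
    printf("fp:  %.3fs (checksum %f)\n", omp_get_wtime() - t, fsum);
    return 0;
}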

Why not? RAM was cheap back then.

>fortnight date intensifies

Part of it might be him lying, part of it might be you're still getting almost 1/3 better framerates than he is. A lot of it is that fortnite probably doesn't use multithreading very much, if at all

It really did beat out Phenom when it came to actual work like rendering. I still think an FX-8350 rig with 32GB of RAM to cache things and a low-power video card is a fucking budget workstation

Attached: IMG_0218.jpg (4032x3024, 2.07M)

>rendering
FP heavy workload, so yes, Phenom II wins.

Finding out that most benchmarks are owned by jewtel.

XFX Single slot coolers are pure sex.

Thuban is a good chip but it needs a lot of voltage to do what my chip does with its modules at 4400MHz 1.38v.

If I put my CPU in one-core-per-CU mode, I can get 4 real cores at 4.5GHz+, so it performs like a Phenom X6 1055T/FX-6300

>Zen is pretty much identical to Bulldozer

Are you okay?
Do you have brain damage?

Attached: ea74408d70eb8a05ea8cb2f6c7dbc389.png (765x624, 153K)

I have been trying to search for another used XFX Ghost HD7750, but people want over $100 for it, so I settled on getting a brand new in the box HD7750 reference PCB card from HIS to have a spare low-power video adapter that performs nicely.

Still on the lookout on eBay if someone's selling one cheap.

Attached: 20181106_152732.jpg (1932x2576, 969K)

Phenom II is a based CPU but it's a shame that it doesn't support SSE 4.1 or 4.2

no, it's bad
t. former FX-6100 owner

Is this not the sexiest single slot GPU you've ever seen in your life?

This. Vishera has the modern instructions and it gives a lot of performance at a low voltage compared to Thuban.

Attached: 20181110_204433.jpg (2576x1932, 1.66M)

No.
t. 2700X and a 1080. Your friend is a lying cunt and minimum fps on Bullshitdozer is garbage

My guy! Thank you! Do you have any advice as far as cooling then? I have the Gigabyte 990FXA-UD5 R5 rev1.0 and HyperX Fury for RAM, similar to that but the aryan edition

I use the Hyper T4 cooler. It's very large, yes it mounts sideways facing the GPU, but it uses the latch mechanism, which is easy to use and I am lazy.

Couldn't be happier: 58C with a 4.4GHz OC on 8 cores at 1.38125v and Medium LLC.

Attached: IMG_0213.jpg (4032x3024, 1.59M)

Wait a damn minute!

Is that dust caked on at the top of your keyboard? Holy hell, now that I'm looking at it closer there is so much dust in that keyboard! This is literally just a little over a year ago... compressed air cans were still pretty affordable and accessible back then.

Yes it's dust.

Attached: 20181110_210146[1].jpg (2576x1932, 2.58M)

Wtf I watched an FX-8150/Radeon HD 7970 Fortnite benchmark yesterday and was amazed at how well it was performing and now today I see a thread about Bulldozer performing great on Fortnite. Coincidences are weird.

youtube.com/watch?v=9LLZ6jBmDsA

7970 and bulldozer is a match made in heaven, my card sags so fucking much and the tdp is high but it's amazing.

It sucks that the HD 7970 had terrible drivers when it first came out. NVIDIA saw how terrible it was performing and chose to release their mid range GK104 GPU as a high end GPU (GTX 680).

Now the HD 7970 absolutely destroys the GTX 680 since the GTX 680 was never a high end chip.

Okay I cleaned it for the first time since 2014.

Attached: 20181110_211517[1].jpg (2576x1932, 1.62M)

Do you just not use that computer or keyboard anymore? Why is there so much?

Attached: Woah mama-mia.jpg (540x399, 26K)

Kek, I just never use the 10-key side of it.

>I run my 8320 4.4GHz 1.37v on an NH-D9L.
I had the 8320E (just a lower clocked and lower watt version of the 8320) running at 4.6GHz @ 1.404v. It was a really good chip in that regard. With a Phanteks PH-TC12DX for cooling. Sold that chip/setup and got a 4790k when they came around and got that to 4.8GHz @ 1.275v using the same cooler.

Is the 8320e super nicely binned or something to get 95w at 3.2? I've seen guys getting insane clocks on it, better than what I have on the 8350.

1. at launch AMD FX was enough for all games
2. now AMD FX is enough for all games
-------------
1. at launch intel 2600 was more than enough for all games
2. now intel 2600 is a shitty 4-core corelet freezing in all new games

Attached: BOTNETS2.png (475x413, 19K)

No idea. Been years since I owned it / clocked it. 3.2 @ 95w wasn't outrageous when the regular 8320 was 3.5 (I think) @ 125w. Piledriver was mid 2012... so Ivy Bridge would have been the competitor. The 6 core / 12 thread 4930k was a 130w chip with a 3.4GHz base clock and 3.9GHz turbo.

FX at launch was pretty shit because 99% of games and applications (barring major production programs) were made for single core rather than multi-core. I owned a Bulldozer FX-4100 at 4.6GHz on air, then a Piledriver FX-6300 at 4.9GHz on store-bought water, and lastly an FX-9370 at 5.5GHz on custom water. When I pushed for 5.6GHz, I popped 2 voltage regulators and went Intel after that. FX were a blast to overclock, had the heat output of blast furnaces, and my God, their encoding and video editing capabilities were awesome.

With how common multi-core is in software these days, FX can hold their own still. My buddy still has his FX-8320 at bone stock with his Radeon 7950. They still give him around 60FPS in most titles at medium to high settings at 1080p.

I already have a R7 2700x.
And yet, whenever we do LAN parties, his framerates are light-years above mine, despite the fact I've done everything I can to optimize. (And again, my framerates are actually right on par with other Ryzens)

>FX at launch was pretty shit because 99% of games and applications (barring major production programs) were made for single core rather than multi-core.
so what? 120FPS vs 205FPS is no advantage on a 60FPS monitor
----------
1. at launch AMD FX was enough for all games
2. now AMD FX is enough for all games
-------------
1. at launch intel 2600 was more than enough for all games
2. now intel 2600 is a shitty 4-core corelet freezing in all new games

Attached: TESTYOU.png (287x389, 133K)