Both FX-6300 and FX-8350 are faster than i5-7600K in BF V with RTX enabled

INTEL FAGS BTFO!

DELID DIS!!!!!

Attached: Intel BTFOyaM8fDE.png (898x892, 80K)

who cares when intel delivers higher quality pixels via their extended pipeline

Even though Ryzen is out now, FX was still one hell of a value when it was new.

i7-2600k 3 times as fast as i5-2500k?
how is that possible? Shows you how worthless the benchmark is.

Best $70 I ever spent.

>i7-3770 multi-core performance for less than a 3570K
How can anyone hate the FX-8350?

It just took 5 years for that value to finally materialize in a game.

That is true, but if you transcode a lot of your media down to x264, an FX-8350 will chew through a HandBrake queue no problem. I actually like it better than those Xeon E3 chips you see on mainstream Intel Z-series boards.

Attached: Multicore.png (422x460, 43K)
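
If you want to batch a whole folder down to x264, a rough Python sketch like this will work the queue for you (paths are hypothetical placeholders, it assumes HandBrakeCLI is installed and on your PATH, and -q 20 is just an example quality setting):

    import subprocess
    from pathlib import Path

    SRC = Path("D:/rips")      # hypothetical source folder
    DST = Path("D:/encoded")   # hypothetical output folder
    DST.mkdir(exist_ok=True)

    for video in sorted(SRC.glob("*.mkv")):
        out = DST / (video.stem + ".mp4")
        # -e x264 selects the software x264 encoder, -q sets the quality level
        subprocess.run(["HandBrakeCLI", "-i", str(video), "-o", str(out),
                        "-e", "x264", "-q", "20"], check=True)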

hyperthreading shekels have been deposited to your account

I actually liked using those old unlocked Xeon E5 chips that would sell for $200-ish for a complete workstation for those workloads. FX was a mistake.

>Xeon E5
Well, that's kind of a whole other socket type; AMD had no HEDT socket. One can only imagine how many FX 8-core dies AMD would've glued together if they had an X99-type socket, but on the mainstream desktop, FX was not bad compared to the Z-series chips.

>Well, that's kind of a whole other socket type; AMD had no HEDT socket.
Your point?
>FX was not bad compared to the Z-series chips.
Stay delusional.

It's not bad compared to the mainstream Intel chips available at that particular time, and yes, an HEDT socket like LGA 2011 is completely different from a mainstream socket where the mobos run around $100; it's a server socket brought to the desktop. Only now does AMD have an actual HEDT socket, TR4, and it's kicking ass because of its price-to-performance ratio. It's getting to the point where the 9900K is so expensive that Threadripper becomes a viable option that makes sense.

that's a 7600K at 3.8GHz, which is slower than it runs stock on most motherboards.

you can run it at like 5.3GHz, or 5.5GHz if you disable all cores but one.

>It's not bad compared to the mainstream Intel chips available at that particular time, and yes, an HEDT socket like LGA 2011 is completely different from a mainstream socket where the mobos run around $100
Not relevant, because nearly complete Sandy Bridge, and even Ivy Bridge, Xeon E5 workstations were available second-hand for very cheap (minus hard drives) by the time FX-8350s were out, and they handled whatever you claim you did with your FX-8350 better than your FX-8350 did, and the mainstream Intel CPUs handled all the mainstream workloads better than FX ever did.
You're looking at the FX series through rose-colored glasses; they were a shit buy throughout their existence. Granted, I personally bought two myself, but I never glorified them or grew a fetish for them.

Taichi Ultimate+9900K owner here.
This benchmark is fucked up.

A Ryzen 2700X is WAY faster than the 1600X and 4770K.

That's because there's a GPU bottleneck.

True.
We need a GTX2080 series instead of RTX.
Use the die space that the ray tracing hardware takes up for more tensor cores (AI-based AA) or regular cores, for performance over gimmick rendering.

Holy fucking COPE

Because BF5 benefits a lot from more threads, and 4 really isn't cutting it.

>Wow, a video game's performance is mostly down to the GPU rather than the CPU and a few older AMD chips perform better, Intel BTFO WOOOOOOOOOOO

>tfw the corelet meme is real after all
I told these i5-2500K motherfuckers that once games take advantage of moar cores, a 4c/8t i7 or an AMD FX-8000 series is gonna be needed or it'll be a huge bottleneck, but multi-core doesn't matter to these people.

kek

AMD playing the l o n g g a m e

yeah, I guess it's a good thing that incel users can now benefit from AMD's competition, right? What with the complete stagnation in the CPU market and all

I have a hard time believing that they have an actual benchmark of a 2080 Ti paired with an FX-8350.

7700k gets double the framerate of a 6700k in that graph as well...

Ukranians are retarded.

welcome to literally every CPU thread on Jow Forums

There is no 6700; every i7 other than the 2600K is in GPU-bottleneck territory.

Meant 7700 vs 7600.
And no, a GPU bottleneck wouldn't explain how horribly broken this chart is.

The high-end CPU results seem very similar to the majority of others who tested RTX, but I'm hard pressed to find any i5 test. If you have any, please provide them for the sake of comparison.

>if you disable all cores but one

Intel fanboys confirmed to be living in the year 2003 where a single 5GHz core would be a good thing.

Why does no one ever point out that these aren't overclocked to their full potential, like most people do with their CPUs?

The most likely explanation is that they didn't test it and the numbers are fake.

retarded people will not survive

Attached: UPGRADE2010.png (836x768, 17K)

>tfw if you bought a phenom II in 2009, you could still buy a new mobo for it until early 2017

Attached: CPU Store.png (720x720, 277K)

And they will pretend games didn't matter.

shitty single-core performance and shit devs made it look far worse than it was

realistically, FX was better for casual computer use than an i5 or i3 was,
for rendering, up till the 4790K the FX was king
and even though it wouldn't deliver the best fps in games, it was likely going to deliver a smoother experience once games used 4 cores fully

that said, I had a Phenom II 955, so going FX would have been a sidegrade, better in some aspects, worse in others, so I'm glad I held off on Bulldozer.

I wish I lived in your world where things have no value or cost and everything can be had without worry.

amd released it on a bet that multicore was the future

it is, the problem is devs have to be dragged kicking and screaming into the future. It's not surprising to see AMD CPUs do well when devs code in a way that makes use of them.

had we known about the stagnation, the smart buy would have been a Sandy Bridge i7, riding that into Zen 2

Can someone explain this meme to me? Is it mocking the fact that intel changes its socket with each new gen?

the year is 2000+19
AMD drones are still defending bulldozer even though Ryzen exists.

Attached: 1454304137655.jpg (240x240, 9K)

That is such a nonsensical benchmark it simply isn't funny

Would probably still be running an 8350 if my shitty MSI board didn't burn down.

>I wish I lived in your world where things have no value or cost and everything can be had without worry.
Oh, so you're not American, and you don't have access to online shopping sites like eBay?

you don't get it

Keep coping, intel shill

It's mocking Intel's short-sightedness and how they convinced everyone that 1-5% IPC gains every year on 4C/8T CPUs are enough.

>4 threads are enough for gaming

Attached: Brainlets.jpg (586x578, 45K)

>tfw my i9-9900K is performing with the rest of the best

After owning shitty hardware, it feels so good

NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO

Attached: 1542027477262.png (807x745, 205K)

Oh look, this thread again.

Attached: dcvsmarvel.png (988x896, 1.75M)

Seems legit

Attached: hmm.png (898x892, 109K)

wtf is DC and MARVEL
(full caps like idk)

>owning this status symbol feels good because I was too poor to afford quality hardware

or you could switch from LGA 775 directly to Sandy Bridge and keep using it, because there was nothing significantly faster up until 8th gen.

>the CPU gen where we see actual gains happens to be the gen Intel adopts AMD's moar cores moar GHz strategy

>moar cores
about fucking time too, we've been stuck with 4 cores on mainstream desktop since the Q6600.

FX was so much fun for overclocking too, really loved my FX-6300 at 4.9GHz

It's strange that FX chips suddenly perform better after Intel decides to start using more cores. Makes you think.

Probably a combination of multi-core advancement and security patches.

Hold the phone.
I'm running an FX-6300 right now, paired with a GTX1060 6GB. Was considering upgrading CPU & MoBo. Will be playing at 1080p. Worth the upgrade? I don't think I've ever even tried OC'ing my CPU, even tho it comes unlocked.

Attached: you WHAT.jpg (480x360, 14K)

>everyone decides to be modern and use more than 4c after Intel adds cores to be relevant
It's not above these companies to gimp their competition, look at nvidia gameworks.

A 1060 is probably the fastest card I'd install in an FX system; that's a faster card than my Radeon 7970, which is still decent at 1080p

Attached: cGIay9e.png (280x272, 3K)

>intentional ignorance

Just Wait™ for ryzen 2

> Worth the upgrade?
Here's a tip for you: when you're playing a game and notice uncomfortable FPS dips, fire up Task Manager on the Performance tab. Play again. Once you notice dips again, switch back and look at the CPU load graph. If it's fully loaded, it's time to upgrade.
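
If you'd rather have actual numbers than eyeball the graph, here's a quick Python sketch (assumes you've done pip install psutil; the file name and 1-second interval are just placeholders) that logs per-core load to a CSV while you play. Ctrl+C to stop, then check whether the cores are pegged during the dips:

    import time
    import psutil  # third-party: pip install psutil

    # Append one row of per-core CPU load (percent) every second.
    with open("cpu_load.csv", "w") as log:
        cores = psutil.cpu_count()
        log.write("timestamp," + ",".join(f"core{i}" for i in range(cores)) + "\n")
        while True:  # stop with Ctrl+C
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)
            log.write(f"{time.time():.0f}," + ",".join(f"{p:.0f}" for p in per_core) + "\n")
            log.flush()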

Thanks man. I'm so stupid, I never even thought about doing this. I have a 2nd monitor, so I'll leave that graph open pretty frequently now, just out of curiosity, to see how it's performing. I'm sure it's fine. I don't play many graphically demanding games, but I would like to prepare for Ace Combat 7 and play it at the best quality.

2700X here, Siege runs amazing on it. If the game engine/program is properly multi-threaded it never drops below 100fps at 4K max, and I'm only on a fucking GTX 1080.

Sell your 6300 for real cheap and get an 8350; it's 75 dollars on Newegg, and that extra module with 2 additional cores will let you hang on a little longer

I was actually considering it a while ago, but I'd personally rather take those $75 and put it towards a new CPU & mobo (when the time comes, of course).

Good thing you told people who bought a CPU 8 years ago that in 2018 they would have to change their CPU.

Attached: 4636598+_066fb3ee131beb9aed6088300e75b326.png (363x314, 129K)

At this point, with Zen 2 right around the corner, the Just Wait™ meme is real.

t. 4690K corelet who's Just Wait™ing

The framerate difference is negligible and irrelevant. Nobody would notice something like this.

Came here to post this. Those who bought a Sandy Bridge i5 were guaranteed a smooth gaming experience for at least 6 years (which spanned two console gens). It's literally Athlon 64-tier legend status at this point.

The fact that you can get 30 fps from a 2018 game at Ultra with it (before overclocking) is a testament to how good Sandy Bridge was. Especially once you look at the 2600K, that thing can still chew through any game you throw at it.

>i7-2600k 3 times as fast as i5-2500k?
>how is that possible?

Hyperthreading along with a larger cache.

Intel has been robbing consumers blind for years with their intentionally gimped i5's

The ray tracing is leveraging multi-threading on the CPU hard, which is why 8-thread chips are destroying 4-thread ones.

A Haswell i5-4670K is being BTFO'd by an FX-8350 entirely due to its SMT and superior multithreading design.

Once you get up to a certain level of MT performance, the chips all match each other because the graphics card itself is the bottleneck.

The next generation of Nvidia cards with higher ray-tracing performance is going to let us separate these chips out and get a full breakdown of which manufacturer has the better multi-threading ability.

I'm betting AMD will match or beat Intel due to their true SMT design being more efficient than the inferior Hyper-Threading design on Intel chips.

That's gonna be an interesting selling point for AMD going forward. Their processors are literally better when Nvidia's vaunted ray tracing is enabled, so if you're an enthusiast gamer buying the next-gen RTX 3080 Ti, you're going to have to buy AMD because they will deliver higher FPS when your RT is enabled.
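
The scaling-until-something-else-bottlenecks idea is easy to see in a toy example. Here's a rough Python sketch (purely illustrative busy work, nothing to do with BF5 or DXR; the chunk size is an arbitrary placeholder) that times a fixed amount of CPU-bound work split across 1, 2, 4, and 8 workers, and the gains flatten out once you run out of real cores:

    import time
    from multiprocessing import Pool

    def burn(n):
        # Dumb CPU-bound busy work standing in for per-thread game/RT work.
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        chunks = [2_000_000] * 8  # fixed total amount of work
        for workers in (1, 2, 4, 8):
            start = time.time()
            with Pool(workers) as pool:
                pool.map(burn, chunks)
            print(f"{workers} worker(s): {time.time() - start:.2f}s")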

is this the first game to actually use cores or something? wtf

Attached: whistle.png (250x260, 34K)

Actually yes. It's running the game and also leveraging the CPU for ray-trace calculations.

FX is eternal

what? doesn't it only do ray tracing via DXR?

I own a 6300. It's horrible and can't hold 60fps in GTA V, fuck you for shilling it

*sips*

Attached: Fine Wine.png (1600x1036, 726K)

>buying an i9-9900K
>poor

Kek retard

>buying this status symbol should give me the appearance that I'm not poor

>caring whether or not someone thinks you're poor
Try again