Why does Ryzen perform so poorly compared to Intel CPUs outside of synthetic benchmarks...

Why does Ryzen perform so poorly compared to Intel CPUs outside of synthetic benchmarks? Is there any point to owning one if you don't use your computer to run Cinebench all day?

Attached: CPUs vs CPOOs.png (913x908, 107K)

Other urls found in this thread:

techspot.com/review/1829-intel-core-i5-9400f-vs-amd-ryzen-5-2600x/
techspot.com/review/1655-core-i7-8700k-vs-ryzen-7-2700x/
youtu.be/IBN5-H4ob5w
overclock3d.net/news/cpu_mainboard/intel_s_soldered_i9-9900k_can_offer_improved_thermal_performance_though_deliding/1

because software is still optimized for intel

>Is there any point to owning one if you don't use your computer to run Cinebench all day
If you do literally anything other than gayming, then yes.

would you mind showing real resolution tests? 1080p is so last decade.

Because most software can't efficiently thread 8 workloads, let alone 16. Never fall for the AMD meme unless you know exactly what you're getting into.

>no kikeripper in sight
no big surprise.

>Because idiots like me can't effectively use 8 threads, let alone 16.
Fixed your mistakes. I don't know why you'd bother buying a powerful CPU if a Celeron is more than enough for you, user

Having 82 Jow Forums tabs open isn't efficient, nor is it """productivity"""

>Inturds devolve into posting russian benchmarks because they can't shill in the 1st world anymore
Cope levels just don't stop rising and we have only seen a mere tip of zen2's dick.
I can't even picture the amount of anal hemorrhage it'll cause

Intel sales are only successful in India and China.

>Th-These benchmarks don't c-count!!!!!!
>Ch-Ch-Check out these ones using $1500 RAM!!!!!!!!!!!
Literally every single time.

Attached: pathetic.jpg (506x668, 334K)

Why are intards so afraid of zen 2? Isn't competition good for both sides? Also good morning to everyone on the east coast, this will be the ~200th time I post this.

techspot.com/review/1829-intel-core-i5-9400f-vs-amd-ryzen-5-2600x/

techspot.com/review/1655-core-i7-8700k-vs-ryzen-7-2700x/

Attached: ZomboDroid 26042019075355.jpg (1324x2868, 415K)

Why don't benchmark charts ever have the 3770k listed? They usually have the 2500k, the 2700k, and then jump to the 4670k and 4770k.

It wasn't supposed to be like this Ivy Bridge bros.

because ivy bridge is only a node-shrink progression with a toothpaste TIM regression

I'd like to see an image like that but with OC, since a lot of enthusiasts will OC for daily use and in some cases even tweak their timings to the maximum to get 4000MHz+ RAM and 5.2GHz on the Intel, and similar for AMD. So yeah, stock vs stock, average OC vs average OC, and high OC vs high OC.

That benchmark is OC'd. The i7-8700K is clocked ~1GHz higher.

Attached: ZomboDroid 07052019094313.jpg (1314x9536, 1010K)

>gaming

>real-life measure of CPU performance bad
stop trying to fit in

OH, nice. Since next-gen Intel CPUs will still be 14nm, Zen2 will be really close at 1080p, and since I play at 1440p it'll be pretty much the same as now.
Though I don't really play new gaymes at all, all I do is play LoL and Minecraft, but I have stuff running in the background.
AMD, pls deliver your shit in September, don't let this suffering kill me.

>competition
Intel is getting genocided.

No, but compiling and fully testing software/intensive data mining on my local machine IS intensive

>Why does Ryzen perform so poorly compared to Intel CPUs outside of synthetic benchmarks?
For one, AMD's SMT is more efficient than Intel's Hyper-Threading in those workloads.
And synthetic benchmarks aren't latency sensitive, unlike high refresh rate games.

That's why I think Zen2 will crush Intel in synthetics but gaming should be pretty much on par assuming AMD delivers as promised.

Attached: 2018-11-25-image-4.png (1328x1091, 55K)
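
To put the latency point in rough numbers (a back-of-the-envelope sketch, not from any benchmark in this thread): the same 1ms of extra frame time costs far more fps at 240Hz than at 60Hz, which is why high refresh rate games punish latency while synthetics barely notice it.

```python
# Rough illustration only: fps lost when each frame takes 1 ms longer.
def fps(frame_time_ms):
    return 1000.0 / frame_time_ms

for base_fps in (60, 144, 240):
    base_ms = 1000.0 / base_fps
    slower = fps(base_ms + 1.0)  # add 1 ms of extra CPU/memory latency per frame
    print(f"{base_fps} fps -> {slower:.0f} fps (-{base_fps - slower:.0f} fps)")
# 60 -> ~57, 144 -> ~126, 240 -> ~194
```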

>shitposting

Neat LARP, kiddo. Imagine paying hundreds for HEDT just to shitpost on Jow Forums lmao cant relate

What RAM was being used for this test? I'm starting to see a trend here.

Attached: Ryzen-7-2700X-memory-opt-fps.png (639x263, 29K)

Why do you care about the performance of Yakuza Kiwami 2?

Attached: 1351220456955.jpg (672x701, 77K)

Depends on the game engine. Certain engines are designed to take advantage of low latency of the cpu. Others are designed to take advantage of the raw power of CPU. Others are designed to take advantage of limited cores/threads and are usually limited by a single core or dual core. Any optimization past dual core is shoddy at best.

Stop proyecting user, you don't have a use case for using 2+ cores but that doesn't mean others are like you

Also a little side tracked but why did it take vega SO LONG to catch up to RTX?

youtu.be/IBN5-H4ob5w

Attached: Screenshot_20190507-105319.png (1280x720, 816K)

>you don't have a use case for using 2+ cores but that doesn't mean others are like you
>Stop proyecting [sic] user
rofl

You are literally arguing there's no point to 16 threads. I'm not even the same guy, but you're claiming tons of threads are useless. Are you OK?

No better argument?
HA HA HA HA HA HA

False

>Certain engines are designed to take advantage of

>low latency of the cpu
Then Intel wins
>raw power of CPU
Then Intel wins
>take advantage of limited cores/threads
Then Intel wins

BUT
AT
WHAT
COST?

I'm really curious: since an i7-8700K OC'd to 5GHz has like 25% faster single-thread performance than a stock 2700X, why is it sometimes only ~5% faster even at 1080p?
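
A toy model (with illustrative numbers I made up, not real benchmark data) shows how this happens: the delivered frame rate is limited by whichever of the CPU or GPU takes longer per frame, so a 25% CPU advantage only shows up until the GPU becomes the limit.

```python
# Toy bottleneck model: frame rate is set by the slower of CPU and GPU per frame.
# All numbers below are hypothetical, just to show the shape of the effect.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 6.7                  # hypothetical GPU frame time at 1080p ultra
ryzen_cpu_ms = 7.0            # hypothetical stock 2700X CPU frame time
intel_cpu_ms = 7.0 / 1.25     # ~25% faster single thread -> 5.6 ms

r = fps(ryzen_cpu_ms, gpu_ms)  # ~143 fps, CPU-bound
i = fps(intel_cpu_ms, gpu_ms)  # ~149 fps, now GPU-bound
print(f"2700X: {r:.0f} fps, 8700K@5GHz: {i:.0f} fps, gap: {(i / r - 1) * 100:.0f}%")
# -> roughly a 4-5% gap despite the 25% single-thread advantage
```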

See

everything

Attached: PCxRwPu.jpg (1327x1222, 331K)

This. In the quest to add even more parallelization to Frostbite for BF1, DICE ended up hurting performance across the board. In all honesty, we'll probably never see a game engine use 8+ threads efficiently.
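
For what it's worth, Amdahl's law roughly explains why piling on threads stops paying off once any part of the frame stays serial (the parallel fraction below is an assumption for illustration, not a measured figure):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the
# work that parallelizes and n is the thread count. Illustrative only.
def speedup(parallel_fraction, threads):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

p = 0.70  # assume 70% of a frame's CPU work parallelizes cleanly
for n in (2, 4, 8, 16):
    print(f"{n:2d} threads: {speedup(p, n):.2f}x")
# ~1.54x, ~2.11x, ~2.58x, ~2.91x -- and that's before synchronization overhead,
# which is how adding threads can end up hurting performance outright.
```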

Why is entertainment software still so horridly optimized for multithreading?

>6/12 vs 8/16
>single thread performance is faster on the one with fewer cores but is just slightly faster in MC.
You answered your own question.

>360mm custom
So cute~

Attached: Core_Temp_2019-05-07_18-00-02.png (664x903, 51K)

>still believing the intel lies that processors are still increasing in power when processors have been stagnating for over ten years
>intel: "ours is totally better guys, trust us, buy our overpriced botnet cpu with severe manufacturing defects that will eventually show up"
>amd: "just buy our cheaper processors, we even have a video chip on them that is just fine if you're not insane enough to want to run overblown fucking shit games on bro"
Anyone choosing intel over amd is an idiot.

Man, why is there such intense Intel shilling today? Your +10% performance gain doesn't matter when your shitty CPU has a gazillion vulnerabilities and it's three times more expensive. We don't and won't give a shit about a company that held back technological progress out of greed

What's up with the heat meme lately? My air-cooled 3570k hits 85C under load and it's been going strong for 7 years.

So what refrigerant water chiller are you using for your setup? 29C is pretty impressive at idle. Does it go above 90C under AVX prime95 loads?

Attached: 1539965136296.png (1000x746, 235K)

AMD can't do 240fps. Sorry, but not everyone wants to buy budget hardware.

Hottest I've seen it during benchmarking was 76c.

Attached: prime.png (1578x1130, 122K)

It doesn't.

Intel uses really poor solder TIM under the integrated heat spreaders on their i9 processors, so really exotic cooling solutions are REQUIRED for 5GHz on all cores. The i7s just get toothpaste.

That wasn't his question, and even if I asked you how long you spent lapping your IHS you wouldn't answer me, would you?

Attached: 19102414455l (1).jpg (1918x1033, 601K)

>Does it go above 90C under AVX prime95 loads?
>Posts Prime temps
>That wasn't his question


I couldn't answer you because I'd have to grind the IHS down first to say how long it took.

>240fps
Oh look, a kid that plays overblown fucking shit games.

He was asking what cooling solution you were using, and it's not uncommon at all for i9-9900K owners to use mini 1/4 HP refrigerant water chillers to get temps similar to yours, AFTER delidding of course. Pic related is really popular with i9-9900K reddit """""enthusiasts""""" trying to get 5.1 GHz alongside their 360mm custom water cooling solutions.

Attached: Screenshot_20190507-121942(1).jpg (720x1118, 305K)

Yeah, I ignored that part because it was too stupid.

I cool mine with a RX480 and a RX360.

Attached: hqdefault.jpg (480x360, 16K)

Aww, did the truth just trigger you? Sorry, sweetie!

kino fans

Of course you do, never mind the fucking PEX lines sticking out the back of the full ATX case to the water chiller.

Damn this kid is salty

What truth? That you're an idiot who plays shitty games? I bet you're posting this waiting for some shitty cutscene from your goddamn /tv/ "game" to finish.

>i9 processors
>posts an image of an OC'd i5-9600k
What

>fucking refrigerant water chillers just to play video games 5% faster
Man, you seething intards are absolutely off the fucking rails. I have my 2600 OC'd to 3.9GHz at 1.25v on the stock cooler and get 80C temps max on prime95.

Attached: Blender_OC.png (1336x1460, 52K)

Can you elaborate?

>3.9GHz
>80C
Hahaha I remember when my 3930k did that in 2011 oh wait

>can't get hot if the performance is naturally low!

Attached: 1.jpg (717x359, 28K)

Not sure why they used that desu. But still, it's sad that intel is spitting on its customers' faces. I'd almost feel sorry for them if not for how fucking stupid they are.

>"der8auer decided to lower the thickness of Intel's i9-9900K silicon by "lapping" the processor using ultra-fine sandpaper,retesting the processor after removing 0.15mm and 0.20mm of material from his i9-9900K. After lapping his i9-9900K der8auer was able to achieve temperatures that were five degreeslower than his already delidedprocessors, offering a combined 13 degrees drop over a stock/soldered i9-9900K."

overclock3d.net/news/cpu_mainboard/intel_s_soldered_i9-9900k_can_offer_improved_thermal_performance_though_deliding/1

Attached: IMG_1899.png (1136x640, 1M)

Picture is not mine, I was just too lazy to post my case.

Read above.
Look to the left.

This is how it looked when I still water cooled my GPUs.

Attached: guts.jpg (1155x1573, 991K)

>Niggahertz meme
this is really embarrassing. There used to be a time when Intel boasted that less clockspeed meant more, back during the Core 2 era.

because nips can't into optimization.

Considering this is being done on the free stock cooler I got, I'm honestly impressed desu. Got my 2600 for $129 at Micro Center, and spending hundreds for 5-10% higher performance on, say, a Ryzen 3000 Zen 2 chip isn't really on my wish list. I'm fine with just upgrading to Navi from my RTX 2060 desu.

Who are you quoting? Is this a projection?

Agreed. It's a nice little web browsing CPU for mom.

p-put the burden on the GPU to make AMD look less bad!

Because AMD CPUs are not optimized for software.
They think that slapping more cores and bruteforcing is efficient. Their single-core performance shows how wrong they are

This. World War Z has left the entire nvidiot community shaken, the brute force shader optimization on that thing is fucking insane.

Though in all fairness nips were always just island jungle monkeys at heart.

Not him but I have a C2Q, soon to upgrade to Zen 2, and he's right. Intel has been bashing AMD for using surface-of-the-sun boiling hot frequencies for a while.

Attached: World-War-Z-1920x1080-Vulkan.png (805x935, 61K)

Idiots like you give OC headroom too much credit when efficiency at lower clocks will trump OC headroom in the long term.

This is the real reason why AMD isn't attempting to bruteforce anything. They've learned from the Bulldozer shit (those things could overclock well but still had shit IPC), and from what Intel learned with the P4.

Imagine if both companies had your stupid brain. We'd have Pentium 5s with a single core, and liquid nitrogen would have to be the norm for MUH GAMING

>bruteforcing.
Not really. If you idiots had your way, we'd still be waiting on Itanium to be 'perfected', because x86-64 wouldn't exist, because it's "low standard AMD shit"

Ah, it was a projection after all.

Tfw my overclocked 2600K from 2011 performs better than AMD's flagship CPU from 2019

I don't get it. All the games you're talking about are on high-budget game engines that benefit from more threads. This smells an awful lot like projection to me.

>even his cherry-picked benchmarks show intel winning

>just to play video games 5% faster
Maybe 5% overall, but not in select games with high refresh rate.

>2600 OC'd to 3.9GHz at 1.25v
Is this supposed to be impressive? I ran my old 5820K (also 6/12 CPU) at 4Ghz at 0.98V and it was stone cold.
Its max OC went up to 4.8GHz and it would stomp your Poozen into the ground at those clocks. Welcome to 2014 performance bro.

Because the code is 95% middleware.

>high budget games with paid for by jews.
Fixed that for you.

>let's bottleneck the GPU, this will surely show them!

So AMD = jews. Got it.

>Its max OC went up to 4.8GHz and it would stomp your Poozen into the ground at those clocks. Welcome to 2014 performance bro.

>Niggahertz meme once more.

>If I keep saying it, it will be true!

WRONGTHINK

WEEEEEEEEEEEEOOOOOOOOO
WEEEEEEEEEEEEOOOOOOOOO
WEEEEEEEEEEEEOOOOOOOOO

like what? Cowwadoody? Rise of the Shit Raider? Fortnite Kids edition?

>In my intel-only world, where x64 doesn't exist. Where Intel kept on adding nigguhertz

>4Ghz at 0.98V
Doubt.
Fire up Prime95 so we can laugh at your "stability"

>Also a little side tracked but why did it take vega SO LONG to catch up to RTX?
?
RTX released after vega?

you do realise weeb games are NOTORIOUSLY poorly programmed right?

What's even your argument except random sperging at anyone that mentions one of the specifications that affect CPU performance, the core clock?
Yes, my old overclocked CPU from 2014 is faster than your Poozen from last year.
Yes, my old overclocked and undervolted CPU from 2014 ran cooler than your Poozen.

Why do AMDrones think that Poozens are some super efficient and cold chips?
You can downclock the i9 9900K to Poozen frequencies and it'll run just as cool and efficient.
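
The rough dynamic-power math behind that claim (simplified CMOS scaling, with made-up voltage and clock figures): power scales roughly with voltage squared times frequency, so dialing clocks and voltage back shrinks heat output quickly.

```python
# Simplified dynamic power scaling: P ~ C * V^2 * f (leakage ignored).
# Voltages and clocks below are illustrative, not measured values.
def relative_power(v, f, v_ref, f_ref):
    return (v / v_ref) ** 2 * (f / f_ref)

# e.g. a chip at 5.0 GHz / 1.35 V dialed back to 4.0 GHz / 1.10 V
print(f"{relative_power(1.10, 4.0, 1.35, 5.0) * 100:.0f}% of the original power")
# -> roughly 53%
```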

I haven't had that CPU for over a year now but yes it was P95 stable.
You can find similar results on the forums, 4ghz at around 1.0v is nothing strange for the 5820K, just a bit of luck with the silicon lottery.

>I haven't had that CPU for over a year now but yes it was P95 stable.
>You can find similar results on the forums, 4ghz at around 1.0v is nothing strange for the 5820K, just a bit of luck with the silicon lottery.
Then those were damn fine chips. Intel at its finest

Ok user, the only real reason you'd ask for 240fps is competitive gayming, so please tell us which games you play so competitively that you get invited to tournaments around your country or the world. If that's not the case, your argument is useless

How many times is this pathetic excuse of a shitpost going to be posted?

Not him, but no, your old overclocked intel cpu isn't faster than the new overclocked amd cpu, and anyone who isn't a teenager just getting into PC gaming knows this. Your core clock factoid isn't relevant. Exactly half of the time, the weaker architecture in any given year is the one with the higher clock.
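
To put the clock-speed argument in numbers (the IPC figures below are invented placeholders, purely to show the shape of the argument): single-thread performance is roughly IPC times frequency, so a newer core at a lower clock can still come out ahead of an older core with a bigger overclock.

```python
# Single-thread performance ~ IPC * frequency.
# IPC values are made-up placeholders, not measurements of any real chip.
def perf(ipc, ghz):
    return ipc * ghz

old_chip = perf(ipc=1.00, ghz=4.8)   # older architecture, big overclock
new_chip = perf(ipc=1.35, ghz=4.2)   # newer architecture, lower clock
print(f"old: {old_chip:.2f}, new: {new_chip:.2f}")  # 4.80 vs 5.67 -> newer wins
```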

Reminder that "competitive gayming" are codewords for "I stream for Zoomers and I'm paid to advertise gayming gear"

Except that at the same clock speed they have the same performance, and on non-gaming tasks the Ryzen has it beat.

Intel is good in games with poor engines

Better than shilling AMD products on an image board for free while pretending to be """productive"""

It's better than pretending to be a Pro™ Gamurrr™