Serious question: what is it specifically about AMD's architecture that makes it perform so catastrophically badly in games? Not trying to generate butthurt, I'd just like to learn.

Attached: 1565905867412.jpg (1920x1080, 528K)

Other urls found in this thread:

youtu.be/Iva7lSdAIxQ
techspot.com/review/1885-ryzen-5-3600-vs-core-i5-9400f/
youtube.com/watch?v=SY2g9f7i5Js
gskill.com/community/1502239313/1564640918/G.SKILL-Releases-Optimized-DDR4-3800-CL14-Memory-Kit-for-AMD-Ryzen-3000-&-X570-Platform
overclock.net/forum/375-mice/1550666-usb-polling-precision.html
github.com/dobragab/MouseTester/releases

there's some sort of latency issue.

It requires VERY fast RAM.
I assume most reviewers have regular run of the mill gamer ddr4.

>ITT "secret sauce"

They use the Intel compiler, which doesn't take the optimized code paths when it detects a non-Intel CPU
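For anyone wondering what that kind of dispatch even looks like, here's a minimal sketch of a vendor-string check via CPUID. This is just an illustration of the concept, not the Intel compiler's actual dispatcher (which checks more than this); builds with MSVC.

// Minimal sketch of branching on the CPUID vendor string instead of on
// actual feature flags. Illustration only, not Intel's real dispatcher code.
#include <intrin.h>   // __cpuid intrinsic (MSVC)
#include <cstdio>
#include <cstring>

static bool is_genuine_intel() {
    int regs[4] = {0};                      // EAX, EBX, ECX, EDX
    __cpuid(regs, 0);                       // leaf 0 returns the vendor string
    char vendor[13] = {0};
    std::memcpy(vendor + 0, &regs[1], 4);   // EBX -> "Genu"
    std::memcpy(vendor + 4, &regs[3], 4);   // EDX -> "ineI"
    std::memcpy(vendor + 8, &regs[2], 4);   // ECX -> "ntel"
    return std::strcmp(vendor, "GenuineIntel") == 0;
}

int main() {
    // A dispatcher keyed on vendor sends non-Intel CPUs down the generic path
    // even if they support SSE2/AVX2; a fair one would test the feature bits
    // from CPUID leaves 1/7 instead.
    std::printf(is_genuine_intel() ? "optimized path\n" : "generic fallback path\n");
}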

chiplet design.

It hasn't got the blessing of Zion

with DDR4-3800 CL14 the 3900X is almost completely at parity with Intel running the same RAM speed
problem is that RAM like that is either hard to get, since you need to get lucky with the bin, or very expensive.

>catastrophically badly
>shows an example of an extremely well optimized game that even a 750 Ti runs at high settings
I don't understand what exactly you have to complain about when even average fps is well in the hundreds. The fuck do you need so much fps for?

144 Hz screens.

Why would you buy AMD even if it was 1% slower for the same price? What is the point?

What do you mean? The FX series outperformed its competition in the more CPU-heavy areas of games such as The Witcher 3.

probably this.
Developers tend to use generic libs so it's much easier for them to make their games cross-platform.
The software that actually keeps up to date with new CPUs is usually the Linux kernel and maybe some compilers; the rest lags behind by half a year to a year.

>144 Hz screens.
It's dubious if anything above 120 is even noticeable.
>Why would you buy AMD even if it was 1% slower for the same price? What is the point?
Well, let me assure you that such a small difference in pricing is only a thing in the US and maybe a couple of other countries.

lying piece of shit

No mate, I think you still don't get it, do you? The 9400F is $145 whilst the 3600 is $199

Attached: 2454.png (1268x673, 207K)

And another thing - Witcher 3 is an RPG, not a first person shooter or a competitive online game. Not a good example really. And again, it's not yet certain if anything above 120 Hz can even be perceived.

>2080 ti 1080p
every time

what's wrong? seems like a reasonable choice to eliminate gpu bottleneck, it is a cpu comparison after all

yes it can, just because you cant doesnt mean u need to cope with ur hertzlet ego, 144 hz vs 165hz is very noticeable and so is 240hz. and its more than just FPS that gets better with higher the Hz.

I mean, fine, but you still have to factor in the cost of a new motherboard. Also, that first benchmark was performed with the APU, right? I mean, why? What's the point of that? It's only really beneficial for laptops, and no one seriously plays games on those.

>yu just dan see it mang
Yeah, great explanation and nice sources.

APU? I'm not seeing 2400g or such anywhere in the benchmarks. x570 mobos are typically more expensive than z390 ones by the way

Well, no one's posted a source and fast RAM vastly improves performance of Ryzen APUs. And it says 3600 MHz DDR4.

I don't think you know what apu is

I overclocked my DDR4-3200 to DDR4-3400 and got 90 points in Cinebench with an r5-3600, they like fast ram. Guess i should buy a 3800 kit.

Whatever. There's still no source for the screenshot and I don't have telepathic abilities.
And yes, Ryzen APUs benefit from fast RAM, this isn't a secret.
youtu.be/Iva7lSdAIxQ

why would i link you sources since ur too retarded to seek them urself, only thing u can do is gloat ur worm ego that is nerfed in the head since birth with clever wording and correct grammar

Attached: 1555244235001.png (571x618, 29K)

What in the actual flying fucking fucking fucking fuck are you even talking about?

that u are a clueless retard too poor or too crippled in head to admit higher refresh rate than 120/144 matters a lot. and that you have an inoperable reddit tumor aka your brain

If you're talking purely about frame rates, the lows are important too. This test is also done with a 2080 Ti:

techspot.com/review/1885-ryzen-5-3600-vs-core-i5-9400f/

Attached: come on.png (1093x333, 80K)

because it's not a real world comparison at all. you don't see the same kind of scaling when you run at real world settings.

>u ave a tumaourh braourh
Wow. Great explanation after another, thanks man. I totally see how refresh rates higher than 120 Hz matter and make sense, thanks for all the articles and sources.

Think they said the Windows scheduler wasn't utilizing the cores correctly, though that might not be the same thing as those crazy drops.

Also, one of the recent Windows updates was supposed to have addressed that; wonder if it worked.

it doesn't overclock as well.
most games and settings are on par, in that same video.
youtube.com/watch?v=SY2g9f7i5Js
>fake news
embarrassing.

>exact same thread every day
I really hope he is getting paid to shill this hard, it's probably just extreme autism though.

amd doesnt have ivy bridges.

Do you want a 1050 Ti and 4K instead?
Spoiler: the results would all be equal

>user posts a benchmark where AMD CPU gets utterly BTFO at 1080p
>"Lmao, no one with a 2080ti seriously plays games at 1080p retard xdddd"
>user posts a benchmark where AMD CPU gets utterly BTFO at 4K
>"Dude, 4K benchmarks for CPUs are fucking retarded, the proc is obviously being bottlenecked by the GPU, nice try Intcel shill"

Why are AMDrones so fucking desperate and outright lying to themselves? They always post those aggregate benchmarks where AMD seems like it's only slightly behind or keeping up, but every time someone makes an extended in-depth benchmark of any game, AMD shows its true colors, getting completely and utterly BTFO by Intel.
Then they either straight-up deny the truth or blame the mobo BIOS/Windows real-time clock/literally any excuse you can imagine, and argue that the performance will get better as time goes by.

There is literally no person more delusional than someone who bought an AMD product.

Attached: 68105659_p0_.jpg (1066x1491, 178K)

It's literally just clock rate now.
If you run the Intel chips at the same clock rate as the 3xxx parts, AMD wins.
But since you can push Intel chips over 5 GHz, they end up faster in games that don't scale well across many cores (everything not built from the ground up for Vulkan/DX12, like Doom 2016).
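Back-of-the-envelope version of that argument, treating a single-thread-bound game as roughly IPC times sustained clock. The numbers below are made up purely for illustration, not measurements:

// Toy model only: relative performance of a clock-bound game thread
// approximated as IPC * clock. The IPC and clock figures are placeholders
// for illustration, not benchmark results.
#include <cstdio>

int main() {
    double ipc_a = 1.07, ghz_a = 4.3;   // hypothetical chip A: higher IPC, lower clock
    double ipc_b = 1.00, ghz_b = 5.0;   // hypothetical chip B: lower IPC, higher clock

    double perf_a = ipc_a * ghz_a;      // arbitrary units
    double perf_b = ipc_b * ghz_b;

    // With these numbers B comes out roughly 8-9% ahead purely from clock,
    // even though A wins at matched clocks.
    std::printf("A = %.2f, B = %.2f, B is %.1f%% of A\n",
                perf_a, perf_b, 100.0 * perf_b / perf_a);
}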

>If you run the intel chips with the same clock rate as the 3xxx, the AMD wins.
*Dun Dun Dun*

Attached: Zen 2 IPC (2).png (1362x767, 278K)

Not bad, but i bet i can find tests where the intel gets defeated.
When its this close, it ends up on a case by case scenario.

In short:
AMD tries to be general-purpose, while NVIDIA/Intel cooperate on a lot of use cases that just cover the normie ground.

>3800
>CL14
If this existed, holy shit.

1. Ryzen is shit at draw calls
2. CCX/chiplet design has room for improvement
3. Devs love to optimize and build their games around Intel with AMD 2nd in mind
4. Zen is a modular design, meaning it wasn't made specifically for the consumer market, hence the lower numbers
5. Windows is garbage
6. Lower clocks
7. Can't OC well
8. Latency is still an issue
9. Zen favors very very fast RAM
10. You can't have it all

Why do we need this stupid fucking thread every god damned hour?

WE FUCKING GET IT: the buyer's remorse from buying an Intel processor is soul-crushing and debilitating, but spamming this same fucking thread every hour won't undo the shitty purchase decision you've made.

Attached: OC.png (1063x1950, 288K)

See, another aggregate benchmark.

Attached: 1546724549457.jpg (1440x1080, 161K)

gskill.com/community/1502239313/1564640918/G.SKILL-Releases-Optimized-DDR4-3800-CL14-Memory-Kit-for-AMD-Ryzen-3000-&-X570-Platform
it exists and is one of the most insane kits I've ever seen.

I haven't bought anything new recently; I'm curious if the Windows scheduling patch actually fixed the issue.

Because AMD cpus are great at gaming.
There's no reason to buy Intel anymore.

based

And now the amd shilling is making me think there is a genuine issue to avoid.

>comparing 12 core to 8 core

Why did you quote everyone? I for one don't really care that much because I'll be able to afford all of this in like 2024, I just pointed out some flaws.

Intel's 12-core would require multiple refrigerant water chillers to OC to 5GHz. When you find someone willing to risk burning his entire city block to the ground for gaymen benchmarks you let us know.

Because I want people to use pic related to shut these retarded threads down.

>mass reply coping
>still posts benchmark where AMD is loosing

Attached: 1565298234010.gif (427x240, 904K)

>optimized for X570 boards
But I want a B450

How does it shut anything down when it doesn't help?
Because Ryzen 3000 just came out, there are BIOS compatibility issues with some boards, and I still don't know if the Windows scheduling thing has been fixed, or if that was even the cause of the latency.

>losing
You forgot price, temperature, and a few other stats

By 5% on average. Ask yourself this: is 5% REALLY worth spending $200+ on a high-end Z-series motherboard with a hundred phases, chokes and VRMs, plus $200+ on a triple-fan 360 mm AIO?

>You forgot price
No I didn't, the 3900X costs more, not to mention the motherboard and "magic" B-die ram costing double

Attached: 1565749519282.png (2523x820, 605K)

this isnt reddit where self-psycho forward loop sarcasm works worm, you are clueless about anything related to refresh rate so you can only spew dogshit out of ur mouth "120 hz doesnt matter"

You don't need magic B-die RAM anymore, and a $60 board will have no problem with a 3900X. And if you're gonna resort to cherry-picked benchmarks, then kindly explain pic related.

Attached: Screenshot_20190716-093305.png (1280x720, 332K)

LTT actually has done a few videos regarding the impressive scaling on Ryzen past 3600 MHz.
Why no 3900X in your pic?

Attached: Untitled.png (1920x1080, 2.08M)

there is a reason Poozen is benchmarked in FPS only and not input lag or stutter: it's because the stutter is so huge, due to latency, the Windows scheduler and multiple cross-CCX hops, that your mouse/keyboard movement would stutter as if u just had a stroke

70 fps with 10us CONSISTENT, non-cross-CCX latency
is 10x better than 100 FPS with 100-300us cross-CCX hops, a fucked up scheduler, and fucked up IO thread affinity bouncing the game input thread across CCXs

He's right though, it doesn't. Human vision literally cannot clearly discern detail displayed in intervals shorter than 10 ms (i.e. 100 FPS). Prove me wrong.

Objectively prove the latency issue is relevant IRL. Also explain pic related.

Attached: sotr_0.jpg (1019x606, 119K)

why are u so retarded? just by the way you write and word ur sentences i can instantly see you are a low iq worm, go read all the Blur Busters articles, there have been studies and papers by TOP NEUROSCIENTISTS and VR HARDWARE ENGINEERS showing that 1000 Hz at 1000 FPS is needed

there is no point in proving u wrong, you are a tiny low iq reddit worm, if u want to learn something go learn it on your own by reading Blur Busters articles and learning what MPRT is, as well as G2G times, and 10 other things

pic related is FPS, not input consistency.
AMD is an irrelevant CPU for anything past the cinematic 60 fps game experience, because no one in competitive esports wants their input fucked up. i dont give a fuck about irrelevant press-X-to-talk worms playing their 60 fps console ports.

Objectively download LatencyMon on your poozen as well as MouseTester and graph the results

>Intel gaming
>spend months fishing for a golden sample CPU that can reach 5GHz all-core
>buy an expensive motherboard that can provide enough power for 5GHz overclocking
>buy an expensive cooler that can handle the 200w of your overclocked CPU
>buy a $1300 video card to not bottleneck your CPU
>buy a 1080p 240Hz monitor for the same reason
>buy an expensive PSU to power the hungry CPU and GPU
>play games while enjoying 5% more fps
vs
>AMD gaming
>buy a CPU and everything else
>play games

Why does intel have such serious stuttering issues in half the games tested? I thought they had super uber low core latency.

Anyone mind explaining this phenomenon?

Attached: MHW.png (1373x1413, 49K)

>Accuses me of cherry-picking because I show 10 popular games
>Proceeds to post one which is GPU-bound, with HairWorks killing performance
It's not even worth arguing with these "people"

Attached: 1554225620685.png (1303x1068, 1.82M)

>5% more fps
>300% less input latency
>no cross context CPU calls that will make the game stutter

yes, it's called not showing a benchmark in a console port game that will be irrelevant 2 months down the line. try CS:GO latency

>because the stutter is so huge due to latency
Brainlet here, is what you're saying really true? Or are you just memeing?

The latency issue has been known for a while, don't know if it's actually fixed yet.

Attached: local latency.jpg (1773x996, 260K)

OBJECTIVELY. PROVE. THE. LATENCY. ISSUE. IS. RELEVANT. IRL.

I'm waiting

Attached: Screenshot_20190716-093341.png (1280x720, 309K)

>300% less input latency
You know input latency is measured in milliseconds, right? Using an AMD GPU over an Nvidia GPU is probably like a 1000x improvement in input latency compared to using an Intel CPU over an AMD CPU.

it's true, there is no other CPU out there apart from Poozen that has multiple core dies plus an I/O die, so nothing is optimized for it. Windows deployed a patch fix, but that's not enough because games need to "pin" the thread handling game input and IO to one CCX, and the problem is that Windows I/O dispatching also needs to be pinned on that same CCX, otherwise there are again cross-CCX input latency hops.
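To make the "pinning" part concrete, here's a minimal sketch of forcing a thread onto one group of cores with the Windows API. The affinity mask below assumes logical processors 0-11 all sit on the first CCD; that's purely an assumption about the topology, so check yours (e.g. with Coreinfo) before trusting a mask like this.

// Minimal sketch: pin the current thread to one set of logical processors
// so its work never migrates across a CCX/CCD boundary. ASSUMPTION: logical
// processors 0-11 belong to the first CCD (varies by CPU and BIOS; verify
// your topology before using a mask like this).
#include <windows.h>
#include <cstdio>

int main() {
    DWORD_PTR ccd0_mask = (1ull << 12) - 1;  // bits 0..11 -> logical CPUs 0..11

    DWORD_PTR previous = SetThreadAffinityMask(GetCurrentThread(), ccd0_mask);
    if (previous == 0) {
        std::printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("pinned; previous mask was 0x%llx\n",
                (unsigned long long)previous);

    // ... the latency-sensitive input/IO work would run here ...
    return 0;
}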

stop posting irrelevant FPS benchmarks, i dont own a Poozen CPU to prove it, look on r/Amd for LatencyMon posts or MouseTester charts

except DPC latency is measured in microseconds retard. try again. you do know what DPC/IRQ are, right inbred worm?

>still has posted 0 proof

>still trying to gaslight this because he is getting paid for shilling on 4c

imagine your life value being so low that u must post using paper psyops tutorials in order to earn enough to live.

Attached: 1565386292448.png (753x960, 29K)

>except DPC latency is measured in microseconds retard
Oh, you mean the things that don't fucking matter because human brains are too large to notice such small time intervals? Or are you such a retarded bugman that you actually have an insect brain?

>still has posted literally 0 proof
I see

>except DPC latency is measured in microseconds retard
so then show the test where amd has latency issues.
you cant, because its not a problem in games.

>his only argument is deflecting that small numbers dont matter because his brain is too retarded to know what context they are applied in

>4Ghz vs 2Ghz doesnt matter, its literally just few nanoseconds difference, too small for your brain to notice
>what is stuttering
suicide pls? ty

find it, there is a vid online testing this in cs:go and amd has 2x the latency over intel

oh there's a video that proved what you're saying is wrong actually. go find it.

>OBJECTIVELY. PROVE. THE. LATENCY. ISSUE. IS. RELEVANT. IRL.
Here you go, twice the amount of input lag, in a competitive game, waiting for you to cope with some bullshit

Attached: Ryzen Input Lag.png (1274x717, 341K)

>2700x
very nice

thanks user
what this graph doesnt show is inconsistency though, imagine doing 1000 mouse inputs and sound fires a DPC/IO request, so your mouse input gets scheduled to another CCX and you end up with a cross-CCX input stutter =)

Attached: 8763487462876286.jpg (2256x2393, 920K)

It's the difference between a game's controls feeling tight and sharp versus floaty. Of course, for shit players it doesn't matter anyway.

I didn't know 18 was twice 11.

holy shit fuck off lmao, you're comparing the input delay of the 2700x which gets lower fps. that has NO RELEVANCE AT FUCKING ALL to the 3900x comparison which surpasses the 9900k in fps in csgo. holy shit
>single digit IQ

>4Ghz vs 2Ghz doesnt matter, its literally just few nanoseconds difference, too small for your brain to notice
Oh wow, you're sooooo right! That's why 9590s are the best low latency processor! Great opinion.
>suicide pls? ty
Is this what zoomers say now?
>>what is stuttering
Gee, I don't know, seems like something that you'd be more experienced with.

Attached: Untitled.png (853x479, 521K)

there is a huge difference when you properly tune your PC/OS so that all IO requests take sub-10us and are processed with consistent polling vs random lags

do u know what consistency is?

overclock.net/forum/375-mice/1550666-usb-polling-precision.html
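If "consistency" sounds hand-wavy, this is roughly what tools like MouseTester measure (my own sketch, not their code): timestamp a stream of events with the high-resolution counter and look at the spread of the gaps. Here Sleep(1) wakeups stand in for 1000 Hz mouse reports.

// Rough sketch of a polling-consistency measurement: record the gap between
// consecutive "events" and report the average and worst case. Sleep(1)
// wakeups stand in for real input reports; actual tools hook raw input.
// Link with winmm.lib for timeBeginPeriod/timeEndPeriod.
#include <windows.h>
#include <mmsystem.h>
#include <cstdio>
#include <vector>
#include <algorithm>

int main() {
    timeBeginPeriod(1);                       // ask for ~1 ms timer resolution

    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);

    std::vector<double> gaps_us;
    for (int i = 0; i < 1000; ++i) {
        Sleep(1);                             // stand-in for "next input event"
        QueryPerformanceCounter(&now);
        gaps_us.push_back(1e6 * double(now.QuadPart - prev.QuadPart)
                              / double(freq.QuadPart));
        prev = now;
    }
    timeEndPeriod(1);

    double sum = 0.0, worst = 0.0;
    for (double g : gaps_us) { sum += g; worst = std::max(worst, g); }
    std::printf("avg gap %.1f us, worst gap %.1f us over %zu samples\n",
                sum / gaps_us.size(), worst, gaps_us.size());
    // Tight ~1000 us gaps = consistent input timing; occasional multi-ms
    // spikes are the kind of jitter people attribute to DPC/scheduling.
}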

FPS is not relevant to DPC latency in any way unless your CPU is at 100%

github.com/dobragab/MouseTester/releases
feel free to post your chart of frequency vs input with this tool if you are on Poozen
>using a single-player 60fps-capped cashgrab game to prove a point
like i said user, i dont give a single iota of a fuck about reddit worms playing their 60 FPS cinematic playthroughs, i care about bleeding edge autists and competitive esports.

>you're comparing the input delay of the 2700x which gets lower fps.
Wrong, the Ivy Bridge Xeon on DDR3 gets lower FPS but still less input delay.
>that has NO RELEVANCE AT FUCKING ALL to the 3900x
It's safe to assume the 3900X has even more latency due to 2 separate dies, and 1 IO die.
The architecture only changed for the worse with the die shrink

Meanwhile in the real world the 3900X is literally the best processor for CSGO. It's funny you'd choose that game since it's almost like AMD targets it, even the 5700 XT does great with its anti-lag feature on it.

>like i said user, i dont give a single iota of a fuck about reddit worms playing their 60 FPS cinematic playthrougs, i care about bleeding edge autists and competitive esports.
k

Attached: CSGO[1].png (1373x1413, 55K)

>gets his paid shilled CPU BTFOD
>starts mentioning GPU to keep the shilling going
insane

now lets see input latency and not irrelevant FPS beyond 500? what now shill? show me the DPC LATENCY and/or INPUT LATENCY while playing CS:Go of your paid shill CPU

>>gets his paid shilled CPU BTFOD
But it didn't. You're just telling lies by saying Intel has better input latency in csgo. It'd be like me saying that AMD has good input latency in PS3 ports which are known to get fucked by inter-CCX latency. You make it too obvious when you just lie about everything.

?
it's even worse for Zen 2 due to two fucking core dies and more threads, meaning a higher probability of the scheduler picking the other CCX to do your I/O.