Ryzen is trash for gaming and emulation

>Ryzen is trash for gaming and emulation
>Intel's 9th gen is overpriced to hell
what a shitshow, I'm sticking with my Kaby Lake for another year

Attached: 1541946188782.jpg (1883x1080, 235K)


Pretty sure that first statement is wrong.

>sticking with rebranded pentium 3 in current year

what CPU you have rn?

How is ryzen trash for gaming?
Emulation difference is minor unless you are emulating current gen
Gaming: not the best, but far from trash
Upgrading is not worth it unless you have a clear goal you want to achieve imo

i5 7500

>corelet
kek

2039, Intel releases its new socket for the ultranew 14++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++nm CPU "Ivy KekLake"

You forgot two + signs. Every plus sign means another two hardware vulnerabilities.

what would have possessed a person to buy a kaby lake processor

>gaming and emulation

Attached: consider suicide capcha.jpg (300x57, 7K)

>i5 7500
4 cores in 2019 LUL

>Ryzen is trash for gaming
Uhh, it gets like 5-10 fps less than the top end intel processors at 1080p. I don't think it fucking matters if you are getting 185fps or 178 fps.
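For perspective, that fps gap is tiny once you convert it to frame times, which is what you'd actually feel. A quick back-of-the-envelope sketch, using the numbers from the post above (the function name is my own):

```python
def frame_time_ms(fps: float) -> float:
    """Time budget per frame, in milliseconds, at a given frame rate."""
    return 1000.0 / fps

# 185 fps vs 178 fps, as quoted above
delta = frame_time_ms(178) - frame_time_ms(185)
print(f"extra time per frame: {delta:.3f} ms")  # roughly 0.21 ms
```

So the "5-10 fps" difference works out to about a fifth of a millisecond per frame.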

>he has a post-skylake Intel processor
>he wants to upgrade after only 3 years

cope

Attached: file.png (1112x772, 217K)

cpu.userbenchmark.com/Compare/Intel-Core-i5-7500-vs-AMD-Ryzen-5-1600X/3648vs3920
holy shit you should kill yourself son
>same year
>cheaper
>rapes it

I bought an 8600k for less, what the fuck?

Guyz whan new AMD prosessur cum out eh?

Ryzen isn't necessarily "trash" at gaming, although it's undeniable that there is a gaming performance deficit, most likely due to the latency between the cores.

Attached: Ryzen 1440p Bottleneck.png (2560x1440, 3.34M)

>Uhh, it gets like 5-10 fps less than the top end intel processors at 1080p.
In this example, it's getting 30 fps less on average at 1440p.

sauce?

Is the 2700x worth anything at this point?

the advertisers are getting more and more desperate.

And people who own 120+ hz 1440p monitors are rare so what does it matter if he's at 130 or 170?

Average framerates are not what your eyes will notice.
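This is exactly why reviewers quote "1% lows" next to the average: a handful of long frames reads as stutter even when the mean looks fine. A minimal sketch of both metrics from a list of per-frame times (the function name and sample data are made up for illustration):

```python
def fps_stats(frame_times_ms):
    """Return (average fps, 1%-low fps) from per-frame times in milliseconds."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # "1% low": the average fps over only the slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

smooth = [10.0] * 100           # a rock-steady 100 fps
hitchy = [8.0] * 99 + [208.0]   # same 100 fps average, one big hitch

print(fps_stats(smooth))   # identical average and 1% low
print(fps_stats(hitchy))   # same average, but the 1% low collapses to ~5 fps
```

Two runs with the same average framerate can feel completely different, and only the second number shows it.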

>due to the latency between the cores.
care to elaborate on that?
you can skip it if your whole argument is "core latency is bad, mmkay?"

LMAO you AMD shills won't stop making excuses for when Ryzen falls miserably behind. SAD!

100% guaranteed if it was the other way around with Intel 30 fps behind and AMD 30 fps ahead, you faggots would be saying "INTEL BTFO!"

Yes, high core latency is good. High core latency definitely has no effect on average frame rates.

Attached: 1516290255622.jpg (552x661, 71K)

You're retarded

Yea, ryzen is bad at games because it's a shit design

Damn, you BTFOd me. Not sure if I'm going to be able to sleep tonight with how badly I got BTFOd.

>definitely
I am not playing lotto here.
wow, so many arguments.
do you know what makes it so bad then?
what happens in s/w that makes it so bad?

>wow, so many arguments.
>do you know what makes it so bad then?
>what happens in s/w that makes it so bad?

It just underperforms. Every single CPU-bound game, even the best-optimized ones, runs 20% worse on an equivalent Ryzen

The STATE of shitel retards

Attached: 1545672612521.jpg (400x600, 188K)

INTEL BTFO!

Attached: 1534204099106.png (1920x1080, 611K)

Ouch.

Attached: image.jpg (1136x640, 87K)

>It just underperforms.
so this is the most scientific explanation here?
I thought someone would comment on cache coherence being slow on ryzen: keeping data consistent across multiple CCXs reduces performance.
...and all I got was "core latency is bad, mmkay"
It seems to me that none of you faggots knows anything about cpus, you just read boxes like ltt.
I don't get why it's always the cpu's fault.
Wasn't it 2 years ago that amd was at fault for novidia's frame drops? ...which turned out to be a shitty novidia driver.
can you at least prove to me that the driver is not responsible for the lower performance, and of course the game itself.
From my POV, all I see is TR and EPYC obliterating Zions, and in every case TR gets more performance in games than, say, the 2700x.
don't TR and EPYC have more latency than Zions?

The problem with Ryzen and gaming is just that engines like UE4 and Unity are optimized for Intel and haven't had any changes made for Ryzen.

It's not like anybody was gonna spend a single dime optimizing for Bulldozer.

This might actually mean the performance gap will decrease as the years go by.

it must be that secret latency sauce that no-one can explain.

THREE TIMES more FPS, man Intel is absolute shit.

Attached: 1543967632202.png (632x535, 13K)

>Having to resort to iGPU benchmarks
Damn, AMD shills have to scrape the bottom of the barrel.

People want the 10 extra FPS now, not huge gains in the future. It's a matter of checking the benchmark and buying the best. The FPS in the graphs is tangible, a better market isn't. Thus everything will continue to be optimized for Intel and Nvidia only in the near future. At least the reports say Zen is selling well.

I found a really cheap 2600x for sale, should I pull the trigger? the AM4 ITX mobos are cheap for some reason.

Intel is losing marketshare, in part because they are selling at capacity. They literally can't produce enough processors because yields are too low. Companies like Dell are forced to use alternatives, and the only real one is AMD.

This also means games will have to be optimized for AMD, though it'll be a year or two until these games start coming out at the soonest.

3x, sweaty

I'll help you with some charts, you retards.
how come the TR 2950x, a fucking 16c32t with more latency than the 2700x, gets more performance than the 2700x?
It's as if you are full of shit, with your snake oil explanations.

another shit answer.
>latency bad, mkay?
>why?
>peeple buy what peeple by

we can safely conclude that none of you retards has any idea what you're talking about. you blame latency, but you don't know if it's the cpu, the driver, the game, or even the windows scheduler.
If I wanted /v/ answers, I'd already be in /v/, not here.

Attached: Ashes.png (1321x1501, 67K)

Jow Forums is mostly /v/tards

>begging for an explanation as to why latency is bad

Attached: +_dcc74ccea90440fbbb7be441c8dec455.jpg (766x690, 106K)

>you blame latency
I don't blame latency. I blame people.

they can't be that stupid that they don't even know the windows scheduler was updated to stop cross-CCX thread creation.

backplane servers have several μseconds of latency and they use 10gbase-kr. what's your argument exactly?

every online retailer shows that 7 out of the 10 most sold cpus are AMD, except for a few shitholes like india where they buy intel, and mostly c2d.
so, we have it as fact that your excuse is invalid.
I didn't ask anyone what they feel or what they believe in. I asked how the latency causes low performance.
I even posted the threadripper chart to show you that the TR performs better than the Ryzen, even though "latency bad, mkay?".

>except from a few shitholes like india where they buy intel and mostly c2d
sir ples delid dis post

they must buy c2d to get that intel golden Zion 56 core experience.
You know, that NUMA design where there are 2 chips on one package and one chip doesn't know the other chip is next to it; only the OS knows, and it doesn't tell either of them.

Straight from AMD retard

Attached: game_mode_amd.png (806x391, 44K)

that's a cpu microcode switch that is already obsolete with the updated windows scheduler.
this feature is more useful under linux, where the scheduler is fairer.

This post is probably bait, but for the price Ryzen is fucking amazing. I got an R5 1600 for $74. That's an out-of-this-world price-to-performance ratio.

user, these retarded NEETs barely know what a CPU is, why did you expect they would understand anything you said?

also, to make your day less of a nightmare, have a look here: techspot.com/review/1678-amd-ryzen-threadripper-2990wx-2950x/page5.html
it's where I took the chart above from.
they have the f1 benchmark in normal mode and "gaymen" mode for the threadripper.
it doesn't change at all, does it? well, that's what I said here anyhow. read your shill books, I might be around the next time you start shilling again.

>loses to intel processors
>only beats other ryzen
>i got you now!
What the fuck? Are you retarded?

>Having to resort to a 150+ fps benchmark at 240p just to make Zen look bad
Damn, these Intlels...

I mean, in my eyes it's pretty bad that it can't at the very least match a 5-year-old Haswell. But I guess to the hardcore AMD fans, it doesn't matter.

I purchased a domain and started developing, with the intel Pin tool library, a binary analyzer for tracking weak points in cpus, e.g. why does this gpu get fewer draw calls with a ryzen compared to an i7? is it the driver? let's analyze it. is it the game? let's analyze it. is it the pcie? let's stress it with our driver.
but then I thought about how wasteful that would be for me, since they barely understand ltt.

classic shitposter that scrolls through a thread and reads only posts that have a picture on the left.

>5 year old Haswell
that's because your 5 year old jewlake is at 100%, stuttering like shit,
while the ryzen idles at 20% on the same load.
if you keep using pajeet software, every cpu can be fast.
can someone post the typing comparison gif of the ipad vs the pentium III? I can't find it.

user, that project sounds really great, I hope you keep at it!

>8 core 16 thread i7-5960X at 100% but 8 core 16 thread 2700X at 20% on the same load

Attached: 16e.jpg (903x960, 52K)

That could be a relevant point IF Intel had developed a better chip since 2013, but given that it was Moar Ghz and no IPC gains, your argument becomes not that relevant; heck, in 2017 you could say the same about the 7700K...

>legacy software and backwards compatibility will always be seen as "trivial" to major corporations

I hate this world.

Attached: 1557376950991.gif (640x360, 303K)

Are you new to computer parts? No offense intended. Intel parts were more expensive before ryzen because literally nobody bought amd.

Msg me if you need serious funding for that. It's 3:30 am here now. I'm gonna sleep and read it later.

Avg core latency. Latency within a CCX is actually lower than what Intels have. On Linux with Cemu I can lock my OC'd 1700 to just a single CCX; if core latency were really the issue, I should see a massive performance boost, right? It didn't make any difference at all
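The single-CCX locking mentioned here doesn't need anything exotic: on Linux you can restrict a process's affinity before launching it. A hedged sketch; the CPU set below is only an assumption for a 1700, actual core-to-CCX numbering varies, so check `lscpu -e` first, and the emulator command line is hypothetical:

```python
import os
import subprocess

# Assumed first-CCX logical CPUs on a Ryzen 1700; verify with `lscpu -e`.
FIRST_CCX = {0, 1, 2, 3}

def run_pinned(cmd, cpus=FIRST_CCX):
    """Run `cmd` with its CPU affinity restricted to `cpus` (Linux-only)."""
    usable = set(cpus) & os.sched_getaffinity(0)
    if not usable:                 # requested cores not present: run unpinned
        return subprocess.run(cmd)
    # preexec_fn runs in the child just before exec, so only the child is pinned
    return subprocess.run(cmd, preexec_fn=lambda: os.sched_setaffinity(0, usable))

# e.g. run_pinned(["cemu", "botw.rpx"])  # hypothetical emulator invocation
```

The same effect from a shell is `taskset -c 0-3 <cmd>`.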

Poorfags and freetards can only afford AMD so they shill it constantly.

>ryzen is trash for gaming and emulation
your sources are trash. unless you really need to hit that 240fps in your battlefield 24 armageddon, ryzen is doing well in most modern vidya. same with emulation: i'm emulating ps3 titles on a 1700 and i barely get any minor frame drops in persona 5
>intel 9 is overpriced to hell
9600k is ryzen 2700 money, i'd say it's a fair price by intel standards

everything you think is wrong

Sweet Jesus Pooh! That's not sauce You're eating POZ LOADS!

Attached: proxy.duckduckgo.com.png (500x370, 85K)

>hurr durr ryzen cant get 10 more fps to match intel
>ITS TRASH
meanwhile it offers 5% less performance for 30 to 40% less money

>very low settings
>on an older game
>on high-end CPUs
Do you expect someone to notice the difference between 400 and 500 fps?

Where is the proof that Ryzen is trash for emulation? People have been saying this for years, but I haven't seen any proof since Cemu got updated to run fine on it. So even though there's no longer any proof, people still repeat it as a blatant lie.
It runs Persona 5 and BotW at 60fps stable.
What is more demanding than RPCS3 and Cemu?

It's not like you can run emulated games at over 60fps most of the time so why would you need something more powerful than what can run them at 60fps?
How retarded do you have to be to overspend for a locked 60fps?

I emulate just fine.

I have a 5820K and it's still great. No need to upgrade yet. OC to 4.5

30-50fps is not 60

2377, Intel goes 5nm!

first, to state the obvious, op is a retard. hurr it's worse than intels, therefore it's absolutely irredeemable trash, and emulating vidya consoles makes no sense because you won't be able to play them anyway, and your locked fps in persona doesn't count because you'd get only 40fps in gran turismo 5 while some intel user would get silky smooth 55fps

it's not entirely as retarded though: you're gonna need those 10 extra fps to play some demanding piece of vidya that barely hits 30fps on a superior intel. if i wanted to own a pc exclusively to emulate ps3, i'd definitely go for intel. then again, if you're that committed to emulation, why not buy a $50 ps3 instead?

>It runs Persona 5 and BotW at 60fps stable.
how about providing proof for this first, dickcheese?

Current AMD and Intel are trash sadly; the Ryzen 7 2700x finally has the same IPC as my current i7 4790k, so for gaming it's a dumb upgrade. The only things I'd be missing out on are more cores and high-clocked ddr4. Time to play the "just wait for x" game for another year I guess

60fps is 60fps though.

youtube.com/watch?v=6JpEb-MGiHo
Virtually locked at 60 in BOTW. This is not uncommon and there are tons of videos of the same. I ran at 60 on my 1600X, even. And that was one of the most demanding Cemu games.
youtube.com/watch?v=vWOWI3TWYCY
There's a few dips in Persona 5, but otherwise 60fps. Still runs better than a 9600k.

>nooo it's not fair if only people made 240fps patches for the 30fps console games then intel would win

>if i'd want to own a pc exclusively to emulate ps3
Not gonna buy a $500 CPU and $225 board just to emulate PS3 marginally better.
When the 2700X does emulation better than even the 9600k that retards shill, what's the point?

It's not a dumb upgrade when you only get 35fps with the 4790k in Kingdom Come Deliverance but you get 60 with the 2700X.
But sure if all you play is old games, then 4c/8t is still fine.

1080p lowest settings is real-world performance testing for FPS games. You want things to be sharp (so native res) but at really high fps. Intel excels at that. Fuck anyone who's shilling AMD in heavily GPU-bottlenecked scenarios; a bottlecap could perform the same as an i9 in the 1440p ultra settings test graphs I've seen thrown about.

Go ahead and test games in 640x480, I'm sure 352 fps is so much better than 331 fps

Attached: 15241006322z6uo19egw_5_1.png (619x452, 17K)

Yea, on games that are a lot more CPU-bound the performance difference really shows; luckily it's only a handful of games.

this is obviously bottlenecked by something else (memory? engine? some bus?). Sure had to scrounge a benchmark outta the bottom of the shill bag. Show me battlefield, pubg, csgo, apex legends, not fucking lost planet from 2006

>The problem with Ryzen and gaming is just that engines like UE4 and Unity are optimized for Intel and haven't had any changed for Ryzen.

Shouldn't UE4 at least have some optimization, considering the current Playstation and Xbox both have AMD chips (even if it's just Jaguar)? Even then, with the next generation both being on a Ryzen derivative, there should definitely be improvements.

ryzen is trash for emulation, yes, but i haven't seen any performance issues with it when gaming.

Jaguar is a low-power cat core, not Zen

LOL 5% MORE FPS AMD BTFO
>csgo
lol

Attached: untitled-17.png (667x522, 21K)

>Ryzen is trash for gaming and emulation
Slightly worse than Intel in STP =/= Trash
You're a retard and I refuse to give (You) a (You).

the i5 9600k is cheaper than the 2700x where I live, why should I bother with amd?

200 fps is the engine limit for frostbite. some people are dumb as fuck jesus

HEDT benchmark, very kek. 9900k vs 2700x or 9400f vs 2600?

What is the comic in the OP

so now dick waving about cs go doesnt matter anymore?

It runs ffmpeg better, so my bet is game devs are just fucking retards and video games are low-quality toy applications, as usual.
Seriously, who gives a fuck about playing AAA games in the year 2010+? They're all such normie trash.

The problem is mostly HEDT. Nobody cares about HEDT for gaming unless you're some stupid youtuber who doesn't have a dual-PC setup.
Where's 9900k vs 9700k vs 9600k vs 9400f vs 2700x vs 2600? Or would that csgo benchmark absolutely BTFO AyyMDrones?

>Seriously who gives a fuck about playing AAA games in the year 2010+?
store.steampowered.com/stats/