Is Ryzen slower because it does bounds checking?

Attached: ACO_Ultra.png (1336x1998, 93K)

Other urls found in this thread:

youtu.be/ryzCY2eOdKM?t=236
youtube.com/watch?v=Hjrtw_CJKiU
phoronix.com/scan.php?page=article&item=amd-ryzen-aocc&num=1

it's slower because it's running 1GHz slower

>7fps
>at 1080p
>using a 1080 Ti
Truly my greatest ally!

Attached: ROTTR_2017_11_29_11_09_20_600.png (2560x1440, 3.99M)

This. It has the same IPC. AC Origins scales well with hexacores, but Intel is still stronger. Though Ryzen is cheaper, and 7nm Ryzen will do at least 4.5GHz with higher IPC than Intel. It's fine unless you run ancient resolutions at 144Hz.

The 8400 runs at 3.8GHz all-core, yet it's faster.
It probably has to do with latency from Ryzen's architecture.

>AC Origins scales well with hexacores
Explain the first two entries then.

Yes. Bounds checking is antisemitic.

PC developers have been optimizing games for Intel's ringbus architecture for a long, long time because AMD were completely irrelevant in the CPU space for so long, so it made sense. It's the same reason that the Skylake-X chips and their bingmesh perform relatively poorly in games compared to all other Intel chips - it's a different architecture. They use the same basic cores (it's literally just Skylake in that sense), so that's proof enough that Intel's cores aren't inherently superior. If they were, Skylake-X would perform just as well as the rest. The uncore is just as important.
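You can actually see the uncore difference directly, since two threads bouncing a cache line between cores is basically a pure interconnect benchmark. Rough sketch below (plain C11 atomics and pthreads, nothing vendor-specific; pin the two threads yourself, e.g. with taskset, and treat the numbers as ballpark only):

[code]
/* ping_pong.c -- measure core-to-core round-trip latency, the
 * uncore property that differs between ring, mesh, and Infinity
 * Fabric. Build: cc -O2 -pthread ping_pong.c
 * Run pinned:    taskset -c 0,1 ./a.out */
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>
#include <time.h>

#define ITERS 1000000

static atomic_int turn;  /* this one cache line bounces between cores */

static void *pong(void *arg) {
    (void)arg;
    for (int i = 0; i < ITERS; i++) {
        while (atomic_load_explicit(&turn, memory_order_acquire) != 1)
            ;  /* spin until the main thread hands over */
        atomic_store_explicit(&turn, 0, memory_order_release);
    }
    return NULL;
}

int main(void) {
    pthread_t t;
    struct timespec a, b;

    pthread_create(&t, NULL, pong, NULL);
    clock_gettime(CLOCK_MONOTONIC, &a);
    for (int i = 0; i < ITERS; i++) {
        atomic_store_explicit(&turn, 1, memory_order_release);
        while (atomic_load_explicit(&turn, memory_order_acquire) != 0)
            ;  /* spin until pong replies */
    }
    clock_gettime(CLOCK_MONOTONIC, &b);
    pthread_join(t, NULL);

    double ns = (b.tv_sec - a.tv_sec) * 1e9 + (b.tv_nsec - a.tv_nsec);
    printf("round trip: %.1f ns\n", ns / ITERS);
    return 0;
}
[/code]

Run it with the threads on the same CCX vs different CCXs on Ryzen, or on a ringbus vs mesh Intel chip, and that round-trip number is exactly where these architectures differ.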

It wouldn't surprise me if people actually ran a 1080 Ti with a 1080p monitor.

Source?

is a ryzen 7 running in legacy mode as fast as an incel lmao4coar?

Attached: 1520357650644.jpg (1078x862, 260K)

>intel is 1 ghz faster
>only 9 FPS difference IN THE 1%
ahahahahahhahahahahahahahahaha is this the power of shintel?
imagine poozen 3 @5 ghz poor shintel gonna be left in dust

There is literally a benchmark of the chips in question in the OP of this thread.

Neck yourself

Attached: 2600v8400.jpg (1280x721, 128K)

And most people only run shitty midrange cards like the 1050 and 1060, so they will never hit a CPU bottleneck unless they play at 720p low.

Hang on, these benchmarks say that an overclocked 2600 is 2% faster in AC: Origins than an 8400, but the OP says that it's slower. And both of those sets of benchmarks are from the same guy, Steve Walton, who works for both Hardware Unboxed and TechSpot.

I smell a rat.

Attached: frog.jpg (326x270, 24K)

Good job I play at 1440p

Attached: 1440p_MIN[1].png (1324x1665, 101K)

youtu.be/ryzCY2eOdKM?t=236

youtube.com/watch?v=Hjrtw_CJKiU

All the extra value in the 8700K ;)

>8700K
>167% the price for 15% more performance.

What memory is the 2700X using?

On top of the 8700K being slightly faster in games, I also emulate a lot, and Poozen can't into single core.

Buy a real PS2, faggot

>Emulator heavily optimized for Intel with the Intel compiler
Gee. I wonder why.

Name an emulator that runs at full speed on an Intel CPU and doesn't on a Ryzen CPU.

Makes sense though. If you're an emulator developer, struggling with performance issues, you're going to want to leverage your hardware as much as possible.
Since Intel has had the best single thread performance for around a decade, it makes complete sense that the emulator would be optimized for Intel, because it gives you the most headroom possible for your emulator.

I have one, Rajesh. I emulate to get higher resolutions and access to otherwise very expensive rare games.

Well of course it would. But it's very niche. Only a few autists emulate old PS2 games and Switch shit. But enjoy your shitty cel shaded Zelda at 30 FPS.

This. It's amazing how much Intel-optimised software there is out there, and Ryzen gets to within a few percent of it. I wonder how much Ryzen would lead if software were optimised for it.

>*NEW* Only old game emulators matter

A very good upgrade for Skylake-X is more L3. If they put more L3 in Cooper Lake/Ice Lake, it'd already outperform Skylake-X even without the IPC improvements/clock increases.

A fair bit, I would imagine. Intel was already caught cheating with their compiler, making it take slower code paths when it detected an AMP CPU.

Wasn't that during the Athlon days? Or have they tried it on again?

>AMD

They still do it to a certain degree, but people just don't use the Intel compiler for benchmarks.
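For reference, the dispatcher check everyone complains about reportedly keys off the CPUID vendor string ("GenuineIntel") rather than the actual feature bits. You can read the string yourself with the cpuid.h helper GCC and Clang ship (x86 only; quick sketch, not the actual dispatcher code):

[code]
/* vendor.c -- read the CPUID vendor string, the value the Intel
 * compiler's runtime dispatcher reportedly branches on.
 * Build: cc -O2 vendor.c */
#include <cpuid.h>   /* GCC/Clang CPUID wrapper, x86 only */
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0 returns the 12-byte vendor string in EBX:EDX:ECX. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    /* "GenuineIntel" vs "AuthenticAMD": dispatching on this string
     * instead of the feature bits is the behavior in question. */
    printf("vendor: %s\n", vendor);
    return 0;
}
[/code]

Dispatching on the feature bits (SSE4.2, AVX2, etc.) instead would give AMD chips the same fast paths.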

Who knows. Nobody has tested AFAIK. Maybe it's not possible to test it on the latest architecture. Which would be convenient for Intel.

>40 AMP CPU

Attached: 1524901703195.gif (250x250, 992K)

Why doesn't AMD make their own compiler optimized for AMD CPUs?

They do
phoronix.com/scan.php?page=article&item=amd-ryzen-aocc&num=1

How would an AMD CPU perform using AMD's compiler vs. an unaltered Intel compiler (one that didn't have the AMD-gimping functionality)?

Outside my field of knowledge, so I dunno. Maybe someone with a better understanding of compilers and software optimization can chime in. Is there a reason not to use AMD's compiler over Intel's?

Pretty sure AMD's AOCC is just their own in-house compiler that they used while developing Zen. They obviously couldn't just contribute source code to mainline compilers.
AOCC has had all of its source code released, and is being merged into mainline compilers such as GCC, LLVM, etc.

>tl;dr
AMD doesn't have their own big compiler, they just give all of their optimizations to existing FOSS projects.
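Which is also why you don't strictly need AOCC for everyday builds: the Zen target is already in mainline GCC and Clang. Trivial example (standard -march flags, nothing AOCC-specific):

[code]
/* vec_add.c -- toy hot loop to show target-specific codegen.
 * With the Zen support AMD upstreamed, you can target it directly:
 *   gcc   -O3 -march=znver1 vec_add.c -c   # tune for Zen
 *   clang -O3 -march=znver1 vec_add.c -c
 *   gcc   -O3 -march=native vec_add.c -c   # tune for the host CPU */
#include <stddef.h>

void vec_add(float *restrict out, const float *restrict a,
             const float *restrict b, size_t n) {
    /* With -march=znver1 the compiler vectorizes this with the SIMD
     * width and instruction scheduling it thinks suits Zen best. */
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}
[/code]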

how does one "optimize for ringbus"?

Likely by keeping memory accesses and cross-thread traffic in patterns the interconnect handles cheaply: keep hot data thread-local and stop cache lines bouncing between cores. Something like the sketch below.
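To be fair it's not ring-specific, the same trick pays off on mesh and Infinity Fabric too. Untested sketch of the classic case, false sharing (64-byte cache lines assumed, which is standard on x86):

[code]
/* counters.c -- pad per-thread data to cache-line size so writes
 * from different cores never share a line and the line never has
 * to travel across the interconnect.
 * Build: cc -O2 -pthread counters.c */
#include <pthread.h>
#include <stdio.h>

struct padded_counter {
    _Alignas(64) volatile long value;  /* 64 = typical x86 line size */
};

static struct padded_counter counters[4];

static void *worker(void *arg) {
    struct padded_counter *c = arg;
    for (long i = 0; i < 100000000L; i++)
        c->value++;  /* stays in this core's L1, no cross-core traffic */
    return NULL;
}

int main(void) {
    pthread_t threads[4];
    for (int i = 0; i < 4; i++)
        pthread_create(&threads[i], NULL, worker, &counters[i]);
    for (int i = 0; i < 4; i++)
        pthread_join(threads[i], NULL);
    printf("each counter: %ld\n", counters[0].value);
    return 0;
}
[/code]

Drop the _Alignas and all four counters land in one or two cache lines, so every write forces the line to ping-pong between cores, and how badly that hurts depends on exactly the uncore everyone's arguing about.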

there's always a fishy reason to recommend intel.
you heard it here boys.
similarly priced weaker ((((((cpu)))))) is better.

which emulator is single core?

Well, considering even the 1080 Ti can't reach 144fps at 1080p, if they want to benefit from a 144Hz monitor, it'd better be a 1080p one.

>20 bucks
>167%
?

Just buy an intel next time and save yourself the butthurt.

Stop being anti-semite. 1080p is great. Don't run tests on 720p or 1440p. What are you? A nazi? All amd users are nazi indians. You need to do the needful and buy a 8400. I'm currently laughing at you sar. Intel has more hrtz per core than amd. Stay mad ayymd. destroyed and bankrupt.

>2500k from the costanza year
>GTX 1060 with 3GB cuz I'm poor and got it at a bargain price
>last week got another 8GB of RAM, again from the costanza year, off ebay, so now I have 16GB of 1600 DDR3 RAM
>because I'm an ultra boomer myself, perfectly content and still stuck at 1080p

We still gonna make it bros *sips* I run everything close to max but still have no problem lowering to just "high"

Attached: 1484054073639.png (791x821, 89K)

Welcome to modern game development, where you get the same graphics as 5 years ago but somehow you need better hardware now.

Attached: maxresdefault.jpg (1280x720, 135K)

>5 years ago
That shit looks like it's from early 2010 at best. I bet even a well-shaded Source engine game with a good texture pack looks better, and could be maxed with half the requirements this meme game needs just for the menu.

Looks more like something from 2005

Attached: steam-older-deus-ex[1].png (700x525, 441K)