Intel's 9th Gen Core Gaming Benchmarks Flawed and Misleading

>Intel paid Principled Technologies, a third-party performance testing agency, to obtain performance numbers comparing the Core i9-9900K with the Ryzen 7 2700X across a spectrum of gaming benchmarks, instead of testing the two chips internally, and posted their test setup data in end-notes, as if to add a layer of credibility/deniability to their charade. The agency posted its numbers, which were almost simultaneously re-posted by PCGamesN under the headline "Up to 50% Faster than Ryzen at Gaming." You could fertilize the Sahara with this data.

>You could fertilize the Sahara with this data.

OH NO NO NO NO NO NO WHAT HAPPENED INTELBROS!!!

techpowerup.com/248355/intels-9th-gen-core-gaming-benchmarks-flawed-and-misleading

Attached: [email protected] (839x471, 69K)

Other urls found in this thread:

patreon.com/posts/21950120
youtube.com/watch?v=D1mJMI_uaa8
en.wikipedia.org/wiki/Ryzen#Desktop
youtube.com/watch?v=0oci_YiKbhY

>looks at unpaid result
Are we ignoring the fact that AMD CPUs are not doing that great in the game that was invented strictly for benchmarking AMD products?

you cant be that stupid

>you cant be that stupid
Show the Ashes of the Singularity benchmark.

Attached: aots.png (1278x718, 285K)

But the 2700X is not a cheap CPU, why would you spend a lot of money on an inferior product?

It's not like Z370 motherboards are that expensive, they're in the same price league as AMD ones now

The benchmarks carried out by Principled Technologies are even more bogus than we first thought. A few viewers pointed out that the Ryzen 7 2700X was listed as tested in the “Game Mode” within the Ryzen Master software and I foolishly thought they might have just made a simple copy and paste error in their document as they would have used this mode for the 2950X. This does explain why the Threadripper CPUs were faster than the 2700X in every test.

What this means is a CCX module in the 2700X was completely disabled, essentially turning it into a quad-core. I’ve gone ahead and re-run the XMP 2933 test with Game Mode enabled and now I’m getting results that are within the margin of error of those published by Principled Technologies.

patreon.com/posts/21950120
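
The "within the margin of error" claim above is easy to formalize (a minimal sketch; the 3% tolerance and the FPS numbers are assumptions, not figures from the post):

```python
# Check whether two benchmark results agree within a relative tolerance.
def within_margin(result_a, result_b, tolerance=0.03):
    """True if the results differ by no more than `tolerance`, relative."""
    return abs(result_a - result_b) / max(result_a, result_b) <= tolerance

# Hypothetical average-FPS numbers: a Game Mode re-run vs. the
# figure Principled Technologies published.
print(within_margin(98.0, 100.0))   # 2% apart: within margin
print(within_margin(68.0, 100.0))   # 32% apart: not even close
```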

kek a 4core vs a 8core

So what are the benchmarks with all 8 cores like?

add 5 more fps to 8700k aka 9 more fps than 2700x
tremendous 50% difference

Yeah, but the Intel equivalent (8 cores 16 threads) costs over double the 2700X price. And comes without a serviceable cooler.
Strictly speaking, the 2700X is far more value for the dollar than the 8700k, it's not even close.
You're free to do whatever you want and get the best if you want to pay the Intel tax for it. No one's telling you not to. It's just a terrible way to burn money, when the difference, realistically, isn't larger than 10-15% in most workloads.
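
Back-of-the-envelope, the value argument looks like this (a sketch with assumed numbers: a ~12% gaming lead for Intel and illustrative street prices, not measured figures):

```python
# Rough perf-per-dollar comparison. The prices and the ~12% gaming
# lead are illustrative assumptions, not benchmark results.
def perf_per_dollar(relative_perf, price):
    """Relative performance divided by price; higher means better value."""
    return relative_perf / price

ryzen_value = perf_per_dollar(1.00, 329)  # 2700X as the 1.00 baseline
intel_value = perf_per_dollar(1.12, 450)  # hypothetical 8700K street price

print(f"2700X: {ryzen_value:.5f} perf/$")
print(f"8700K: {intel_value:.5f} perf/$")
print(f"Ryzen value advantage: {ryzen_value / intel_value:.2f}x")
```

Even granting Intel the full gaming lead, the cheaper chip comes out ahead per dollar; that's the whole argument in two divisions.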

STOOOOOOOOOOOOOOOOOOPPPPPPP
STOP KVETCHING AND BUY INTEL
THE MORE YOU BUY THE MORE YOU SAVE

Attached: 1436734391281.jpg (202x249, 30K)

Do you consider the equivalent to be the same core count, or the same performance?

If the 6-core 8700K beats the 8-core 2700X, why would you buy the 2700X?

[[[[[[[[[[[[[[[[[[[[[Paid Result]]]]]]]]]]]]]]]]]]]]]]

THANK YOU BASED INTEL FOR GIVING ME THE HONOR OF PREORDERING THE I9 9990K BEAST!!!!


Amd btfo

Attached: Screenshot_20181008-205226_Chrome.jpg (1080x2220, 343K)

>Do you consider the equivalent to be the same core count, or the same performance?
>If the 6-core 8700K beats the 8-core 2700X, why would you buy the 2700X?
Same core count, obviously. That's how competition works.
The 8700K only marginally beats the 2700X in games and a few other specific applications. In most threaded workloads, the 2700X is far ahead because it obviously has more cores.

>But the 2700X is not a cheap CPU

Neither is the 8700K; in the UK the 8700K is £170 more than the 2700X. This isn't even taking into consideration the fact that the 2700X is better at productivity shit, keeps up in gaming, is cheaper, comes with a cooler, and avoids Intel's tricks and security flaws.

what did they do?
use 2133 ram and blow hot air in?

They didn't use an XMP profile on the 2700X, only the default RAM values, while using XMP and tuned timings on the Intel CPUs.
Oh, and they enabled Game Mode in Ryzen Master, a feature meant for Threadripper, not the normal Ryzens. That feature disables one of the CCXs to improve latency on Threadrippers. Except it disabled 4 cores on the 2700X, effectively making it a 4-core CPU competing with an 8-core.
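
If you suspect your own chip got nerfed the same way, the simplest check is asking the OS how many logical CPUs it sees. A minimal Python sketch using only the standard library; the expected count of 16 assumes a stock 2700X (8c/16t):

```python
# Sanity check: how many logical CPUs does the OS actually see?
# On a stock 2700X this should be 16 (8 cores / 16 threads); with
# Game Mode mistakenly enabled it would be 8 (4 cores / 8 threads).
import os

logical_cpus = os.cpu_count()
print(f"Logical CPUs visible to the OS: {logical_cpus}")

if logical_cpus is not None and logical_cpus < 16:
    print("Fewer than 16 threads - check whether SMT or a CCX got disabled.")
```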

Disabled one CCX basically turning the 2700X into a 4c/8t CPU

Why the fuck does Ryzen Master allow disabling CCXes on processors where this brings no benefits whatsoever?

It lets you disable up to 7 of the 8 cores if you want to. Gives you basically full control over the CPU. That's not necessarily a bad thing: if you want to break OC records, you want to do it on fewer cores, because it'll be more stable, for example. It could lead to better performance in specific games or workloads too, if you wanted. Too much work for the average person to ever bother with, though.

...

The problem isn't disabling cores per se, but having a nonsensical "Game Mode" preset that got carried over from another line of CPUs. If I were a layman who just bought a new CPU, and the manufacturer's tweaking software had a "Game Mode", I'd surely enable that for gaming, right?

You're not wrong, but at the same time, Ryzen Master is software you have to go out of your way to download, and it assumes you know what you're doing. Not only that, it has a disclaimer and agreement at startup that using it voids your warranty (though I don't think AMD can prove you did; it's there in case people burn their CPUs, I imagine).
I have no clue what the interface looks like or whether there's a tooltip indicating that it disables cores like that, because I don't use Ryzen Master day to day. But I'd imagine there is, since it was pretty detailed in its explanations when I tried it.

youtube.com/watch?v=D1mJMI_uaa8

Attached: gamers-nexus-intel.jpg (1206x747, 200K)

It has a row of big-ass buttons on the bottom.

The idea of "Game Mode" on TR isn't just to halve the core count, it's mainly to avoid using CCXes that don't have direct access to the memory bus to improve average memory latency. But on Ryzen, all CCXes have access to the memory bus, so the "Game Mode" disabling half the cores makes no sense (aside from maybe providing slightly higher turbo frequency on the remaining cores, which may help with the most poorly threaded game engines like SC2)

Attached: game-mode-on.jpg (2880x1575, 301K)

>The absolute state of poozen
4th refresh of 14nm still 10% faster than Zen

Yea I'd try "game mode", but I would also compare it with other modes, or settings, and use the best one.
For the most part though I think presets like these are always shit, so I only use them when curious.

>be intel
>your glue eating neighbor AMD is going on about this corelet design and "no seriously, itll be better this time!"
>get complacent, incremental upgrade from 4th to 6th gen because lazy
>Zen drops
>its not shit
>Intel: oh shit im caught with my pants down
>drop 7th gen, its ok but not priced competitively
>Zen and the glue eaters are chipping into your market share
>drop 8th gen, its also ok, but not priced competitively because youre a fat lazy complacent ass
>Zen+ comes out and has relegated intel's entire middle market noncompetitive
>Yield problems, rumors Intel may even go back to 22nm for certain contracts
>Spectre/meltdown cucking older chips
>Cant get 10nm to work and AMD is already moving down to 7nm
>Cuck your 9th gen chips by disabling hyperthreading on every single chip that isn't the 9900k
>hey guys look at this box its shiny
>Price your 8c/8t above the ryzen 8c/16t
>Price your 8c/16t at above $500
>the 2700x is 10-20% less performance for $250 less
>Zen 2 is coming out next spring, assuming past prices and incremental perf gain of 10% the 3700x chip will be margin of error performance, $100-250 less, with 2x the threads of the 9900k
>See writing on wall, pay some nobodies to detime and cuck a 2700x so you can claim 30% superiority
>Tfw your $500 flagship is only 25% better in gaming against a detimed and half disabled 2700x

Next year is going to be very interesting for processors.

Ryzen is still on 14nm too, retard

zen+ is technically 12nm.

12nm is basically 14nm with a little spice. Not much different from Intel's 14+++++.

Ryzen has made CPUs interesting again. The jews have to compete now, again, and we finally see the benefits.
Celeron-Pentium-i3-i5 lines are completely rekt by Zen+. The only CPU that might be worth something among these is the 8350k and only for emulation purposes.

12 nm low power with features bigger than on Intel's 14nm+++++

Using the same technology AMD would probably perform better than intel on every aspect.

TSMC's 7nm is slightly better than Intel's version desu

Low-end Celerons/Pentiums aren't rekt by Zen, AMD has nothing in the sub-50W category except a couple 28nm relics.
I wonder if AMD has something low-power in the works to make on 12nm after the high end goes to 7nm.

TSMC's 7nm is better than Intel's 10nm, which will come a year later, by which point TSMC will be starting production of their first 5nm chips. All of this because Intel fucked up their initial goal and had to increase the size of their transistor features.

>CCXes that don't have direct access to the memory
There's no such thing on the 16-core Threadrippers, which are the ones Game Mode was originally made for.

Attached: threadripper_architecture-final_16.jpg (1918x1080, 148K)

>get complacent, incremental upgrade from 4th to 6th gen because lazy
Why would a billion dollar company spend billions on R&D if they do not have any credible competition?

It took AMD 2 major failures and a decade to come up with Ryzen and their advantage comes mostly from a lithographic advantage provided by TSMC.

... yes? Each of the active dice has two memory controllers, one for each of its two CCXs.

>and their advantage comes mostly from a lithographic advantage provided by TSMC.
but Ryzen is fabbed by GloFo using a process licensed from samsung

Attached: 73590ED1-20DB-4A1A-B894-FC35AE335520.jpg (500x425, 58K)

>lithographic advantage provided by TSMC.

u wot m8? current Ryzens are produced by GloFo and 12nm is not all that different from Intel's 14nm+++++++++++++.

>AMD has nothing in the sub-50W category
What are the 2200G, 2400G and 200GE then?

2200G/2400G are 65W and 200GE isn't sold yet.

Even 200GE isn't exactly competition for stuff like J5005.

The 2500U and 2700U exist, you know. They're both 15W rated parts, the 2700U even has SMT enabled, which none of Intel's offerings at that range have, as far as I know.

you have a slight misunderstanding of Infinity Fabric, my guy. Ryzen has two CCXs on a single chiplet, and Threadripper doubles up on that... but what Game Mode does is prevent cores 0-3 from talking to cores 4-7, and so on.

effectively you halve your multithreaded performance for some (~30%) reduced latency. Many sites have done testing of it; it only really helps if you have a severely single-threaded game and want to try to net another 100MHz or so out of turbo.
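
The trade-off he's describing can be modeled with Amdahl's law (a sketch; the parallel fractions and the 5% per-core gain from turbo/latency are assumptions):

```python
# Model Game Mode's trade: half the cores in exchange for a small
# per-core gain from reduced latency and higher turbo clocks.
def amdahl_speedup(parallel_fraction, cores):
    """Speedup over one core for a workload with the given parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

PER_CORE_GAIN = 1.05  # assumed ~5% per-core benefit in Game Mode

# Heavily threaded workload (95% parallel): losing cores hurts badly.
print(amdahl_speedup(0.95, 8), amdahl_speedup(0.95, 4) * PER_CORE_GAIN)

# Severely single-threaded game (20% parallel): the extra cores were
# barely helping, so the small per-core gain can come out ahead.
print(amdahl_speedup(0.20, 8), amdahl_speedup(0.20, 4) * PER_CORE_GAIN)
```

Which is exactly why Game Mode only ever pays off in the worst-threaded games.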

the 2200GE and 2400GE are 35W, 4/4 and 4/8 cores/threads respectively, and have an iGPU
en.wikipedia.org/wiki/Ryzen#Desktop

2500U/2700U are fairly expensive and intended for laptops; until AMD starts selling them for cheap to be put into shit-tier all-in-ones and NUC-type boxes, they're not competition for Pentium Silver and Celeron.

I was excited for those, but apparently they only exist on paper.

AMD does this thing of releasing low end parts only for OEMs, I guess if you can afford 1000 of those they'll sell them to you.

>2200G/2400G are 65W
They are configurable between 45-65 W.

The J5005 you mentioned has a $160 MSRP. Both the 2500U and 2700U have been featured in quite affordable laptops already, what they actually needed was more high end laptops with proper dual channel memory with decent speeds to take advantage of the CPU and integrated graphics properly. They're great CPUs for the price, and certainly not behind Intel's offerings by any means.

That hardly matters when we're talking about an alternative to Atom shit.

Well I can't find a single prebuilt PC using them either, even though small boxes that can run non-demanding games like WoT would probably sell well.

What are you even talking about?
>but what game mode does is prevent core0-3 from talking to core 4-7 and so on,
What does that even mean? If the cores just "couldn't talk to each other", you'd have effectively split your system into two separate, independent machines, and that's not what happens. Game mode disables half the cores (those on one of the dice, in the case of TR). If you doubt this, read .

>The J5005 you mentioned has a $160 MSRP.
The price listed on Intel's site for those is utter bullshit - not only is it the same $161 for the entire Pentium J line (and $107 for the entire Celeron J line), you can buy a whole NUC based on the J5005 at retail for that price.

>Both the 2500U and 2700U have been featured in quite affordable laptops already
N-series, the laptop counterpart of the J-series, is used in the absolute cheapest Windows laptops you can buy. They're VERY cheap when bought in volume, so they're in a different market segment than the 2500U/2700U.

>what does that even mean
if CCX1 is cores 0 through 3 and CCX2 is cores 4 through 7, Game Mode effectively halves your core count, and this is reflected in all multithreaded benchmarks that can fully utilize the system; your fucking link even shows that.
pic related, also vid related youtube.com/watch?v=0oci_YiKbhY

Attached: 90353.png (650x200, 18K)

also, to further add to this: I admit I could have worded my original post better. Instead of saying it prevents the CCXs from talking to each other, I should have said it disables the secondary CCX cluster. It just didn't seem right, but it's just arguing semantics.

regardless, those paid benchmarks clearly handicap the AMD system on multiple fronts
>purposely reducing the memory frequency
>running the stock cooler only on the AMD system, while giving the Intel systems a quite good Noctua cooler
the list goes on and on, but ultimately it's just scummy bullshit, and we're in agreement on that.

Even with the "unpaid" results, Intel beats AMD, so what's your point?

>also to further add to this, I admit I could have worded my original post better instead of saying preventing the ccx from talking to eachother I should have said it disabled the secondary ccx cluster but it just didn't seem right but its just arguing semantics.
But that's just exactly what I said, then, and there's still no such thing as "CCXes that don't have direct access to the memory".

That's exactly what reviewers had in mind.

On Threadripper that's the case; from what I've tested on my 2950X, each chiplet only gets 2 memory channels.

>each chiplet only gets 2 memory channels.
Yes? That still doesn't mean that there is any CCX that doesn't have direct access to memory, unlike on the 2990WX. Also, turning on game mode doesn't disable the memory controllers on the second die, only the cores.

I'm honestly kinda confused about why you keep bringing up memory. I talked about LATENCY before, referring to the Infinity Fabric interconnect latency penalty from jumping from CCX1 to CCX2 and so forth.

>Ashes is an amd benchmark
Ashes was a Radeon thing with Mantle, DX12 and multi-GPU support. I don't think there was ever anything about Ryzen sponsoring the game.

I don't even know what you're trying to say anymore. What I did was reply to which claimed that not all TR CCXs have direct access to memory, saying that wasn't the case.

This.

Ryzen wasn't even a thing. Radeons were benchmarked with Intel CPUs.

Not that user, but Game Mode was released for the 1950X generation, where both dies have direct memory access. But the performance impact was still there if a thread was jumping across the two dies. Same with jumping between CCXs on the same die, but it's not as severe.
I've been testing Game Mode on my 1700 and it disables half the cores but pins the rest at 3.7GHz, which is a decent uptick from stock and shows in some games' performance; I was previously setting core affinity manually for games.
The new setting coming for the 2990WX is a dynamic setting in software that should identify programs that suffer from core hopping and give them better core assignments. If it works for games as well, it should also help the single-die parts, unless they use more than 8 threads or something.

At 3600 SEK for the 2700X vs 4800 SEK for the 8700k it's not much of a contest which of those give you the most for your money. Though none of them are worth it, the 1800X is 2800 SEK and the 1700X 2300 SEK. If you're looking for a 8-core and you're in Europe then the 1700X is a pretty obvious choice (the 1700 is just 100 SEK cheaper).

Most price/performance comparisons don't reflect the 50% price-hike on Intel CPUs that we've seen the last month. This rather huge price-change due to Intel incompetence and manufacturing problems has made Intel a non-choice.

>in the uk the 8700k is £170 more than the 2700x
yeah, it's the same all over Europe. I have a real fear Intel won't get their shit together for so long (it'd only take months) that AMD will run into supply shortages which would lead to a similar price-hike on their products.

AOTS was made for AMD. It was meant to leverage as many slower CPU cores as possible along with Mantle/Vulkan/DX12.
The game was tailor-made to benchmark AMD's hardware, both CPU and GPU, just to tip the gaming averages slightly in AMD's favor.

>But the performance impact was still there if a thread was jumping across the two dies.
True though that may be, game mode wouldn't have helped with that since it only turned off the cores on one die, not the memory channels. So to access the memory attached to the other die, the enabled cores would still have that hop.

The Ryzen V1000 embedded parts are what's targeting those embedded Atom-like markets. I suspect that's what the A300 AM4 motherboards were for initially as well. There are some decent designs with the V1000 floating around, but nothing for the budget consumer market. I've only spoken with some people about their enterprise stuff so far, but that comes in at £300-400 for ITX parts.

Would that be much of a concern? If the CPU was set to the correct NUMA mode, then the core would pull its own data into its local memory pool and not place it in the distant one. TR launched with NUMA mode options before Game Mode was released. I'm not saying it's impossible, but it sounds like a problem that was solved when NUMA was devised.

It was a game that tried to use modern APIs to improve performance. I'm sure RTG sponsored them to use Mantle, and then DX12 happened so they migrated to that. Those APIs were designed to multithread the CPU-side graphics rendering processes, because a single core was a real bottleneck for that. It was released before Ryzen was a thing, and RTG was certainly not on good terms with the CPU division back then. It's entirely incidental that the CPUs benefit, because it was targeted at graphics rendering; it's just that Radeon cards took advantage of technologies such as async more than Nvidia at the time.

>tfw got 2600X for 2149 SEK

>Would that be much of a concern?
Not saying it would be, just saying that it doesn't help with die-hopping when it happens. And if you use more memory than is available on the local channels, it does happen.

There's of course also the issue that Windows doesn't seem to handle NUMA allocation well at all (as evidenced by the 2990WX benchmarks), but that's not the hardware's fault, and only a problem for Windows users.

That doesn't sound very impressive. Isn't that just about what it normally goes for?

The game was made to
this was not a secret.
>RTG was certainly not on good terms with the cpu division
They were on better terms with each other than they were with anyone else you fucking dipshit.

You're right. I got it on sale, but it looks like the price is now that low everywhere.

Attached: DpGl5GsWwAAR8-z.jpg:large.jpg (909x600, 142K)

>2018
>Even considering Intel (Israeli Nepotism Electronics of Tel Aviv) over AMD (AMERICAN Micro Devices)
Shame on you Jow Forums