Intel is coming out swinging at E3 2019. During the company’s ‘gaming performance for the real world’ event in LA, Intel challenged AMD and its upcoming Ryzen 3000 processors to “come beat us in real world gaming.”
AMD is set to talk about its Navi 5000-series GPUs and Ryzen 3000 CPUs in greater detail during the LA show – taking place on June 10 at 3:00pm PT if you want to tune in – but Intel scooped an early Sunday slot to get ahead of the red team and call out AMD both on tangible performance in common workloads, including gaming, and on what it believes to be unrepresentative synthetic benchmarks.
“So you’re going to hear a lot about gaming CPUs this week,” says Jon Carvill, VP of marketing. “They may or may not come from certain three-letter acronyms. That said, here’s what I want to challenge you: I want to challenge you to challenge them. If they want this crown, come beat us in real world gaming. Real world gaming should be the defining criteria that we use to assess the world’s best gaming CPU. I challenge you to challenge anyone that wants to compete for this crown to come meet us in real world gaming. That’s the measure that we’re going to stand by.”
One of Intel’s targets for unrepresentative benchmarks was Cinebench R15/R20. This is a popular benchmark among reviewers, but Intel holds that, since its own internal numbers show so few users actually run Cinema 4D, performance in this benchmark does not equate to real-world utility.
Intel also claimed that most users won’t benefit from the bandwidth offered to a graphics card by PCIe 4.0 – “not today and not in the immediate future” – with only the most demanding configurations, such as HDR 4K at 144Hz, necessitating that much bandwidth.
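For reference, the theoretical numbers behind the PCIe argument are easy to sanity-check. A minimal sketch, assuming the standard per-lane transfer rates and 128b/130b encoding used by PCIe 3.0 and 4.0 (these are theoretical one-direction maxima, not measured throughput):

```python
# PCIe 3.0: 8 GT/s per lane; PCIe 4.0: 16 GT/s per lane.
# Both use 128b/130b encoding, so usable bytes/s = GT/s * (128/130) / 8.

def pcie_bandwidth_gbps(gt_per_s: float, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return gt_per_s * (128 / 130) * lanes / 8

gen3_x16 = pcie_bandwidth_gbps(8.0, 16)   # ~15.8 GB/s
gen4_x16 = pcie_bandwidth_gbps(16.0, 16)  # ~31.5 GB/s
gen4_x8 = pcie_bandwidth_gbps(16.0, 8)    # ~15.8 GB/s: a Gen4 x8 link
                                          # matches a Gen3 x16 link

print(f"Gen3 x16: {gen3_x16:.1f} GB/s")
print(f"Gen4 x16: {gen4_x16:.1f} GB/s")
print(f"Gen4 x8:  {gen4_x8:.1f} GB/s")
```

The Gen4 x8 line is why the lane-splitting argument comes up: a GPU on half the lanes keeps today's effective bandwidth, freeing the rest for NVMe drives.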
[ ] PCIe 4.0 is unnecessary
They are omitting that the GPUs may only need half of the PCIe lanes, freeing some for, say, additional SSDs (in RAID?) or a CrossFire/SLI setup - at the cost of increased chipset power consumption.
Aaron Murphy
>unrepresentative synthetic benchmarks.
This much cope. It's just like when AMD performed worse in synthetic benches.
Joshua Russell
So they already know that the 16c/32t is coming and they have nothing to show for it.
This is like AyyyMD removing FPS counters to pretend their GPUs are as good as Nvidia's.
James Sullivan
[x] Benchmarks don't matter
[x] PCIe 4.0 doesn't matter
[ ] Only shit old games compiled with Intel's compiler that deliberately slows down AMD processors matter
Parker Reed
>Intel also claimed that most users won’t benefit from the bandwidth offered to a graphics card by PCIe 4.0 “not today and not in the immediate future”
Someone screenshot this and remind us when Intel releases a new socket series with PCIe 4.0 support.
Chase Rodriguez
This shit is even more hilarious when you take into account those benchmarks they posted the other day "proving" that Intel's new chips beat AMD's. But....
>optimizations on the Intel processors may not have been enabled for the AMD processors
>security fixes may have been disabled
>all kinds of other bullshit in a disclaimer that is like 3 paragraphs long
lmao. Intel is kinda known for this shit at this point, though. youtube.com/watch?v=osSMJRyxG0k
Hudson Sullivan
Intel is skipping right ahead to PCIe 5.
Aiden Lee
Shit, here's an even BETTER video from Adored talking about trumped-up benchmarks in particular: youtube.com/watch?v=3V8pEsjNa4Q
Jackson Cruz
Stay in a single thread with your consumerist fanboy circlejerk.
>One of Intel’s targets for unrepresentative benchmarks was Cinebench R15/20. This is a popular benchmarking software among reviewers, but Intel holds that, seeing as so few users actually use Cinema4D by its own internal numbers, performance in this benchmark does not equate to real-world utility.
This goes both ways. Yes, Cinema4D is a joke in terms of 3D-industry usage numbers (after all, it was a motion-graphics suite that got crowbarred into being a full 3D suite, and it shows), but at the same time Cinebench R20 was basically gimped with AVX512 to make Intel look good compared to AMD after AMD whipped their asses in Cinebench R15. So Intel cared a hell of a lot about Cinebench when it was in their favour, but now it doesn't count.
Cooper Kelly
>AMDoredTV
>sponsored by AMD and Patreon
Yikes
Liam Nelson
>giving a single about dronedTV
back to /ayymd with you
Jaxson Morris
back to tel aviv with you
Mason Reyes
>most users won’t benefit from the bandwidth offered to a graphics card by PCIe 4.0 “not today and not in the immediate future,”
I don't give a shit about the GPU making use of PCIe 4.0; I'm looking at those super-fast NVMe drives.
Jace Smith
Guys, has this ever happened before? I can't remember seeing Intel get crushed like this ever.
>lisa su goes on stage
>babbles about things
>oh our competitors said to come forth and beat them on real world testing
>I'm assuming they have all the mitigations enabled since it's real world
>so for the first time EVER I present you the competition killer
>FX 8350
Lucas Flores
architecture is more important than niggerrherts
Joseph Price
Better to be a faggot than a retard who actually takes adored seriously after how wrong he turned out to be over zen 2.
Angel Wright
That would be absolutely fucking savage. Then come back and rape them with ryzen 2.
AMD fans are laughing at Intel now but they will be the ones weeping when Intel strikes gold again
Ian Cooper
>pay companies to optimize their games for your CPU over AMD's
>use those games as benchmarks
Really makes you think!
Ian Thompson
3.6 kilowatt, not great not terrible.
Samuel Nelson
There are AMD-sponsored games too you know.
Ian Wood
>I started following CPUs at Bulldozer
Gavin Sanchez
>by the time intel brings their 10nm amd will be at 5nm (well its either zen2+ or zen 3 either way its going to be hilarious)
Michael Turner
i havent seen any amd cpu optimised games
Jacob White
Netburst was created to work around the enormous latencies of Rambus memory by having an absurdly long pipeline and an assload of clocks. Intel had a partnership with Rambus, hoping to ultimately control the design of everything and demand exorbitant licensing fees. Rambus was shit on the P3 (hence the creation of Netburst) and never came to AMD, and above all nobody wanted to pay Rambus for practically nothing. So everyone else settled around what became DDR1; it was cheap and it reamed Intel.
Rambus did have a gimmick in trying to save pins and traces.
Rambus was the most horrible fucking meme of all time. I'm glad they're dead
Adam Cruz
>high settings on a CPU benchmark
I'm glad to see people were just as retarded then as they are now.
Ian Gray
They've been one-upping each other for decades. AMD made superior 386 and 486 processors, broke the 1GHz mark first, and introduced 64-bit and multi-core desktop processors first. Intel was basically eating their shit all throughout the P4 era until they came out with the Core architecture. And even then, it has come to light that their ability to catch up to AMD with Core came only because they took dangerous shortcuts that everyone backing team blue is now paying for if they don't want to become some hacker's bitch.
Jonathan Diaz
>attempt consumer lock-in antitrust bullshit
>get buried
If only this outcome were more common.
Nathan Sullivan
So every game that is extremely GPU-bound is AMD-optimized?
>come meet us in real world gaming
Are they gonna shoot up the AMD E3 booth?
William Brown
remember the SCO lawsuit?
Cooper King
amd is sandbagging. just wait for reviews. 50% ipc increase and intel commits sudoku.
Jack James
>come compete in a competition on things we specifically picked to make sure we win
Good god, could they be any more cocky?
Tyler Nelson
>1v1 me
Robert Morris
so power consumption is a bigger problem than pcie 4.0?
Levi Diaz
It's true, here at Intel we have built the best gaming CPUs. In this example our CPU is beating AMD's offering by 6 FPS; for GAMERS that are SERIOUS about their GAMING this is a big deal. At the end of the day you get what you pay for.
Dylan Phillips
>Being too stupid to take pricing into consideration
Jason Russell
Better yet - a notebook hooked up to Google Stadia, practically yelling:
YOU DON'T NEED A BEEFY CPU ANYMORE FOR GAMING
I wonder if you can pay more for better frametimes.
James Gutierrez
They won't be when Zen2 beats them anyway.
Jayden Garcia
>real world gaming
>quake 3 at 640x480
Zachary Scott
Intel is running out of use cases where it's better at a frighteningly fast rate.
Logan Myers
This. Shareholders would lynch them if they were to lower their margins. Intel went as far as saying that they're willing to sacrifice market share to keep their margins.
Does anybody have the latest picture comparing German AMD/Intel sales handy? Despite AMD outselling Intel 2:1, they're almost equal in terms of revenue generated due to Intel's higher prices.
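A quick sanity check of that claim: equal revenue at a 2:1 unit split implies Intel's average selling price is roughly double AMD's. A trivial sketch with made-up unit and revenue figures, purely for illustration (not the actual German retail data):

```python
# Hypothetical numbers illustrating the 2:1-units / equal-revenue claim.
amd_units, intel_units = 2000, 1000   # assumed 2:1 unit split
revenue_each = 300_000.0              # assumed equal revenue per vendor (EUR)

amd_asp = revenue_each / amd_units    # average selling price per unit
intel_asp = revenue_each / intel_units

print(intel_asp / amd_asp)  # 2.0 -> Intel's average price is ~2x AMD's
```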
Hudson Price
they could only get away with it because they had faster chips, not anymore
Xavier Hernandez
>our tests was made from the trustworthy and 3rd party
>Principled Technologies
Asher Cook
SHUT UP GOY
Isaac Watson
240p gaymen goy
Eli Robinson
What do you need 80x24 text mode for?
Leo Phillips
There are no security flaws. The researchers are delusional; take them to the infirmary.
Gabriel Cook
>PCIe 4.0 don't matter lul
Except both next-gen consoles are using it, fuck you Intel.
& move to 7nm already for fuck's sake & drop Skylake
Ryan Campbell
Two processors enter, only one leaves.
Adrian Wood
As I said before, wait until intel has pcie 4 support and see what they say then.
Julian Nguyen
Many high graphics settings have a big impact on the CPU.
Wyatt Perez
Lmao, this. I don't give a shit about raw performance. Performance per buck is the only metric that matters, and Intel has been absolutely BTFO in that regard.
That's because settings scale differently between two different microarchitectures.
This was very obvious because a Ryzen won't even try to go full load on all threads when you hit low settings, which is why a comparison at the low end is moot.
Because Intel has a single-core advantage, it has the power to scale back and still perform at lower settings.
That's the real reason. I agree that for an Intel CPU nowadays, low settings will absolutely show computing power. But AMD's architecture is fundamentally different enough that comparing how it would scale at lower settings would be stupid.
Brandon Foster
Buy Intel goy
Jason Martinez
Yeah but intards are pretty much like applefags
Daniel Thompson
Intel literally can't compete with the 12-core and 16-core parts, so they went full damage control with "only gaming performance matters".
Jose Jones
This. Finally people see the bullshit here. Idiots think that AMD scales the same way Intel does.
Fundamentally, Intel has always relied on one tactic: BRUTE FORCE. It was true in the Prescott days, and it sure as hell is true today. The only time it wasn't was when they got suckerpunched by the Athlon 64 into making the Core microarchitecture.