Intel is coming out swinging over at E3 2019...

pcgamesn.com/intel/worlds-best-gaming-processor-challenge-amd-ryzen-3000

Intel is coming out swinging over at E3 2019. During the company’s ‘gaming performance for the real world’ event in LA, Intel challenged AMD and its upcoming Ryzen 3000 processors to “come beat us in real world gaming.”

AMD is set to talk about its 5000-series Navi GPUs and Ryzen 3000 CPUs in greater detail during the LA show – taking place on June 10 at 3:00pm PT if you want to tune in – but Intel scooped an early Sunday slot to get ahead of the red team and call out AMD both on tangible performance in common workloads, including gaming, and on what it believes to be unrepresentative synthetic benchmarks.

“So you’re going to hear a lot about gaming CPUs this week,” says Jon Carvill, VP of marketing. “They may or may not come from certain three letter acronyms. That said, here’s what I want to challenge you: I want to challenge you to challenge them. If they want this crown, come beat us in real world gaming. Real world gaming should be the defining criteria that we use to assess the world’s best gaming CPU. I challenge you to challenge anyone that wants to compete for this crown to come meet us in real world gaming. That’s the measure that we’re going to stand by.”

One of Intel’s targets for unrepresentative benchmarks was Cinebench R15/R20. This is a popular benchmark among reviewers, but Intel holds that, since so few users actually run Cinema4D by its own internal numbers, performance in this benchmark does not equate to real-world utility.

Intel also claimed that most users won’t benefit from the bandwidth offered to a graphics card by PCIe 4.0 “not today and not in the immediate future,” with only the highest-fidelity setups – HDR 4K at 144Hz – necessitating that much bandwidth.

Attached: 1539279093137.png (552x661, 288K)

[ ] PCIe 4.0 is unnecessary
they're omitting that PCIe 4.0 GPUs may only need half the lanes for the same bandwidth, freeing the rest for, say, additional SSDs (in RAID?) or a CrossFire/SLI setup - at the cost of increased chipset power consumption.
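
Quick napkin math (my own numbers, not from the article), assuming the usual 128b/130b encoding: per-lane throughput roughly doubles from PCIe 3.0 to 4.0, so a 4.0 x8 link has about the same ceiling as a 3.0 x16 link.

# Back-of-envelope PCIe bandwidth per direction, after 128b/130b encoding.
# Illustrative link limits only; real-world throughput lands a bit lower.
GT_PER_S = {"3.0": 8.0, "4.0": 16.0}  # giga-transfers per second, per lane

def link_gb_per_s(gen, lanes):
    """Approximate usable GB/s for a PCIe link of the given generation/width."""
    return GT_PER_S[gen] * (128 / 130) / 8 * lanes  # bits -> bytes

print(f"PCIe 3.0 x16: {link_gb_per_s('3.0', 16):.1f} GB/s")  # ~15.8 GB/s
print(f"PCIe 4.0 x8:  {link_gb_per_s('4.0', 8):.1f} GB/s")   # ~15.8 GB/s, same pipe on half the lanes
print(f"PCIe 4.0 x16: {link_gb_per_s('4.0', 16):.1f} GB/s")  # ~31.5 GB/s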

>unrepresentative synthetic benchmarks.

This much cope. It's just like when AMD performed worse in synthetic benches.

So they already know that the 16c/32t is coming and they have nothing to show for it.

This is like AyyyMD removing FPS counters to pretend their GPUs are as good as Nvidia's.

[x] Benchmarks don't matter
[x] PCIe 4.0 doesn't matter
[ ] only shit old games compiled with Intel's compiler that deliberately slows down AMD processors matter

>Intel also claimed that most users won’t benefit from the bandwidth offered to a graphics card by PCIe 4.0 “not today and not in the immediate future,” with only the highest-fidelity setups – HDR 4K at 144Hz – necessitating that much bandwidth.
Someone screenshot this and remind us of it when Intel releases a new socket series with PCIe 4.0 support.

This shit is even more hilarious when you take into account those benchmarks they posted the other day "proving" that Intel's new chips beat AMD's. But....
>optimizations on the Intel processors may not have been enabled for the AMD processors
>security fixes may have been disabled
>all kinds of other bullshit in a disclaimer that is like 3 paragraphs long
lmao. Intel is kinda known for this shit at this point, though.
youtube.com/watch?v=osSMJRyxG0k

Intel is skipping right ahead to PCIe 5.0

Shit, here's an even BETTER video from Adored talking about trumped-up benchmarks in particular:
youtube.com/watch?v=3V8pEsjNa4Q

Stay in a single thread with your consumerist fanboy circlejerk.

>FIGHT ME IRL
AHAHAHAHAHAHAHHAHA

Attached: 1559133586741.png (443x512, 70K)

When memes become reality
Clown world

Attached: 1536694146733.png (1200x800, 164K)

>One of Intel’s targets for unrepresentative benchmarks was Cinebench R15/R20. This is a popular benchmark among reviewers, but Intel holds that, since so few users actually run Cinema4D by its own internal numbers, performance in this benchmark does not equate to real-world utility.

This goes both ways. Yes, Cinema4D is a joke in terms of 3D industry usage numbers (after all, it was a motion graphics suite that got crowbarred into being a full 3D suite, and it shows), but at the same time Cinebench R20 was basically gimped with AVX512 to make Intel look good compared to AMD after AMD whipped their asses in Cinebench R15. So Intel cared a hell of a lot about Cinebench when it was in their favour, but now it doesn't count.

>AMDoredTV
>sponsored by AMD and Patreon

Yikes

>giving a single shit about dronedTV

back to /ayymd with you

back to tel aviv with you

>most users won’t benefit from the bandwidth offered to a graphics card by PCIe 4.0 “not today and not in the immediate future,”

I don't give a shit about the GPU making use of the PCIe 4.0
I'm looking at those super fast NVMe drives.
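
Same napkin math for a x4 NVMe drive (again my own illustrative numbers): the interface ceiling roughly doubles going from 3.0 to 4.0, and real drives land somewhere below it.

# Interface ceiling for a x4 NVMe SSD, per direction, after 128b/130b encoding.
# These are link limits, not drive speeds; actual drives top out below them.
def x4_ceiling_gb_per_s(gt_per_s):
    return gt_per_s * (128 / 130) / 8 * 4

print(f"PCIe 3.0 x4: {x4_ceiling_gb_per_s(8.0):.1f} GB/s")   # ~3.9 GB/s
print(f"PCIe 4.0 x4: {x4_ceiling_gb_per_s(16.0):.1f} GB/s")  # ~7.9 GB/s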

Guys, has this ever happened before? I can't remember seeing Intel get crushed like this ever.

Cinebench doesn't matter if AMD is winning

t. Intel

History repeats itself

Attached: 1558890923536.png (450x337, 36K)

wtf how is a 2ghz chip beating intel's 3.4ghz chip

Intel will always lie and cheat.

Attached: dont-trust-intel.png (1168x414, 215K)

AMD had double the IPC
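
Which is the whole trick: throughput is roughly IPC × clock, so a wider core at 2GHz can beat a narrow one at 3.4GHz. A toy comparison with made-up IPC figures, not measurements of any real chip:

# Relative throughput ~ IPC * clock (GHz). The IPC figures below are
# made-up illustrations, not measurements.
def relative_perf(ipc, ghz):
    return ipc * ghz

wide_core = relative_perf(ipc=2.0, ghz=2.0)    # hypothetical wide, short-pipeline core
narrow_core = relative_perf(ipc=1.0, ghz=3.4)  # hypothetical narrow, long-pipeline core
print(wide_core, narrow_core)  # 4.0 vs 3.4 -> the "slower" 2GHz chip wins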

damn this is making me nostalgic

yeah the original Athlon, Athlon XP, and Athlon 64
Intel was in deep shit until the Core 2 Duo

Does that mean Intel is going to be better in a couple years and amd will be a joke again?

LOWER THE PRICE

I don't care what your new marketing bullshit mantra is, I'll buy according to performance/price.
Can't beat AMD now? Lower the price.

That's the point, they can't.
Not with how hard they're binning these 14nm CPUs, much less with how bad the yields for those bins are.

>f-fight me IRL bitch
c'mon intlel

Netburst was fucking garbage.

>Adored
Fuck off

I fucking love this timeline

I don't give a shit about either company, but seeing Intel getting wrecked and actual competition in the CPU market again is fantastic.

Attached: 1558853140201.png (800x618, 937K)

You sound like a faggot.

It really depends on the future of Intel's process schedule

inb4 recorded gameplay

Remember this because it seems to be memory holed.

youtube.com/watch?v=Otcge1cn8Os

Attached: kikeripper.jpg (638x710, 96K)

Jim, give it up. You're a joke.

GOY I NEED MARGINS

COPE more by calling others retard and to kys on twitter

delete this anti-semetic thread right now

Here's your 9900ks power delivery bro.

Attached: Embedded-Power.jpg (400x300, 14K)

>Lisa Su goes on stage
>babbles about things
>oh, our competitors said to come forth and beat them in real world testing
>I'm assuming they have all the mitigations enabled since it's real world
>so for the first time EVER I present to you the competition killer
>FX 8350

architecture is more important than raw gigahertz

Better to be a faggot than a retard who actually takes adored seriously after how wrong he turned out to be over zen 2.

That would be absolutely fucking savage. Then come back and rape them with ryzen 2.

Attached: 1054894-pcie-4_0-ssd-1618x907[1].jpg (1618x907, 349K)

Yes

Only retards think otherwise

AMD fans are laughing at Intel now but they will be the ones weeping when Intel strikes gold again

>pay companies to optimize their games for your CPU over AMDs
>use those games as benchmarks

Really makes you think!

3.6 kilowatt, not great not terrible.

There are AMD-sponsored games too you know.

>I started following CPUs at Bulldozer

>by the time Intel brings out their 10nm, AMD will be at 5nm
(well, it's either Zen 2+ or Zen 3; either way it's going to be hilarious)

i havent seen any amd cpu optimised games

Netburst was created to work around the enormous latencies of Rambus memory by having an absurdly long pipeline and an assload of clocks. Intel had a partnership with Rambus, hoping to ultimately control memory designs and demand exorbitant licensing fees.
Rambus was shit on the Pentium III (hence the creation of Netburst), never came to AMD platforms, and above all nobody wanted to pay Rambus for practically nothing. So everyone else settled around what became DDR1; it was cheap and it reamed Intel.

Rambus did have a gimmick in trying to save pins and traces.

Attached: D5UyLSzVUAEXePR.jpg:orig.jpg (714x1000, 80K)

Rambus was the most horrible fucking meme of all time. I'm glad they're dead

>high settings on a CPU benchmark
I'm glad to see people were just as retarded then as they are now.

They've been one-upping each other for decades. AMD made superior 386 and 486 processors, broke the 1GHz mark and introduced 64bit and multi-core desktop processors first. Intel was basically eating their shit all throughout the P4 era until they came out with the Core architecture. And even then it has come to light that their ability to catch up to AMD with Core was only because they took dangerous shortcuts that everyone backing team blue is now paying for if they don't want to become some hacker's bitch.

>attempt consumer lock-in antitrust bullshit
>get buried
If only this outcome were more common.

So every game that is extremely GPU-bound is AMD-optimized?

Attached: 4L_yf4NVyjO.png (1314x1192, 53K)

>come meet us in real world gaming
Are they gonna shoot up the AMD E3 booth?

remember the SCO lawsuit?

amd is sandbagging.
just wait for reviews.
50% ipc increase and intel commits sudoku.

>come compete in a competition on things we specifically picked to make sure we win
good god could they be any more cocky?

>1v1 me

so power consumption is a bigger problem than pcie 4.0?

It's true, here at Intel we have built the best gaming CPUs. In this example our CPU is beating AMD's offering by 6FPS, for GAMERS that are SERIOUS about their GAMING this is a big deal. At the end of the day you get what you pay for.

>Being too stupid to take pricing into consideration

better yet - a notebook hooked up with google Stadia, practically yelling:

YOU DON'T NEED A BEEFY CPU ANYMORE FOR GAMING

I wonder if you can pay more for better frametimes.

They won't be when Zen2 beats them anyway.

>real world gaming
>quake 3 at 640x480

Intel is running out of use cases where it's better at a frighteningly fast rate.

This
Shareholders would lynch them if they were to lower their margins. Intel went as far as saying that they're willing to sacrifice market share to keep their margins.

Does anybody have the latest picture comparing German AMD/Intel sales handy? Despite AMD outselling Intel 2:1, they're almost equal in terms of revenue generated due to Intel's higher prices.
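
The arithmetic behind that is simple enough; with made-up unit counts and prices (not the actual retailer figures), a 2:1 unit split at roughly half the average selling price comes out to about the same revenue:

# Made-up illustration of how a 2:1 unit split can still mean ~equal revenue
# when the other side's average selling price is roughly twice as high.
amd_units, amd_asp = 2000, 200        # hypothetical units sold and ASP in EUR
intel_units, intel_asp = 1000, 390

print("AMD revenue:  ", amd_units * amd_asp)      # 400000
print("Intel revenue:", intel_units * intel_asp)  # 390000 -> roughly even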

they could only get away with it because they had faster chips, not anymore

>our tests were made by a trustworthy 3rd party

>Principled Technologies

SHUT UP GOY

240p gaymen goy

What do you need 80x24 text mode for?

There are no security flaws
The researchers are delusional, take them to the infirmary

>PCIE 4.0 don't matter lul

Except both next-gen consoles are using it, fuck you Intel

& move to 7nm already for fuck sake & drop skylake

2 processors enter, only 1 leaves.

As I said before, wait until intel has pcie 4 support and see what they say then.

Many high graphics settings have a big impact on CPU.

Lmao, this. I don't give a shit about raw performance. Performance per buck is the only metric that matters, and Intel has been absolutely BTFO in that regard.

OY GEVALT, DELETE THIS!!

>J-JUST WA-WAIT!!
AHAHAHAHAHAHAH!!!!!

Attached: 1468299372334.gif (800x430, 564K)

tl:dr?

you forgot:
> performance doesn't matter

that's because settings scale differently on two different microarchitectures.

This was very obvious because a Ryzen won't even try to go full load on all threads when you drop to low settings. And this is why a comparison at the low end is moot.

Because Intel has a single-core advantage, it has the headroom to scale back and still perform at lower settings.

That's the real reason. I agree that for an Intel CPU nowadays, low settings will absolutely show computing power. But AMD's architecture is fundamentally different enough that comparing how it would scale at lower settings would be stupid.
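
For what it's worth, the usual mental model is that frame rate is capped by whichever of CPU or GPU is slower per frame, which is why reviewers drop resolution/settings: it shrinks the GPU's share and exposes the CPU cap. A toy sketch with made-up frame times:

# Toy bottleneck model: each frame costs some CPU time and some GPU time (ms),
# and FPS is capped by whichever is slower. All numbers are made up.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(round(fps(cpu_ms=6.0, gpu_ms=16.0)))  # ~62 FPS: 4K/ultra, GPU-bound, the CPU barely matters
print(round(fps(cpu_ms=6.0, gpu_ms=4.0)))   # ~167 FPS: 720p/low, now the CPU is the limit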

Buy Intel goy

Yeah but intards are pretty much like applefags

Intel literally can't compete with the 12-core and 16-core, so they went full damage control with "only gaming performance matters"

This. Finally people see the bullshit here. Idiots think that AMD scales the same way as Intel does.

Fundamentally, Intel has always relied on one tactic: BRUTE FORCE. It was true during the Prescott days, and it sure as hell is true today. The only time it wasn't was when they got suckerpunched by the Athlon 64 into making the Core microarchitecture.

HMMMMMMMMMMMMMMMMMMM

Attached: 1534204099106.png (1920x1080, 611K)

>Intel: OURS IS STILL BETTER FOR GAMING!!!111
>Lisa at E3: Oh btw, we forgot to tell you how easy these overclock.

Attached: 1558545774096.jpg (700x565, 73K)

It's not just beefier, it's also far cheaper.

For older games that don't see any performance benefit beyond 4 cores, this is true though.
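
That's basically Amdahl's law: if only part of the per-frame work is parallel, extra cores past a point buy almost nothing. A minimal sketch with an assumed parallel fraction:

# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the per-frame work and n is the core count. p is an assumption.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 16):
    print(cores, round(speedup(p=0.6, n=cores), 2))
# 2 -> 1.43, 4 -> 1.82, 8 -> 2.11, 16 -> 2.29: most of the gain is gone by 4 cores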

Intel only recovered because they used unsafe security practices for speed

now that they got caught out, they lost ~15% performance
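
If you want to see what a "real world" config actually has turned on, recent Linux kernels expose per-vulnerability mitigation status in sysfs; a quick way to dump it (Linux-only, assumes the kernel provides these files):

# Dump CPU vulnerability/mitigation status on Linux. Assumes a kernel new
# enough to expose /sys/devices/system/cpu/vulnerabilities/.
from pathlib import Path

for entry in sorted(Path("/sys/devices/system/cpu/vulnerabilities").iterdir()):
    print(f"{entry.name:20s} {entry.read_text().strip()}")
# Typical lines look like "meltdown   Mitigation: PTI" or "Not affected".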

This, the entire Core architecture is completely pozzed. That's how they won, with speed holes.

2007 lmao

Attached: 1557876056704.png (1536x2048, 401K)

Look at this intcel fags, you caused this. You brought this upon this world. Now pay in your blood and tears.

The competition can't compete.