RIP Intel

youtu.be/pwsLSrcoCgE

Attached: Screenshot_2019-07-06-22-38-54-013_com.google.android.youtube.png (1440x720, 622K)

Other urls found in this thread:

techreport.com/review/34192/intel-core-i9-9900k-cpu-reviewed/2
twitter.com/SFWRedditGifs

how the hell am I supposed to fit all my games on 72MB? Jesus Christ AMD fucking sucks

Attached: latency.png (1399x1226, 545K)

>less latency

Attached: AMDRyzen53600X470Tests8.jpg (719x568, 152K)

Attached: Intel-AMD-Naples-Reply-13-1080.348625475[1].png (1500x844, 210K)

chiplet-to-chiplet L3$ has 70ns latency; intel with XMP is 40ns to DRAM :D

I WANT GAMECACHE TOO

Attached: 1544667387109.png (813x1402, 324K)

Doesn't matter if you have that much cache inceltard.

no bros we wuz kangz

Attached: 1559056252058.jpg (1510x1593, 662K)

I don't get it. If latency is such a huge issue on AMD, why is a 2700X clocked 1GHz lower than an OC'd 8700K still 95% as fast on average, especially at 1440p?

Attached: Witcher.png (1314x1192, 53K)
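
napkin math on that, and note the 5.0 vs 4.0 GHz clocks here are my own assumption for "OC'd 8700K vs 2700X", not numbers from the chart:

clock_advantage = 5.0 / 4.0   # assumed clocks -> intel has ~25% more frequency
perf_advantage = 1 / 0.95     # "95% as fast" -> intel only ~5% faster on average
print(clock_advantage, perf_advantage)
# most of the clock advantage never shows up as fps, so latency clearly isn't the whole story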

>Memory latency
Wow, it's fucking nothing. A 100% increase in memory latency corresponds to a 7% decrease in performance.

Attached: bf1-ram-bench-scaling.png (607x371, 18K)

feels good to be able to keep shitting on amd for the 10th straight year, intelbros

Attached: intelchillguy.jpg (350x498, 68K)

because bf1 is a game from 2001 retard

For real though it's sad how much intel is lagging behind AMD now. I mean sure they're faster, but at what cost? A $200 Z motherboard that can handle a 24/7 5GHz OC, a $200+ triple fan AIO, hundreds of dollars in additional electricity bills each year? All that just for 5% higher fps?

Attached: BF1.png (1314x1192, 52K)

>gamecache

what happened to all the L3?

Attached: 1557196644799.gif (207x207, 1.37M)

Nobody really buys Intel though except rabid fanboys.

WTF DELETE DIS U DIRTY SHUDRA MOTHER FACK U BADIR

Attached: mad.jpg (300x377, 71K)

Remember: the 8700K has a 1GHz frequency advantage.

Attached: WD2.png (1314x1192, 52K)

What is it about AMDrones that they think they can just shit in the streets? IDC about your explosive curry diarrhea - don't shit in the street, Sanjeep.

>incels literally grasping at straws before the avalanche of reviews

this is some new level of damage control from incels

that 1ghz advantage seems to stay in place with the arrival of the ryzen 3000 series too lmao

Attached: amd running.jpg (680x485, 279K)

AMD knows not to partake in the GHz wars, they don't want to repeat history (see A64 curb stomping pentium housefires). So instead zen 2 has 15% higher IPC than zen+ so 4.4 GHz zen 2 = 5GHz inhell.

Attached: MEA.png (1314x1192, 53K)
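
if you take AMD's +15% IPC claim at face value (big if), the napkin math does land around 5 GHz. rough python sketch, assuming perf scales as IPC times clock and that zen+ and skylake IPC are roughly equal:

zen2_ipc_vs_zen_plus = 1.15              # the claimed +15% IPC uplift
zen2_clock_ghz = 4.4
equivalent_old_clock = zen2_clock_ghz * zen2_ipc_vs_zen_plus
print(equivalent_old_clock)              # ~5.06 -> roughly a 5 GHz zen+/skylake-class part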

Impressive. There's more intel shills and trolls than actual posters ITT.

You don't need AMD unless you're participating in a virginal dick measuring contest.

Also why have the Chink dickshits been using the Ryzen brand? Because they don't want normies differentiating between generations. They know how ignorant their customer base is.

>gamecache
If you needed any evidence that only retards buy AyyMD

not real cache

Not only is the latency way lower than on my 1600X, but those speeds are stupid. Like, DDR2 levels.

Attached: latency.png (537x519, 202K)

u feel trolled because of the truth?
techreport.com/review/34192/intel-core-i9-9900k-cpu-reviewed/2

Attached: Graphs_aida64memlatency.png (480x370, 8K)

This actually works differently with cache, though, because cache is generally accessed in smaller chunks than RAM is.

Attached: designated.jpg (986x926, 252K)

He cropped out the 2133 MHz ram

Attached: 1547291865841.jpg (3840x2160, 766K)

AMD is like the democrats of technology.

seems like they won't partake in the performance wars either

Attached: maxresdefault.jpg (1280x720, 61K)

>2133mhz ram
>no boost
fuck off incel

test

Miss me with that gay shit

We're being shown the effect of bandwidth, not latency, here. If GN also posted timings we could make some rough equivalents; if they stated that they used the same timings at all speeds, we'd also need to normalize for bandwidth to know the performance effect of latency.
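for reference, first-word CAS latency in ns is just the CL cycle count divided by the memory clock; the timings below are made-up examples, not whatever GN actually ran:

# cas_ns = CL / (data_rate / 2 MHz) * 1000 = CL * 2000 / data_rate in MT/s
def cas_ns(data_rate_mts, cl):
    return cl * 2000 / data_rate_mts
print(cas_ns(3200, 16))   # 10.0 ns
print(cas_ns(2133, 15))   # ~14.1 ns
# similar absolute latency despite very different bandwidth, which is why you can't read latency off a speed-scaling chart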

You have a 5th grader's understanding of logic.

At what time will the embargo be lifted?

07.07 at 07:07am

Which time zone is the 7th time zone?

kek

Attached: tc1gfk0iha801.png (2518x1024, 345K)

you can't really compare Hz, one chip might have lower Hz and do more work than another. just an fyi

then how come chA(d)MD performs like ass in the real world?

>worse than gen 1 Zen
So in other words something needs a software fix, because Zen 2 copied some design aspects of the 9900K's front-end to reduce latency.

>AMD
>72MB of Gamercache
>Incel
>No Gamercache

the twilight zone

>Intel MEGATASKING
heh, nothing personell...kid

Attached: 1495087953097.jpg (493x493, 54K)

>No gamercache
Doesn't matter, you lost. There's a reason why Intel hasn't been used in a console for nearly 2 decades now. Same goes for NVIDIA.

Intel has Copecache and Dilationbooster

yeah, price.
you get what you pay for: 10% lower performance at 20% lower cost. who do you think a profit-oriented company is gonna go for?
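crude perf-per-dollar math, taking those 10%/20% figures as given:

relative_perf = 0.90                  # 10% lower performance
relative_cost = 0.80                  # 20% lower cost
print(relative_perf / relative_cost)  # 1.125 -> ~12.5% more performance per dollar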

because they don't run on empty for quite a few years and have no debts?

Intel is for incels. Not trannies.
0
Gamer
Cache

i wouldn't mind memeing about that shit but it's just not funny anymore

1942 is actually from 2002

AMD has gamecache, gamers are right wing nazi transphobes. Intel spends millions on diversity programs. Intel is for inceltrannies

NOO DELID DIS VIDEO GAMES ARE SUPPOSED TO BE FOR MANCHILDREN

Say it with me now
Zero
Gamer
Cache

What does game cache mean? What's the difference between game cache and unqualified cache?

Remember: this is zen+ NOT zen 2.

Attached: Deus.png (1314x1192, 55K)

Intel is 7 years ahead
4c with L4$ is 100%, 25% per core
8c ryzen is 103%, 12% per core
intel ipc is double for gaming :D with a 3 year old cpu vs current gen amd

Attached: 128edram.png (1097x877, 187K)
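
those per-core figures are just the aggregate score divided by the core count, taking the posted percentages as given:

print(100 / 4)   # 25.0 per core for the 4c part with the L4$
print(103 / 8)   # ~12.9 per core for the 8c ryzen
# so it's a per-core throughput comparison, not a direct IPC measurement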

how many % higher is 72 than 0? my calculator is having a hard time with this one

infinity you underage retard.
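the formula the calculator is choking on:

def pct_higher(new, old):
    return (new - old) / old * 100
print(pct_higher(72, 36))   # 100.0
try:
    print(pct_higher(72, 0))
except ZeroDivisionError:
    print("undefined: nothing is a finite percentage higher than 0")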

So does that mean coffee lake regressed in performance?

Attached: FH.png (1314x1192, 61K)

>a 100% worsening in both bandwidth and latency corresponds to a 7% FPS penalty
Cool, so it's extra irrelevant.

depends, are we talking -0 or 0?

>comparing to gen 1 Zen still
lmao, Zen 2 does not have latency issues.

how long until the reviews start popping up?

±7200%

If you'd scroll up, you'd see that I'm comparing Skylake to Skylake.

Day after tomorrow for good ones.

AMD has ∞% more gamecache than Intel?
im checking the Intel box and I can't tell if it has 0 gamecache exactly or if it has -0 gamecache

You're comparing RAM to RAM.

no you are just gpu bound :D

>AIDS POZZEN 2700X
>Loses to 8700k
>in the most heavily threaded game
>WHILE BEING BOTTLENECKED BY GPU
How will AIDSmd shills ever recover?

>gamecache

Attached: 1561172111222.png (210x240, 5K)

It is zen+ and not zen 2 after all. Still pretty impressive given how most games can only efficiently use 4 cores.

Attached: RSS.png (1314x1192, 57K)

That's because game developers are so incompetent today that the real programmers of every game are just the guys that made Unity and UE.

>Poozen doomed

Attached: AMD-Ryzen-9-3900X-and-Ryzen-7-3700X-CPU-Review_Far-Cry-5_large.png (960x985, 101K)

What will AidsMD shills do now?

Attached: AMD-Ryzen-9-3900X-and-Ryzen-7-3700X-CPU-Review_Rise-of-the-tomb-raider_large.png (967x1211, 573K)

I find these cherry picked results really interesting. What do you think about pic related?

Attached: Dirt.png (1314x1192, 50K)

Now post the other 3 games

your Copecache is almost maxed out, better stop before you start to overheat

>I find these cherry picked results really interesting. What do you think about pic related?
Outlier

they literally copied and pasted old data

Attached: 156233542.jpg (1707x788, 336K)

GAAAAAAAAAAAAAAAAAAAAME CAAAAAAAAAAAAAAAAAAAAAACHE!! OMGGGGGGGG!!!!!!!!!!!

Attached: 1561677651902.png (280x305, 29K)

Because latency was a minor issue with first gen ryzen, and now we are on the second major gen, with the refinement included in the third gen ryzen cpus.

excuse me but what the fuck do 1280x720 gaymer benchmarks even fucking mean?
I'll be playing at 1440p

Attached: 1539065691399.jpg (800x800, 68K)

nothing for you, it's essentially a synthetic benchmark unless you're actually playing at 720p. find benches at the resolution you play to find out how each cpu scales

For 8700K/7700K/8600K/7980XE only. So?
8c 9900K/9700K with 2667 DDR4 still cucks 12c 3900x with 3200 DDR4
>what the fuck do 1280x720 gaymer
That's a CPU test, otherwise it'll be a 1080ti benchmark.

>That's a CPU test, otherwise it'll be a 1080ti benchmark.
so they could run a benchmark like they did several years ago: make it 1440p or 4k and turn the graphics down to the absolute minimum

Because the idea is to not test the GPU at all and just test the CPU. The CPU's workload when gaymen is 95% the same at low res, but the GPU is largely removed from the equation.

If you don't do that then you end up with benchmarks like that, where the GPU is bottlenecking everything. You learn nothing about which CPU plays games better.
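a toy model of it with invented numbers: the fps you observe is roughly the smaller of what the CPU can feed and what the GPU can render, so dropping the resolution inflates the GPU term until you're only measuring the CPU:

def observed_fps(cpu_fps, gpu_fps):
    # whichever side is slower sets the frame rate
    return min(cpu_fps, gpu_fps)
print(observed_fps(cpu_fps=160, gpu_fps=400))   # low res: GPU has headroom, you see the CPU's 160
print(observed_fps(cpu_fps=160, gpu_fps=90))    # 4K: GPU-bound, every CPU "scores" 90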

Honestly though, did anyone unironically expect the latency to become anything BUT worse when they physically lengthened and complicated the path between cpu core and memory?

It's like expecting AMD to bend the laws of physics.

...

are you implying that every cpu would pull the exact same frame rate at anything above 720p? because if so then you're wrong, and if you were right then anyone who just plays games wouldn't need anything other than a 10 year old cpu, so why bench games in the first place?

Still introduces significant performance variables from the GPU that way. But yeah, they could do both.

>gamecache

Attached: 1534677298168.png (205x246, 6K)

so they're hiding their latency disadvantage against intel (be honest, it exists) with a shit ton more cache, right?

It doesn't make any sense though. It shows which cpu is better in a use case that will never happen, and just causes confusion when people want to know which cpu to get for gayming, when any of them would do the same. This is a cheap marketing tactic from intel and should be illegal.

Uh, spending money to improve the product so that one of its performance dependencies is reduced isn't "hiding".

You'd be right if you said the marketing was trying to hide the issue, though. Cache latency is going to be a significant problem for gaming and calling it "gamer cache" is kinda lol.