The absolute uberchad continues to certifiably wreck shit

is AMD in trouble?

Attached: Screenshot_20190917_193220.png (605x246, 37K)


Jim would never hurt AMD. He's probably feeding AMD Intel secrets to help Zen even more.

>Jim's career is him just hopping from one company to another. Competing with nobody but himself.

Attached: 4zc4y2lq39b31.jpg (640x640, 85K)

He's just helping Intel catch up because zen was too good. After this he'll create a high performance risc-V architecture and throw everything awry before going back to amd to fix zen10 that cost cutting measures inevitably will fuck up.

He's helping Intel make GPUs not CPUs.

>all the replies to the tweet are wojak-tier posters
Jesus Christ, how does Carmack put up with Twitter?

Jim built Zen, right? Holy Christ he ruined Intel's day

>Shows up
>Wrecks competitors shit
>Leaves

Attached: Jim Keller.png (816x1020, 400K)

He's in charge of all silicon development.

>Senior Vice President of the Silicon Engineering Group

Jim helped AMD make the only two good architectures they ever had. Every time he leaves AMD always goes to shit again.

He apparently helped out with Zen, Zen2, and Zen3 before leaving.

>apparently
lol there's no question Zen is Jim's baby. AMD hired him after the mess that was Bulldozer to be able to compete again. Every time Jim has gone back to AMD he's helped them be relevant again. This is his first time at Intel.

6 years to product, 2022-2023

We'll see if he's primarily on CPU design or if he's working on something else like FPGAs

...and whether Intel 7nm doesn't shit the bed

Attached: 1487770379097.jpg (600x773, 111K)

>how can one man be so base-
>Jim Keller is married to Bonnie Peterson, sister of clinical psychologist Jordan Peterson
Oh shit.

Imagine being so good at your job you literally bounce from one major competitor to the next over and over, playing tit for tat with innovation

>6 years
With AMD's lack of money they shat out 1st gen Zen in less than 4.

Intel is in a position of financial superiority, half-finished product, and a severe need to outcompete. They will be at parity with AMD (at massive core count scale) by Spring 2021 at the latest, with a desktop product by Q3 2020 at the latest. And you're fucking delusional to think otherwise.

AMD also got lucky that IBM shared their fab and technology with them.

IBM had nothing to do with GlobalFoundries' licensing of Samsung's 14nm fabrication, anon.

yeah. if only amd could hit higher clocks, fix their cache ratio problem, and not advertise false boost speeds. if only amd could take advantage of their own goodness.

I really don't understand why it is so difficult for the community to understand what is going on with Ryzen and gaming performance. The reason is simple.

You have all seen the AMD diagrams that show the Infinity Fabric. They clearly show interconnects between each chiplet and the IO die and list it at 32 bytes/cycle.

You know that with the 3000 series of chips, the Infinity Fabric tops out at roughly 1800MHz.

Doing the maths: 32 bytes x 1800MHz ≈ 57.6GB/s

The theoretical maximum of dual channel 3600 MT/s RAM is the same ~57.6GB/s. With latency overheads, you can test that at about 51GB/s in Aida64.

All that is great if you run cinebench, blender, handbrake etc. The CPU gets all the data the ram can supply. The processed output of the CPU ends up in the L3 cache where it is output to the monitor, storage or a memory address.

When you run a game, Firestrike or Timespy, the CPU has to process the instructions that are passed to the GPU. A 2080 Ti at max fps needs about 15GB/s of instructions, textures etc. to render its many frames per second. The GPU obtains these instructions mostly from the L3 cache (game cache).

If the GPU is taking 15GB/s of the ~57GB/s total bandwidth, that only leaves a max theoretical bandwidth of about 42GB/s, before latency overheads, for the cores to obtain data to process for the next instructions they have to pass to the GPU.

Reduced memory bandwidth starves the CPU, and the number of instructions to render frames drops.

Intel doesn't have the same limitations. On a 9th gen CPU the cache multiplier determines the ringbus bandwidth. The ring also transfers data at 32 bytes/cycle, but the cache is clocked at around 4200MHz. That calculates to a max theoretical bandwidth of ~134GB/s.

The bandwidth of the L3 caches on both Intel and AMD is roughly the same. AMD clocks the cache at CPU frequency while Intel uses the cache multiplier. (Ever wonder why AMD chips don't overclock to 5GHz? It's because the cache won't run that fast within the power and temp envelope of the Ryzen chip.)

Intel's dual channel 3600 RAM still tops out at about the same 50-ish GB/s, and the GPU still wants its 15GB/s, but it can run over a pipe with well over double the bandwidth. The CPU keeps getting data from RAM at the maximum the RAM can supply, and as a result the CPU can process more instructions for the GPU.
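The arithmetic in this post can be sanity-checked in a few lines. The clocks and link widths below are the figures claimed in the post (a ~1.8GHz fabric clock, a ~4.2GHz cache clock, 32-byte links), not measured values:

```python
# Back-of-envelope peak bandwidth figures from the claims above.
# Widths in bytes/cycle, clocks in Hz; results in decimal GB/s.

def bandwidth_gbs(bytes_per_cycle: int, clock_hz: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bytes_per_cycle * clock_hz / 1e9

# AMD Infinity Fabric link: 32 B/cycle at ~1.8 GHz
if_link = bandwidth_gbs(32, 1.8e9)   # 57.6 GB/s

# Dual-channel DDR4-3600: 2 channels x 8 bytes x 3600 MT/s
ram = 2 * 8 * 3600e6 / 1e9           # 57.6 GB/s

# Intel ring bus: 32 B/cycle at a ~4.2 GHz cache clock
ring = bandwidth_gbs(32, 4.2e9)      # 134.4 GB/s

print(f"IF link:      {if_link:.1f} GB/s")
print(f"DDR4-3600 x2: {ram:.1f} GB/s")
print(f"Ring bus:     {ring:.1f} GB/s")
```

Note that at 4.2GHz the ring works out to ~134GB/s; a 144GB/s figure would need a 4.5GHz cache clock.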

No next gen consoles, finewine mystical BIOS, or chipset driver update is going to make a 3900X the gaming champion anytime soon. it's only going to get worse with newer, faster cards. it's hell, fire and brimstone for amd at the end of the tunnel. not light.

The only man who can compete with him is... himself

Is this from reddit?

No way

Attached: rsa.png (446x446, 189K)

Its not him competing with other people, its him competing with the laws of reality itself since no one could match his glorious autism

the only reason why amd is having a chance with zen 2 is because intel is too jewish with HT. if they gave the 9700k HT and priced it at $350 there wouldn't be a single reason for the 3700/3800x at all. give the 9600k HT and the 3600x is worthless. all amd would have is their MOAR CORES 3900x. well, only if you're autistic with mitigations. you tards forget they hurt amd too. phoronix.com/scan.php?page=article&item=amd-zen2-spectre&num=1

Attached: jeJNud9.jpg (608x369, 96K)

not OP but it sure seems like it. it'd be best to ignore that shill as reddit is trash. amd has no such limitations. only intel jewish psyop shills spread that fud, like "amd uses glue".

Attached: 1558929293066.jpg (679x758, 54K)

1488 can't come soon enough

Attached: 1559275183673.png (300x286, 36K)

Intel has too much money to care. I think that's the issue. They beat their expectations this quarter with $16.5b. They don't even care about the small % loss from AMD being competitive.

Oh, they do care, they're just not swerving the whole business to "battle" like AMD did

Their advertising has changed markedly since Zen launched, lots of vague bragging terms like "real world performance" and weird cooling showcases

Attached: valid point.png (501x423, 19K)

It came once, turned out to be a literal day of the rope, but for Nazis.

I just hope Raja pulls through with Intel's GPU lineup and disrupts the price gouging that NVIDIA and AMD have been doing to us for years now. I used to love AMD GPUs but the prices are ridiculous for the performance they give these days.

Attached: 1497566914156.png (128x99, 21K)

nice blog post i'll be sure to subscribe

Attached: 435346347.png (210x201, 50K)

>intel
>lower prices
time to wake up bro

if anyone is dreaming it's you

Attached: price performance.png (2523x820, 605K)

>was $700
are you a mongoloid by choice?

It doesn't have to be Intel. If their Xe GPUs are any good I have no doubt we'll see NVIDIA and AMD cut prices or put on good sales to persuade people from jumping ship.

>paying full price ever

It's been at $450 for months now; the "sale" is just to trick people into thinking it will go up in price soon.
Also, MSRP is $490, so even then it still beats the 3900X

AMD doesn't need to cater to poorfags now that they are able to compete again.

you literally proved my point tho, i was saying intel isn't known for making prices lower. AMD is known for making prices lower; they've always competed with nvidia by being the price to performance king. The reason your picture shows Intel's prices being cut is BECAUSE of AMD. Nice 1080p picture tho, but you'd have to be a mongoloid to buy either of those cpus and play at 1080p instead of 1440p.

Intel processors are so devoid of value that, even when being peddled at lower prices compared to the competition, they still manage to get outsold somehow.

>price to performance only matters when AMD is better

Based.

r/AMD UNITE

Jim Keller would bring the end of AMD, not by his own doing, but because Intel will never again want AMD suddenly leapfrogging them out of the blue.

When Intel ships its next Sandy Bridge, they will make sure to bury AMD forever; there's a reason AMD has been rushing their tech out to bury Intel first.

>The absolute uberchad
The guy is basically a redditor.
Materialist atheist "science has no limits" egghead.

Intel is smart enough to know that AMD is needed in the x86 market to avoid cries of monopoly.

>Be so OP at designing CPUs the entire market is Keller vs Keller.
He's basically playing chess against himself at this point, hopping from one company to another to keep things interesting.
I would love to see the pay negotiations this guy goes through.
I'd imagine he can basically demand whatever the fuck he wants and he'll get it without a single peep from higher ups, no matter how outrageous the demands are.

what the fuck

> 1080p with a RTX 2080 Ti
Who does shit like this?

Hopefully the dipshits at intel have a solution for the countless physical limitations that have caused stagnation.

The future is probably just adding more cores, but no programming language for parallelism exists yet.

>but no programming language for parallelism exists yet.
There's plenty of them, you're just not looking.
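For what it's worth, mainstream languages already ship parallel constructs in their standard libraries. A minimal Python sketch (ThreadPoolExecutor shown for simplicity; CPU-bound work would use ProcessPoolExecutor to sidestep the GIL):

```python
# Minimal data-parallel map using only the Python standard library.
from concurrent.futures import ThreadPoolExecutor

def square(n: int) -> int:
    return n * n

# Fan the work out across 4 worker threads and collect results in order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```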

Each CCX has 32B read + 16B write / cycle, each memory controller (1 on desktop parts) is 32B/cycle, PCIe is on the IO hub part of the IO die which is 64B/cycle.

Where's your evidence memory bandwidth is the limiting factor for those things? On Intel there's more eviction to main memory and rotating back going on in L3 since it's smaller, so it wouldn't make sense to use memory bandwidth data on Intel platforms to infer a bottleneck on AMD.

Attached: 09fdd897-e441-44ef-8926-df4ffcd85f6e.png (1258x704, 416K)
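Plugging the per-link widths quoted in this post into the same arithmetic (assuming the ~1.8GHz fabric clock claimed earlier in the thread; the clock is an assumption, the widths are from the post):

```python
# Per-link Infinity Fabric bandwidth from the widths quoted above.
FCLK = 1.8e9  # Hz, assumed fabric clock

ccd_read  = 32 * FCLK / 1e9   # 57.6 GB/s read per chiplet
ccd_write = 16 * FCLK / 1e9   # 28.8 GB/s write per chiplet
mem_ctrl  = 32 * FCLK / 1e9   # 57.6 GB/s per memory controller
io_hub    = 64 * FCLK / 1e9   # 115.2 GB/s on the IO hub (PCIe side)

# The point of the post: PCIe traffic rides the wider IO hub link,
# so a GPU pulling ~15 GB/s does not come straight out of a chiplet's
# 57.6 GB/s read path.
print(ccd_read, ccd_write, mem_ctrl, io_hub)
```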

>CCX
*CCD

what happens when he dies? a computing dark age?