CPUs will double in power by 2022

CPUs will double in power by 2022.
youtube.com/watch?v=XW_h4KFr9js

Attached: cpu_lede.png (650x300, 252K)

INTEL 10NM SUPERPOWER 2020
WAIT

Thank you AMD, ARM, Intel, etc. We live in truly blessed times. Except for that whole single-thread problem.

>He didn't watch the video

>double in power
so Intel is gonna turn their 500-watt housefire into a 1,000-watt housefire?

kek. I should have said double the IPC.

Knowing Intel this might actually happen.

>just replace the motherboard with a giant chip
nice

The only thing that's happening is a whole bunch of transistor leakage. HKMG was fine.

Who cares bro, I'll keep using my shitty Dell from 2007 until it dies, and then I still have a Toshiba Satellite that's sort of working.

Attached: famine1.jpg (960x643, 60K)

Too bad Moore's law is coming to an end "soon". Single-atom transistors have been a thing for 7 or so years now; they just need to be put to use in actual ICs. I think that's something like 0.5nm. Making your ICs bigger doesn't count.

Attached: 1551903120901.jpg (600x800, 119K)

tfw I recently upgraded my Vostro from 2008
8GB of RAM instead of 3, and a 500GB SSD instead of the 320GB 5400rpm drive.

best part, it has a 16:10 screen

I don't know how this faggot got relatively popular in a short amount of time.

I find his musings interesting. Certainly more interesting than Adored etc.

Why would you just go and say something fucking retarded like that?
Planar devices leaked far, far more than even early FinFET devices. Moving from Fins to GAAs will bring leakage down by an order of magnitude.

Attached: 28-14.jpg (1064x714, 98K)

Yeah
Intlol 10GHz microprocessor by 2010
Take that you AMDrones!!!111

this channel seriously has the most retarded takes
fuck off with this shit

No need to wait till 2022, they already doubled in power in 2019 thanks to AMD

what a disgrace of a """man"""

Even if they could pull that off by 2022 (they can't) it still pales in comparison to the yearly doubling we used to get.

If AMD didn't come out with Ryzen, Intel would still be giving us quad cores with 5% performance gains every generation

Inshallah

CPUs will be sold by height by 2030.
"New 5cm Intel CPU! New 7cm AMD chip! 3 inches Nvidia!"
It will sound a LOT like a dick measuring contest.

I made this joke the other day about CPU e-peens.
>mine's thicker than yours

They will eventually get to a cock-like geometry, or at least look like phallic borg cubes.

Phallic isolinear chips?

Those look very flat.
Also I bet there will be a liquid channel in the middle of the phallic chip towers to cool the thing down.

Already a thing.

>Too bad Moore's law is coming to an end
it's bullshit, watch the video
Moore's law states transistor count would double every 18 months or so, which is still happening at a steady rate

I hope you realize "Moore's Law" isn't actually any sort of scientific law lmao.

It was literally just a prediction, and it has been slowing down. We're closer to about 2.5 years now.
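Quick sanity check on what that slowdown means when compounded (a sketch; the 18-month and 2.5-year doubling periods are just the figures from this thread):
[code]
# Compound growth in transistor count under different doubling periods.
def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplicative increase in transistor count after `years`."""
    return 2 ** (years / doubling_period_years)

# Over a decade: ~100x at 18 months vs 16x at 2.5 years.
for period in (1.5, 2.5):
    print(f"doubling every {period}yr -> "
          f"{growth_factor(10, period):.0f}x in 10 years")
[/code]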

MOAR COARS

>moore's law
ah it was just an observation

this cuck has no clue what he's talking about
the whole "automatic parallelism using AI" thing is a joke. automatic parallelism has been heavily researched for decades; it's not just going to happen now because of meme learning
if major advancements are made, they will be at the compiler/runtime level, not in hardware. there are far too many drawbacks to abstracting physical cores, to the point where it would make most current applications perform worse, especially ones that depend heavily on manual optimization, such as game engines

Attached: 1552326973122.jpg (973x752, 235K)

is that a potato resistor?

He talks about a kernel-based algo that splits single-threaded tasks across the cores; it already exists and is being touted to companies.

There's a hard physical limit, smoothbrain.

So I shouldn't get Ryzen this summer???
jk ofc I'm getting it, power isn't going to double for another 5 years.
You think people have the money to upgrade CPUs every couple of years?
Hah!
I'm sure 7nm was held off by 2 years just 'cause people needed to catch up.

didn't the i7 come out like 10 years ago?

Attached: 1539418747227.jpg (640x643, 40K)

I, too, took Computer Science 1

it's a gimmick that doesn't actually work except in artificial ideal situations, or you'd be hearing about it by now from more than this random youtuber. there just isn't that much performance to be gained by parallelizing for the vast majority of applications
in fact, the faster individual cores get, the less you want to parallelize, because memory/inter-core latency and bandwidth become more and more of a bottleneck
simple explanation for brainlets:
when you parallelize a task you cut it up into chunks and hand them to the cores, but the cores each sit in their own location, so the chunks need to be big enough for the travel time to be worth it. as the cores get faster, the minimum worthwhile chunk size gets bigger (toy numbers in the sketch below)
performance will more than double in 5 years, not because of muh cores though, but because of accelerators + massive architecture improvements, especially in regards to L1-L3 cache
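A minimal sketch of that chunk-size argument, with made-up numbers (a fixed transfer cost per chunk is the only overhead modeled):
[code]
# Toy model: a chunk is only worth shipping to another core if the
# compute time it represents exceeds the fixed transfer cost.
# All numbers below are invented for illustration.

TRANSFER_COST_US = 5.0  # fixed cost to move one chunk to a core (microseconds)

def min_worthwhile_chunk(ops_per_us: float) -> float:
    """Smallest chunk (in ops) whose compute time matches the transfer cost.

    Below this size the core would finish the chunk faster than it took
    to ship it over, so parallelizing loses.
    """
    return TRANSFER_COST_US * ops_per_us

for core_speed in (1_000, 10_000, 100_000):  # ops per microsecond
    print(f"core at {core_speed:>7} ops/us -> chunk must be >= "
          f"{min_worthwhile_chunk(core_speed):>9,.0f} ops")
[/code]
The break-even chunk size grows linearly with core speed, which is the whole point: faster cores raise the bar for what's worth farming out.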

He does a bit of investigation, instead of just "oroite goys howzit goin" followed by twitter drama and muh emails.
Also you can actually understand what the guy says.

>in fact the faster individual cores get, the less you want to parallelize because memory/inter-core latency and bandwidth become more and more of a bottleneck
That's simply not true. A lot of applications will always benefit from a certain amount of parallelization, no matter how fast individual cores get. For other apps, you're right.

Knowing Intel this might not happen at all

Should be called the revised "Moore's conjecture" anyhow.

Eh, that video... we're probably actually mostly dealing with en.m.wikipedia.org/wiki/Gustafson's_law, not Amdahl's argument.
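For reference, the two laws in their textbook forms (s = serial fraction of the work, N = number of processors); Amdahl fixes the problem size, Gustafson lets it grow with N:
[code]
S_{\mathrm{Amdahl}}(N) = \frac{1}{s + \frac{1 - s}{N}}
\qquad
S_{\mathrm{Gustafson}}(N) = s + (1 - s)N
[/code]
Under Amdahl the speedup caps out at 1/s no matter how many cores you throw at it; under Gustafson it keeps scaling because the workload grows with the machine.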

I guess this is why Intel chips haven't been getting smaller, because they're chasing this stupid AI shit.

2022
2022watts

has machine learning even done anything nice in the past 4-6 years?

Yeah it made fire hydrant pic captcha possible

My cousin who works at Intel said... he's getting laid off because he was getting too old and Intel wanted to hire cheaper, younger workers. Still might stick with Intel processors.

That's not impressive.

>Leakage is currently one of the main factors limiting increased computer processor performance. Efforts to minimize leakage include the use of strained silicon, high-k dielectrics, and/or stronger dopant levels in the semiconductor. Leakage reduction to continue Moore's law will not only require new material solutions but also proper system design.

This isn't the 90s, pal. Strained silicon isn't relevant.
Don't try to quote some out of context bullshit you googled when you get called out for making a completely stupid and factually incorrect statement.
Current leakage is lower than it has ever been. Current leakage will continue to decrease as the industry transitions towards GAAs.

Why is there not an open source version of that AI multithreading predictor software he was talking about?

Attached: 1325678175649.jpg (355x495, 106K)

The chance of CPUs doubling in power by 2022 is the same as the chance that India will become a world power by 2020.

But it has entirely been going to the L3 cache. CPU benchmarks haven't increased substantially.

>Moar's Law

>SCHEDULER_FUSE0 ( 1 threads)
>SCHEDULER_FUSE1 ( 2 threads)
>8 cores combined as a single thread or two
Maybe it'll arrive much sooner than expected.

Attached: 1kqm080rjfd21.png (854x666, 35K)

yes, like when every atom in the solar system has been used for a single CPU
maybe then there will be no way to add transistors
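Taking that seriously for a second, a back-of-envelope (all inputs are rough assumptions; ~1e57 atoms is the usual figure for the Sun, which is most of the solar system's mass):
[code]
import math

ATOMS_IN_SOLAR_SYSTEM = 1e57  # rough; the Sun dominates the total
TRANSISTORS_TODAY = 1e10      # order of magnitude for a big 2019 chip
DOUBLING_PERIOD_YEARS = 2.0   # assumed Moore's-law cadence

# Years until one transistor per atom, if the doubling never stopped.
doublings = math.log2(ATOMS_IN_SOLAR_SYSTEM / TRANSISTORS_TODAY)
print(f"~{doublings * DOUBLING_PERIOD_YEARS:.0f} years")  # ~312
[/code]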

Wirth's law. Software gets slower faster than hardware gets faster.

pajeets have pivoted and are now saying 2030 lol

>INTEL 12NM QUAD CORES BY 2022
based, I can pay triple the price of AMD for 7 more FPS on average

too bad it's just simultaneous calculations and not faster calculations

Which OS are you running on it? I have a similar laptop from 2009 and W10 runs alright on it, other than popping/crackling audio (fixed with foobar2000 WASAPI event mode) and stuttery 720p YouTube™ playback.