What do we do once we hit 7nm?
We have pretty much hit the limit of silicon
What do we do once we hit 7nm?
Other urls found in this thread:
en.wikichip.org
twitter.com
Change architecture, use different materials, let NNs develop more efficient architectures. The human brain has far fewer neurons and barley can power a light bulb.
Also, Apple might unironically kill Intel with their newer chips.
6nm?
We move to 5nm
and then 3nm after that
>barley
I bet you could overclock to almoats 2 bulbs. Increase the maltiplier a bit, most people don't use 100% of their brain to be rye. You'd risk migrains, but you'd get more bang for your buck.
It may sound corny, but I'm pretty sure you can "rice" (why is it spelt that way btw?) your brain.
Use oxygen, free and renewable.
Problem solved
Cringe
before we get past 7nm/10nm or whatever marketing name springs up next, we need to get EUV working so that quad+ patterning (= more masks and steps = more time and cost) can be reduced.
Intel's 10nm struggles are a testament to how fucking hard the chemistry and materials physics already are to get past, so we'll see what happens next.
Gate-all-around is supposed to be the savior of the sector, but it basically involves coating the undersides of suspended nanowires, which is turning out (maybe unsurprisingly) to be hard to do reliably at any sort of scale.
pic related.
>barley can power a light bulb
is the second wheat revolution upon us?
The brain consumes 20 W of energy
>maltiplier
I bet you could get 1.5 bulbs.
Okay, I'm a bad speller. Horrible, in fact. Yes, comparing human brains to computers is a fallacy; no, I'm not into neuroscience, but my point about efficient designs led by nondeterministic algorithms stands.
what kind of light bulb?
>The term "7 nm" is simply a commercial name for a generation of a certain size and its technology and does not represent any geometry of a transistor.
en.wikichip.org
stupid guizi, the future belongs to rice
>1080ti is 13 brains
>What do we do once we hit 1um?
>We have pretty much hit the limit of silicon
>What do we do once we hit 50nm?
>We have pretty much hit the limit of silicon
>What do we do once we hit 7nm?
>We have pretty much hit the limit of silicon
t. oldfag who's heard all this shit before
>watt
>energy
The human brain doesn't function like a traditional computer
quantum tunneling?
Kek no
Finfet.
It's kinda like the multi-core thing when processors reached a certain clock speed: people went around the problem instead of brute-forcing their way to a solution.
yeah, maybe they'll have the courage to stop using a PCB
Also, superdielectrics.
MOAR KOREZ
BASED and redpilled
It is energy, expressed as a rate per unit of time.
Stop being a faggot. You understood perfectly well what he meant.
How about silicon carbide?
It's more about mechanical properties than electrical
>>>/reddit/
Lerk moar
at some point along your timeline the fabs just started completely lying about feature size.
yeah, we have decent switching speeds and continuously improving leakage, but density improvement has gone to complete shit compared to what we saw 10+ years ago.
I think your brain is underclocked
You sound like you're running at 0.6 bulbs
Is this possible?
I read that 7nm might be the limit for silicon, because any smaller and quantum physics fucks everything up.
7nm is pretty arbitrary, but yeah. It's not the silicon itself, it's the silicon oxide, which is used as the gate oxide in a MOSFET (the basic device in an IC).
Below a certain thickness, the gate oxide stops behaving like an insulator and quantum tunneling occurs.
So right now scientists are looking at dielectrics that could still stop electron movement at a couple atoms thick.
This is gonna be interesting.
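To put rough numbers on the tunneling problem, here's a back-of-envelope WKB-style sketch. The barrier height (~3.1 eV for Si/SiO2) and effective mass (~0.4 electron masses) are assumed textbook-ish values, not anything from this thread, but they show why leakage blows up exponentially as the oxide thins:

```python
import math

# Rough WKB estimate of direct-tunneling probability through a gate oxide.
# Assumed values: Si/SiO2 barrier ~3.1 eV, electron effective mass
# in the oxide ~0.4 * m_e. Illustrative only, not a real process model.
HBAR = 1.0545718e-34  # J*s
M_E = 9.1093837e-31   # kg
EV = 1.6021766e-19    # J

def tunnel_probability(thickness_nm, barrier_ev=3.1, m_eff=0.4 * M_E):
    """T ~ exp(-2*kappa*d), with kappa set by the barrier height."""
    kappa = math.sqrt(2 * m_eff * barrier_ev * EV) / HBAR  # 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

for d in (3.0, 2.0, 1.0):
    print(f"{d:.1f} nm oxide: T ~ {tunnel_probability(d):.2e}")
```

Each nanometer you shave off the oxide buys the electrons several orders of magnitude more tunneling probability, which is why "a couple atoms thick" stops behaving like an insulator at all.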
I hope they'll find a material as cheap as silicon.
It all comes back to 60 mV/decade subthreshold switching.
You want a high ratio of current in the on state vs the off state, something like 1000:1 or so (I forget the exact ratio, maybe it's 1000000:1).
As feature sizes get too small, the FETs start leaking too much current.
You can increase the gate voltage (and play around with doping to adjust the threshold voltage) to get better on/off ratios, but power consumption goes with the square of the gate voltage. If you'll notice, gate voltage hasn't been scaling with Moore's law - that's why Intel couldn't increase clock speeds anymore after 2003.
Basically they're stuck between a molten rock and a hot place: the process node leaks too much and consumes too much power, so they increase the gate voltage, which then also increases power.
Only moonshots like CNTs and quantum tunneling FETs can realistically get us past this issue, because FET physics won't let you past the 60 mV/decade roadblock. I don't see us moving away from silicon in the next 10 years though, because silicon manufacturing is 10 years ahead of anything else.
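The trade-off above is easy to sketch numerically. The 60 mV/decade figure is the thermionic limit at room temperature; the supply voltages and capacitance below are made-up illustrative numbers, not any real node:

```python
# Back-of-envelope sketch of the 60 mV/decade limit and V^2 power scaling.
# All specific numbers here are illustrative assumptions.

SS = 60.0  # mV/decade: thermionic subthreshold limit at room temperature

def on_off_ratio(vdd_mv, ss_mv_per_decade=SS):
    """Decades of current swing available from vdd_mv of gate drive."""
    return 10 ** (vdd_mv / ss_mv_per_decade)

def dynamic_power(c_farads, vdd_volts, freq_hz, activity=1.0):
    """P = a * C * V^2 * f - the reason gate voltage stopped scaling."""
    return activity * c_farads * vdd_volts**2 * freq_hz

# Halving Vdd from 900 mV to 450 mV quarters switching power...
p_hi = dynamic_power(1e-9, 0.90, 3e9)
p_lo = dynamic_power(1e-9, 0.45, 3e9)
# ...but also halves your decades of on/off ratio (~1e15 down to ~3e7).
print(on_off_ratio(900), on_off_ratio(450))
print(p_hi / p_lo)  # ~4
```

That's the squeeze in one picture: lowering the voltage saves power quadratically but costs on/off ratio exponentially, and a sub-60 mV/decade device (tunnel FET, etc.) is the only way to get both.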
Why don't we just make bigger chips?
Power consumption
Moar cache moar cores.
man, imagine being retarded enough to think this
>quantum physics
No such thing.
i have a better question
is the guy from pic rel crying because he saw an onion
or because he sees someone working?
I have a better question
How many times were you dropped on your head as a child?
Modern CPUs and GPUs already use massively less power than things from 5 years ago.
TDP hasn't changed though
the hottest Pentium 4 ran at ~115 W
Broadwell-E 6800Ks run at 140 W
*quantum tunneling