When do you think Moore's law will end?

7 nm in 2020?
5 nm in 2022?

Or do you think it will never end?

Attached: Moore's_Law_over_120_Years.png (1513x1062, 205K)

>When do you think Moore's law will end?
It already ended around 2012.

Attached: 1519751379299.jpg (618x597, 144K)

>When do you think Moore's law will end?
It's already dead.
We still have some shrinks left, but 3nm (aka Intel 7nm) requires different materials or device structures (QWFETs/GAAFETs/whatever).

If we continue to redefine what it applies to, it won't end. For me it ended in 2006: the 5 GHz barrier plus heat make it hard for electrons to behave.

Do you know what Moore's law is, OP?
>"Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years."
What you've shown is not about that, especially since you're looking at GPUs. You've favored parallel work heavily so you don't see the flatline.
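To make the compounding concrete, here's a rough back-of-the-envelope sketch (the 1971 Intel 4004's ~2,300 transistors as the starting point; the two-year doubling is the idealized rule, not a fit to real data):

```python
# Moore's law as pure compounding: N(t) = N0 * 2 ** ((t - t0) / 2),
# i.e. the transistor count doubles every two years.
def transistors(year, n0=2_300, year0=1971, doubling_years=2.0):
    """Projected transistor count under an idealized two-year doubling."""
    return n0 * 2 ** ((year - year0) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

That projection lands in the tens of billions by 2021, which is roughly the ballpark of real flagship chips, but it says nothing about single-thread performance, which is the point.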

Why does Moore's law matter? Our penises don't know the difference, so why should our brains?

>per dollar
Graph invalidated

Did you also happen to notice it's treating GPU calculations per second as the same thing as CPU calculations per second?

We were still on 22nm in 2012. We also got 14nm in 2014 exactly on schedule.

(Performance hasn't improved much since then however, despite the multiple die shrinks.)

It's slowed down but not dead. Intel just released their first 10nm part last month (a U series i3).

AMD/Samsung/TSMC are currently at 10nm (roughly equivalent to Intel's 14nm) and will be launching 7nm chips in the next few months.

Moore's law refers to transistor density though.

I just got that picture from Google Images.

...

>Moore's law refers to transistor density though.
What do you think is preventing us from making it denser?

Heat

We can pack in more transistors. But we can't keep the electrons within them anymore.

why not make CPUs with a bigger surface area so more transistors can go on while keeping the same density :D

Quantum tunneling is already an issue and is only going to get worse at 5nm.
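If you want a feel for how fast it blows up, here's a sketch using the textbook rectangular-barrier estimate T ~ exp(-2*kappa*d) with kappa = sqrt(2*m*phi)/hbar. The 1 eV barrier height and the thicknesses are illustrative assumptions, not real process parameters (a "5nm" node name isn't an actual oxide thickness anyway):

```python
import math

# Textbook rectangular-barrier tunneling estimate: T ~ exp(-2 * kappa * d)
# with kappa = sqrt(2 * m * phi) / hbar. The 1 eV barrier height is an
# assumed illustrative value, not a real process parameter.
HBAR = 1.054571817e-34    # reduced Planck constant, J*s
M_E = 9.1093837015e-31    # electron mass, kg
EV = 1.602176634e-19      # joules per eV

def tunneling_probability(thickness_nm, barrier_ev=1.0):
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR   # decay constant, 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

for d_nm in (5.0, 3.0, 2.0, 1.0):
    print(f"{d_nm} nm barrier: T ~ {tunneling_probability(d_nm):.1e}")
```

Halving the barrier thickness doesn't halve the leakage, it multiplies it by orders of magnitude, which is why the problem gets worse so quickly.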

Firstly, bigger chips mean you get fewer chips per wafer, and the fewer chips per wafer, the higher the price per chip.

Secondly, each wafer will have some imperfections. If your chips are small, then it's not a big deal because you can simply discard the few bad chips and keep most of the good ones. With larger chips, you would have to discard a greater percentage of chips because many/most of them would contain an imperfection.

Attached: main-qimg-75a9671d8c10423f4a78c8a6d825cfeb.png (602x201, 77K)
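
To put rough numbers on that, here's a minimal sketch combining the usual coarse dies-per-wafer approximation with a Poisson zero-defect yield model. The 300mm wafer and the 0.1 defects/cm^2 density are made-up illustrative values, not any fab's real figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Coarse approximation: wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def defect_free_fraction(die_area_mm2, defects_per_cm2=0.1):
    """Poisson model: probability a die catches zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)  # mm^2 -> cm^2

for area in (100, 200, 400, 800):  # die area in mm^2
    dies = dies_per_wafer(area)
    good = defect_free_fraction(area)
    print(f"{area} mm^2 die: ~{dies:.0f} per wafer, ~{good:.0%} defect-free, ~{dies * good:.0f} good dies")
```

With those assumed numbers, going from a 100 mm^2 die to an 800 mm^2 die drops you from roughly 580 good dies per wafer to about 30, so the cost per good die grows much faster than the die area itself.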

Which fucking version?
He revised it 6 gorillion times.

But eventually, right? They have to go up in size, since there's a physical limit to going smaller and smaller?
I mean, forgive my ignorance. I can't even wrap my head around the fact that we have the knowledge and technology to move around atoms that we can't even see or know what they look like. It's beyond me, it's like a fantasy, and now we make transistors a few dozen atoms wide? My shitty mind is blown.

It feels like "computer experts" in the 80's predicted that we'd be a lot further ahead technologically by 2020 than we actually are... unless something radically accelerates in the next 18 months.

Attached: Computer chip.jpg (1349x1019, 334K)

The ant thing isn't wrong. It's just that we have way better monolithic CPUs, so why restrict ourselves to ant memes? Also, based on what you wrote, I was confused as to how this pic was about the future.

>why not make CPUs with a bigger surface area

Grace Hopper helped us think of distance in terms of how far a signal travels per nanosecond: with resistance, an electrical signal covers about 8 inches per nanosecond. Increasing the chip's size increases the distance those signals have to travel.
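A rough sketch of what that rule of thumb implies for die size, assuming the worst case where a signal has to cross the die corner to corner within one clock cycle (real chips pipeline heavily and keep critical paths much shorter, so these numbers are only for intuition):

```python
# Hopper's rule of thumb: roughly 8 inches (~20 cm) of signal travel per
# nanosecond in a real conductor. Worst case considered here: one signal
# crossing the die corner to corner per clock cycle.
SIGNAL_MM_PER_NS = 8 * 25.4  # ~203 mm per nanosecond

def corner_to_corner_ns(die_edge_mm):
    diagonal_mm = die_edge_mm * 2 ** 0.5
    return diagonal_mm / SIGNAL_MM_PER_NS

for edge_mm in (10, 20, 40):
    delay = corner_to_corner_ns(edge_mm)
    print(f"{edge_mm} mm die: ~{delay:.3f} ns corner to corner, ~{1 / delay:.1f} GHz ceiling")
```

Doubling the die edge doubles the worst-case wire length and halves that naive frequency ceiling, which is the hidden cost of just making the chip bigger.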

>But eventually, right? They have to go up in size, since there's a physical limit to going smaller and smaller?

The physical atomic limits are the problem we hit in 2006.

We have just gotten smarter about chip design and dedicated chip space to things like branch prediction. That's what makes us faster now, not transistor density.
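As a toy illustration of what that dedicated space buys: a 2-bit saturating-counter predictor (the classic textbook scheme, not any specific CPU's design) guesses regular branches almost perfectly and coin-flip branches no better than chance:

```python
import random

# Toy 2-bit saturating-counter branch predictor. States 0-1 predict
# "not taken", states 2-3 predict "taken"; each outcome nudges the counter.
def hit_rate(outcomes):
    state, hits = 0, 0
    for taken in outcomes:
        hits += ((state >= 2) == taken)
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return hits / len(outcomes)

random.seed(0)
n = 100_000
loop_like = [i % 100 != 0 for i in range(n)]            # taken 99 times out of 100
coin_flip = [random.random() < 0.5 for _ in range(n)]   # unpredictable

print(f"loop-like branch: {hit_rate(loop_like):.1%} predicted correctly")
print(f"coin-flip branch: {hit_rate(coin_flip):.1%} predicted correctly")
```

On a real CPU a correct prediction keeps the pipeline full, while a mispredict throws away the speculative work, so accuracy on common branch patterns is worth the silicon.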

Laws of nature don't end. Stop calling it a law.

Well, when we have graphene chips we can have terahertz CPUs and an iGPU so good we won't need a dedicated one. It'll be angstroms in thickness.

At 3nm, in 2025.
But it won't really end.
We will have 3D stacking techniques by then, so we will be okay.