Redpill me on Moore's Law

How accurate is pic related? With Moore's Law becoming obsolete, will the doubling time of transistors per area just slow down and be less frequent than every 18 months, or will computing progress stop altogether? Will things like quantum computing, diamond based transistors, nanotechnology, and better algorithms allow computing progress to continue, or are we near a hard limit where computers cannot get any more advanced?

Attached: techprogressmooreslaw.png (560x420, 10K)

We plateaued a long time ago.

1980s/1990s was the golden age.

Attached: processor_clock.png (1000x727, 257K)

It's something science fiction writers use to justify their plots.
I really doubt anyone would take it seriously.
As time goes on we'll discover more efficient ways of computing.
Maybe even getting rid of transistors altogether.

>clock speed
confirmed retard.

Clock speed is literally meaningless across architectures.

We're not even close to saturating the Landauer limit. Then after that it's gonna be reversible computing which can take us a ways farther.
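The Landauer claim above is easy to sanity-check numerically. A back-of-the-envelope sketch: the Landauer limit is kT·ln(2) joules per bit erased; the CMOS switching energy used for comparison (~1e-17 J) is an illustrative ballpark assumption, not a measured spec for any real chip.

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_J = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {landauer_J:.3e} J per bit erased")

# Rough comparison: assume ~1e-17 J per logic switching event in a
# modern CMOS process (ballpark assumption, not a real chip spec).
cmos_switch_J = 1e-17
print(f"Headroom vs. assumed switch energy: ~{cmos_switch_J / landauer_J:.0f}x")
```

Even with generous assumptions, conventional logic sits orders of magnitude above the limit, which is the "not even close to saturating" point.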

> implying programmers are good enough to appreciate multiple architectures
Multiple cores are a smokescreen to cover stagnant technology.

Then keep using your 3.8Ghz Pentium 4 retard

>clock speed

>How accurate is pic related?
It's complete speculation. Nobody knows what's going to happen.

People can make individual tiny transistors in the lab, but they won't be feasible at scale without decades of R&D. And eventually you reach a point where you can't build a smaller transistor at all, because the laws of physics prevent it.
You can speculate about the future with some confidence, like we're going to get 5nm eventually, but are we going to get anything smaller than that?

The chart isn't measuring "transistor size", it's measuring "tech progress" retard.

how about you google what the subject of this thread means

The post you replied to was specifically about the picture retard

The graph is derived from transistor sizes.

No one knows

time for fluidics to rise

Attached: 1516672065585.gif (270x188, 1.78M)

>literally thinks transistor size and "tech progress" are interchangeable terms
lol

The problem is the heat resulting from the transistors is unsustainable at this point so we can't just keep doubling them. That's why we just add more cores
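The heat argument comes down to the CMOS dynamic-power relation P ≈ α·C·V²·f. A toy sketch below, with purely illustrative numbers (alpha, C, V, f are assumptions, not real chip specs): doubling the clock has historically also required a higher supply voltage, so power blows up much faster than 2x, while a second core at the same clock only doubles it.

```python
# Toy dynamic-power model: P = alpha * C * V^2 * f
# (all numbers below are illustrative assumptions, not real chip specs)

def dynamic_power(alpha, C, V, f):
    """Switching power of CMOS logic: activity * capacitance * V^2 * frequency."""
    return alpha * C * V * V * f

base = dynamic_power(alpha=0.2, C=1e-9, V=1.0, f=3e9)

# Doubling frequency typically also needs a higher supply voltage
# (say +30%), so power grows far more than 2x:
faster = dynamic_power(alpha=0.2, C=1e-9, V=1.3, f=6e9)

# Two copies of the same block at the original clock just doubles power:
two_cores = 2 * base

print(f"baseline: {base:.2f} W, 2x clock: {faster:.2f} W, 2 cores: {two_cores:.2f} W")
```

Under these assumptions the 2x-clock chip burns over 3x the power of the baseline, while two cores burn exactly 2x, which is why the industry went wide instead of fast.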

Engineers will engineer stuff and scientists will research stuff.
They will just keep making new technology. Remember that the most groundbreaking technology is usually something nobody could foresee; just think of all the people who thought the Internet was just another fad.

transistors can keep getting smaller up until a certain point, so yes, the growth is logistic until the underlying technology (traditional transistor computing, in this case) is changed, e.g. quantum computing

Moore's Law specifically refers to the trend that the number of transistors that will fit on a chip at fixed cost doubles roughly every two years (Moore's own 1975 figure; the oft-quoted "18 months" comes from a later performance projection).
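That trend is just exponential growth, which is easy to sketch. A minimal model, assuming the common two-year doubling period and the Intel 4004's ~2,300 transistors (1971) as the starting point; real chips deviate from this idealized curve.

```python
# Idealized Moore's Law projection: count doubles every `doubling_years`,
# starting from the Intel 4004 (~2,300 transistors, 1971).

def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Forty years of doubling every two years is 2^20 ≈ a million-fold increase, which lands in the billions of transistors — the right order of magnitude for chips around 2011.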

see

What the fuck are you talking about, more cores means more transistors. Doubling core count more than doubles transistor count.

>CPUs are moving at relativistic speed

Nobody in this thread knows what Moore's Law is...

>current smallest production node: "10nm" by Samsung (and node names no longer correspond to any actual transistor dimension)
>diameter of a silicon atom: roughly 0.2nm
Moore's "law" can not keep going on forever (it's already kind of dying).
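You can put a rough number on how much runway is left. Assuming a 10nm feature scale and ~0.2nm for a silicon atom (both crude figures, and node names no longer map to one physical dimension), count the halvings remaining:

```python
import math

# How many more halvings before feature sizes hit atomic scale?
# 10 nm (current node label) and 0.2 nm (silicon atom) are rough
# assumptions for a back-of-the-envelope estimate.
feature_nm = 10.0
atom_nm = 0.2

halvings = math.log2(feature_nm / atom_nm)
print(f"~{halvings:.1f} halvings left before single-atom features")
```

At one halving every couple of years, that's only on the order of a decade of classical scaling left, which is the point the post is making.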

>are we near a hard limit where computers cannot get any more advanced?

We are probably near a hard limit of how many transistors you can fit into any square area of space for the kind of classical computing we have been doing over the last 50 years. We have a ways to go, but you can't get a transistor smaller than a few atoms so there is going to be a physical limit.

However, there is no reason quantum computers, or some other kind of technology, can't take over within the next 50 years. We are not limited to the kind of classical computing we have been doing.

And we still have a good ways to go. But will there be multi-petabyte thumb drives using smaller versions of what we have now? Probably not.

we still move from point A to B in basically the same way we did in the 1920s.

>implying that other methods of computing won't be developed that don't use transistors
>what are optical computing, quantum computing, biological computing, and graphene CPUs

Moore's law is real, and there is a hard limit to how small a transistor can be; that's physics, not a theory. With that said, the traditional transistor may not even be in use 30 years from now, after we hit that limit. There's still a long way to go with our current computers though; we're only at like 14-12nm, I think. We'll reach the minimum eventually, and then some different, faster technology will take over. Some people say quantum computing is the way we need to go. That's exciting, and I'd recommend looking into it; quantum computers are no longer just a theoretical way of calculating. Will they replace our desktops when Moore's law runs out? Probably not, but we'll cross that bridge when we get there. Computers are fast enough for me right now.

Attached: 1509933228765.jpg (639x479, 20K)