Moore's Law is dead

>Graphene turned out to be a meme and graphene companies are going bankrupt one after the other
>Carbon nanotubes were vastly overhyped and they are only going to find use in extremely limited and non-essential sectors
>Quantum computers are making good progress but they are never going to replace standard computers because they are only good for very specific algorithms that are only of use to governments and large corporations
>Senior exec at AMD straight up said we're stuck on silicone for at least another 10 years and that they don't have any replacements for silicone even planned

It's over bros..

Attached: mooreslaw.png (1200x586, 215K)

Other urls found in this thread:

youtube.com/watch?v=XW_h4KFr9js
news.mit.edu/2016/analog-computing-organs-organisms-0620
en.wikipedia.org/wiki/Nanoscale_vacuum-channel_transistor

>silicone

Attached: 1549029327737.jpg (1200x676, 128K)

I'm fine with 10 more years of massive fake titties.

Maybe developers should optimize for current hardware instead of using bloated frameworks for everything

I'll let you in on a little secret, Jow Forums.

The next big thing is...

ANALOGUE COMPUTING

This is bad. If the singularity doesn't happen in the next 100 years we're all fucked; the earth is running out of resources fast

>the earth is running out of resources fast
jewish lie

They are optimizing for current hardware, and that's exactly the problem. You want them to make efficient software? Then give them 15 year old hardware and tell them to optimize for that. Give them early dual-core 1.6GHz Opteron machines with 1GB of RAM.

/thread

we still have some time until we reach 3 nanometers

But think of all the poor java and javascript "devs"!
How can we expect people to be good at their job?! No one told me you need to be good at math to be a good programmer, it's just not fair bros!

I don't really disagree with you entirely but saying most people are optimising their code in any way shape or form in 1997+22 is just a fever dream. People shit out code without considering anything, let alone performance or optimisation.

>just optimize your way into higher raw power bro
fucking retard

Jesus Christ that skill/practice.

The fuck? Java might not be a high-performance language but it's not terribly slow, and it's pretty similar to C++, so all those Java devs could easily switch to C++.

Being good at maths doesn't automatically make you good at programming. Being good at certain kinds of maths might help you, but you still need to study and practice programming. That's exactly the problem with our current compsci curricula: they're bloated with unhelpful maths like differential equations instead of stuff that could actually help, like number theory, graph theory, category theory, Boolean algebra, relational algebra, regression analysis, stochastic processes, formal systems, lambda calculus, optimization, combinatorics, linear and nonlinear programming, operations research and numerical analysis. And at the end of the day, students end up devoting more time to unconstructive maths than to actual programming practice. Depending on the college they're attending, I discourage kids from picking compsci as a major and encourage them to pick CIS instead.

Photonic computing soon, bros! Surface plasmons up the wazoo!

Attached: glasses-sunglasses-cyclops-glasses-1_grande.jpg (600x600, 64K)

Java and Javascript have been around since before the Pentium MMX was even a thing, those languages are not intrinsically slow or inefficient or anything like that and one's retarded if one blames them for our current performance woes.

what about this technology?

youtube.com/watch?v=XW_h4KFr9js

7nm is in consumer products
5nm already exists but it's not ready for volume production yet
3nm is well in development and the Samsung GAA process looks extremely innovative
TSMC is working on 2nm
Moore's law is not dead; the only reason it might look that way is because Intel is struggling and x86 is a curse

The point being made was that the industry is saturated with unintelligent developers who just use whatever is most popular and easy, and who are incapable of writing efficient programs.

The entire industry needs a reboot. You can't build upon a poor foundation.

retard

How can you say current silicon tech is a poor foundation with the results it's produced? Seems to me like one of the most impressive things humans have ever done.

he's a sensationalist retard with no brain

So I don't have to upgrade every n years anymore. Yay.

Yea but it's pointless without an open flame. The purpose of tossing food in the wok is to tingle it against the flame

Thank god for that. Now "programmers" will actually have to learn to code properly instead of just throwing shit at the wall and relying on increases in processing power to cover their mistakes. And for us plebs it means no need to upgrade unless the computer breaks. As someone who is still rocking an i5 2500k I'm very pleased with this news.

that's not the white man's burden
> depopulate yourself

Simple: we regulate the industry, making it so that one must possess a statutory certification in order to be licensed to practice any IT profession. Create high quality certification bodies composed of boards/guilds/bars that must ensure candidates are highly qualified before they can join the profession, just like accountants, lawyers and doctors are obliged to. Make it illegal to practice the profession without being certified.

that's a weird way of spelling illuminati
> who the fuck controls the jews
stay woke

It doesn't matter, we will all be accessing the cloud via cheap low-power hardware

That's a good thing
If they can't improve performance they will improve TDP
My laptop will finally last as long as my phone

we are at the beginning of the AI revolution

optical?

*cough* Stochastic computing
Neural nets don't need precise math and neither do your graphics
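Rough sketch of how little precision it takes before anyone notices: this just quantizes some made-up float32 "weights" to int8 and measures the error, no real network or framework involved.

```python
# Toy illustration: quantize float32 "weights" to int8 and measure the error.
# Purely illustrative numbers -- no real network or framework involved.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.05, size=10_000).astype(np.float32)

scale = np.abs(weights).max() / 127.0          # symmetric linear quantization
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequant = q.astype(np.float32) * scale

err = np.abs(weights - dequant)
print(f"max abs error:  {err.max():.6f}")
print(f"mean abs error: {err.mean():.6f}")
# Errors land around scale/2, i.e. a tiny fraction of the weight range --
# the kind of noise most nets (and most graphics) shrug off.
```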

honestly sounds like a great way to kill most commercial software and make sure everything substantial is open source and anonymously developed
efficiency and performance are two sides of the same coin; you can't have one without the other

Singularity is a meme

So quantum computing?

Well OP, no place to go but UP.
Chips on top of chips on top of chips on top of chips with layers of heatpipes..

Attached: steel cube.jpg (1000x1333, 115K)

news.mit.edu/2016/analog-computing-organs-organisms-0620

quantum computing is something entirely different
the plan with stochastic computing is to run non-critical paths close to the noise floor to gain potentially a lot of efficiency for a loss in precision
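The classic illustration is that once values are encoded as random bitstreams, multiplication collapses to a bitwise AND. Minimal sketch with a toy stream length, nothing to do with any real hardware:

```python
# Minimal sketch of stochastic computing: a value p in [0, 1] is encoded as a
# random bitstream where each bit is 1 with probability p. Multiplying two
# values is then just a bitwise AND of their streams -- precision traded for
# an absurdly cheap datapath. Toy stream length, illustrative only.
import random

N = 100_000          # stream length; longer stream = less noise, more latency

def encode(p, n=N):
    return [1 if random.random() < p else 0 for _ in range(n)]

def decode(stream):
    return sum(stream) / len(stream)

a, b = 0.8, 0.4
product_stream = [x & y for x, y in zip(encode(a), encode(b))]
print(f"exact: {a * b:.4f}   stochastic: {decode(product_stream):.4f}")
# Result hovers around 0.32 with noise ~1/sqrt(N) -- run it near the noise
# floor and you save energy at the cost of a few low-order bits.
```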

None of this shit is going to pan out just like none of the wonder materials and theoretical computing methods from the 90s and 2000s ever panned out.

Humans vastly overestimated and oversold the capabilities of computational technological progress.

We are coming up against the hard physical limits in progress.

Don't you dare shit on Von Neumann and Turing shithead.

But muh technological singulamemety

It will run out of gas quickly without additional TFLOPS every year.

The singularity was never going to happen, it was just a cope for people who wanted to hold on to the belief that there were better things to come, but were too smart to believe in the afterlife.
In reality humanity is never going to conquer the stars; there won't be a utopia, a satisfying conclusion to all the human struggle through the ages. Humanity, like all life forms in the universe, is doomed to extinction, and eventually all of our achievements will be forgotten, buried beneath a universe in ruins, as heath death erases everything there ever was.

Vacuum-channel transistor looks very promising right now.

en.wikipedia.org/wiki/Nanoscale_vacuum-channel_transistor

Attached: c2c69b07c7132068a108542957147658.gif (220x307, 219K)

Just wait™

IBM TrueNorth
Intel Loihi

The industry will have to adopt domain-specific hardware architectures and domain-specific languages to squeeze the last drops of performance out of the silicon. Most programmers seem to be incapable of learning these engineering approaches.
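For a small software-level taste of the same idea (not domain-specific hardware, just handing a loop to a specialized array library instead of a general-purpose interpreter), something like this; exact numbers vary by machine:

```python
# Toy benchmark: the same dot product in a general-purpose Python loop vs. a
# domain-specific array library (NumPy). Illustrative only; exact speedups
# depend on the machine, but the gap is typically 10-100x.
import time
import numpy as np

n = 2_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
acc = 0.0
for i in range(n):               # general-purpose interpreter loop
    acc += a[i] * b[i]
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
acc_np = float(a @ b)            # vectorized, domain-specific kernel
t_np = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s   numpy: {t_np:.4f}s   speedup: {t_loop / t_np:.0f}x")
```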

Software is only optimized if it's profitable. So pretty much just finance and gaming. It makes sense, developers are fucking expensive.

Why not just make bigger CPUs?

Attached: 1552221470546.png (225x225, 4K)

It takes time for a signal to travel from one end of the chip to the other. Also, the bigger a chip, the lower the yields.
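Back-of-the-envelope version of the yield half, using a plain Poisson defect model with a made-up defect density (real foundry numbers are secret):

```python
# Back-of-the-envelope die yield vs. die area using a simple Poisson defect
# model: Y = exp(-A * D0). D0 below is a made-up illustrative defect density,
# not any foundry's real number.
import math

D0 = 0.1          # defects per cm^2 (illustrative)

for area_mm2 in (100, 200, 400, 600, 800):
    area_cm2 = area_mm2 / 100.0
    yield_frac = math.exp(-area_cm2 * D0)
    print(f"{area_mm2:4d} mm^2 die -> ~{yield_frac * 100:.0f}% yield")
# Die area hits yield multiplicatively, which is exactly why everyone is
# chopping big dies into chiplets instead.
```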

Why not just make a big CPU out of small chiplets? Oh, wait

But will we have any by the time intel reaches 7

Attached: 346373.jpg (950x635, 110K)

Chiplets or not, propagation delays will bite your ass at some unimpressive size.

Imagine Intel giving up and going to TSMC or Samsung.

3D stacking solves that problem for a while.

>The absolute state of Jow Forums

>developers are fucking expensive
This is the fucking problem with this industry right now. Unlike other industries, we haven't found a way to massively drive wages down, which is currently holding our development back heavily.

Why don't we just design CPUs like this?

Attached: CPU.png (1152x541, 22K)

We eventually will.

We will

>heath death
no...

Attached: heath.jpg (682x383, 61K)

I knew this was going to happen. The fast rise of technology will now begin to plateau. Turn to Jesus.

Is Jesus going to make my CPU run at 10 GHz?

clock speed increases are over because we're down around the level where you have to worry about the distance light can travel within one clock cycle. It's 100% multi-core and IPC from here on out.
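Quick arithmetic for anyone who doubts it; this uses c in vacuum as the absolute best case, and real on-die interconnect is substantially slower:

```python
# How far a signal can possibly travel in one clock cycle. Uses c in vacuum as
# the best case; RC-limited on-die wires are much slower in practice.
C = 299_792_458.0                 # m/s, speed of light in vacuum

for ghz in (1, 3, 5, 10):
    period_s = 1.0 / (ghz * 1e9)
    dist_mm = C * period_s * 1000.0
    print(f"{ghz:2d} GHz -> {dist_mm:5.1f} mm per cycle (best case)")
# At 5 GHz that's ~60 mm in vacuum; derate for real interconnect and a big die
# already eats a meaningful slice of the cycle just moving bits around.
```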

I mean couldn't you just go crazy with chiplets and improve stuff like Infinity Fabric? We can always use more cores to a certain extent (I think something like 128c or so before the returns become so diminished it's not useful for most things other than highly parallelized stuff).
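The diminishing returns part is basically Amdahl's law; rough sketch with a couple of assumed parallel fractions:

```python
# Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N), where p is the parallel
# fraction of the program. Shows why piling on cores flattens out quickly.
def amdahl(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 8, 32, 128, 1024):
    print(f"{cores:5d} cores: "
          f"p=0.95 -> {amdahl(0.95, cores):6.1f}x   "
          f"p=0.99 -> {amdahl(0.99, cores):6.1f}x")
# With p=0.95 the ceiling is 20x no matter how many cores you throw at it;
# 128 cores already gets ~17x, so thousands more buy almost nothing for
# typical desktop workloads.
```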

Why do you think pajeets are everywhere? They get paid next to nothing.

>Moore's Law
not actually a real law

well I don't like that

>100% multi-core
My GIL hurts
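If you want to feel that pain directly, here's a minimal timing sketch: CPU-bound threads in CPython don't scale past one core because of the GIL, while processes sidestep it. Timings are machine-dependent, illustrative only.

```python
# Minimal demo of the GIL: a CPU-bound task run with threads vs. processes.
# Threads share one interpreter lock, so they don't scale on CPU-bound work;
# processes avoid it at the cost of extra memory and startup overhead.
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def burn(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls, workers=4, n=2_000_000):
    start = time.perf_counter()
    with executor_cls(max_workers=workers) as ex:
        list(ex.map(burn, [n] * workers))
    return time.perf_counter() - start

if __name__ == "__main__":        # needed for ProcessPoolExecutor on spawn-based OSes
    print(f"threads:   {timed(ThreadPoolExecutor):.2f}s")
    print(f"processes: {timed(ProcessPoolExecutor):.2f}s")
```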

More cores. AMD wins.

That's one more decade of mediocre improvements in the pipeline at best.

voodoo magic to convert single thread code into multithreaded code when

Attached: 1560898275625.png (808x1200, 1.7M)

We'll need to get rid of languages like Python first.

they already built it into the CPU, it's called out-of-order execution

They also produce next to nothing. Which brings me back to my point: we haven't been able to find an effective way to quickly reduce wages while keeping production high in this industry yet. Outsourcing to India was a disaster, and other approaches like attempting to teach kids programming from an early age and trying to persuade more women to join STEM have had very limited results. Until we find a way to pay developers a lot less, our development in this area will continue to be very poor. Quality only increases as competition goes up, which only takes place when production prices drop. It's basic economics.

>Python
Will it ever be reworked to fit the modern day and age?

Yes, it's called Nim.

Python 3 was a disaster so I'm not sure it can be reworked well.

I thought it was called Go

bump

Turing: theoretical limits of computability with no regard to time/space/energy efficiency.
von Neumann: lots of math plus endorsement of a certain type of computer, believing Pascal's wager.

>OoO is speculative multithreading
I know Jow Forums is retarded but...

Both of them were brilliant minds dragged into the war and thus into digital computing, being dragged away from what they were truly interested in: analog computing.

>being dragged away from what they were truly interested in...
gay se... oh.

Everything is about more cores.

Computers are about to get fucking massive again.

It was never a law.

Eventually they will start getting rid of the endless frameworks and abstractions to get more speed.

From a user's perspective and excepting things like waiting for video / 3D renders and Photoshop filters, computers have gotten slower in the last 30 years.

user, you're a fucking idiot.
It's called clustered multithreading. Soft Machines' VISC arch is the prime example, and they were just acquired by Intel. AMD has had patents on this concept going back to the 1990s.
Multiple hardware cores work on a single software thread to increase total single-threaded performance.

Attached: VISC-CSMT.jpg (980x643, 239K)

Should they delete everything and start over to continue Moore's law?

This. If anything, it was a conjecture.

It's too late. We need a galactic brain to make an AI that can code everything from scratch. There's too much for humanity to feasibly tackle.

Has the research into gallium substrates had any breakthroughs?

Stacked cpus are probably only 2 generations away from mainstream desktop. HBM is already a form of stacked cpu.

Graphene was always a meme material.
Photonics is the next step in CPU evolution.

It's as if you get what you pay for.

>we haven't been able to find an effective way to quickly reduce wages while keeping production high
>Until we find a way to pay developers a lot less
> making software development a sweat-shop.
Cough up the money you motherfucker, no one will develop stuff for you for cheap. Pay up bitch. If you don't like it then you can always develop and maintain all these systems yourself. I bet you wouldn't even know where to begin.

>HBM is already a form of stacked cpu.
It can be. There isn't always a control logic layer in the stack. It can be pure stacked DRAM.