What else is there to do in computer engineering now that we've reached the end of Moore's law?

bio computers

big computers

Quantum computers will mostly be used as coprocessors to support general-purpose processors.

Photonic computation is the next big step, but that area of study has only just begun.

Making software that does not suck and does not need 8 GB of RAM to run a text chat client.

Install gentoo.

That's not an engineering problem though

If the hardware were simpler, the code could be more efficient without additional work.

Does this mean programs will become less bloated over time?

Good one

finish "perfecting" RISC for like 4th time then realize that CISC is actually superior and the only way to get higher performance

RISC-based main processors (or some other ISA with a priority on optimization) paired with many highly optimized single-purpose ASIC coprocessors. Kind of like the Amiga, or like we already do to some extent with GPUs and hardware-accelerated AES and video, but with more optimization and speed.
And this: now that hardware is no longer able to absorb shitty programming practices and unoptimized spaghetti code, the only solution is to not use bloated shitware.

Don't insult Slack.

Now we will get specialized ASICs for each task. The CPU will be split into multiple specialized dies (maybe on a single package) instead of one monolithic general-purpose chip. The PC market will get much more interesting when smaller transistors aren't economically viable anymore.

parallel computers

More cores. Better IPC. Design an architecture more efficient than x86.

If CISC is superior, why are there no modern CISC machines? x86 instructions just get decoded into RISC-like micro-ops; the core underneath is effectively RISC.

We move onto photonic computing and new materials.

We first start by replacing the transistor.

Neuromorphic Computing.
You can do more than just neural networks.
computer.org/csdl/magazine/co/2019/05/08713821/1a31hkHCJPO

More cores.

Moore's law isn't a real law, more of a simple trend that is sometimes accurate.
I would argue that, sure, chips aren't doubling in performance, but the focus has shifted to die shrinks and smaller, more efficient processors with the same processing power. That is still a continuation of Moore's law, which is about transistors per area increasing through die shrinks... faggot.
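For what it's worth, here's a back-of-the-envelope sketch of that density claim in TypeScript. The 0.7x linear-shrink factor is an assumption (the classic "full node" rule of thumb), not something from this thread:

```typescript
// Rough math behind a "full node" die shrink (assumed ~0.7x linear scaling).
const linearShrink = 0.7;                    // assumed linear feature-size scaling factor
const areaPerTransistor = linearShrink ** 2; // each transistor takes ~0.49x the old area
const densityGain = 1 / areaPerTransistor;   // ~2.04x more transistors per unit area
console.log(`Density gain per full shrink: ~${densityGain.toFixed(2)}x`);
```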

No, fuck you, a chat client doesn't need to be a browser. They also got rid of their IRC bridge, so FUCK Slack.

When you come up with a cross-platform UI kit that isn't fucking terrible and isn't just a web browser, call literally everyone.
And yes, whatever you just thought of is trash, and I don't need to know what it is.

Qt isn't fucking terrible, Blink is. Sometimes I think using Microsoft Word as a UI toolkit would be more efficient.

Qt is pretty terrible: billions of classes to learn, and basic shit like handling different window sizes is an immense chore.
There's a reason the web exploded while the desktop largely remains stagnant.
HTML* is write once, play anywhere.

>but writing not shit software is too hard!!!
Sounds like you just went into the wrong industry. I can't wait until Geek Culture bullshit crashes and burns and we all become pariahs again.

Building UIs with imperative logic is ass-backwards, retarded, and the epitome of POO design.
If there's one thing HTML did right, it's that it's a declarative syntax for defining a UI layout, which means a rendering engine built by a core of non-retards can figure out how to deliver a UI that meets the HTML spec as written.
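To make the contrast concrete, here's a minimal TypeScript/DOM sketch; the "status" panel and its contents are made up for illustration. The imperative version spells out every construction step, while the declarative version just states the layout and leaves the "how" to the rendering engine:

```typescript
// Imperative: tell the engine *how* to build the UI, step by step.
const panel = document.createElement("div");
panel.className = "status";
const label = document.createElement("span");
label.textContent = "CPU load:";
const value = document.createElement("strong");
value.textContent = "42%";
panel.appendChild(label);
panel.appendChild(value);
document.body.appendChild(panel);

// Declarative: state *what* the UI is and let the renderer decide how.
// (This replaces the body contents built above; it's the same layout.)
document.body.innerHTML = `
  <div class="status">
    <span>CPU load:</span>
    <strong>42%</strong>
  </div>`;
```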

So is turning everything into a web app with a shitty UI that's more computationally expensive than the actual program logic itself. But it's all good; at least that way you can pretend you're an actual developer!

But you know what? Honestly, I agree that Qt is shit. When the Web 2.0 bubble eventually bursts I hope to see people go back to writing proper native toolkit applications, because the "cross platform" obsession is kind of fucking hilarious in the modern age where we have at most four or five platforms, and the bulk of them are just *nix variants.

There are other ways to consume declarative languages than to build a rendering engine, buttmunch.
Microsoft already did all of this shit when they made WPF. The UI is declarative XAML and the logic is written in C#.
But it's not cross-platform because Microsoft doesn't have any interest in The Year of the Linux Desktop outside of EEEing through a terminal emulator.

>taking 100x as long to do the same thing someone can with 1/10th the experience is better!
>worrying about CPU cycles for GUIs
If the UIs are shitty, it's not HTML's fault. It just gives you the tools to do whatever you want easily.

>I can't wait until Geek Culture bullshit crashes and burns and we all become pariahs again.
Not going to happen.
That's like the people who invented the first car saying "I can't wait until everyone stops driving these so automobiles become underground again."

>>taking 100x as long to do the same thing someone can with 1/10th the experience is better!
No, but enabling lazy and incompetent developers to flood the market with scams, shovelware, and perpetually unfinished garbage cash grabs that push responsibility for quality assurance onto the users to increase profits certainly is a problem. I'll personally take the tradeoff.
I've never seen someone try to equate a vapid, childish and overly capitalistic mindset to a car before, and I really hope I never have to see such a thing again.

We'll come to resent bugmen and basedboys just as we came to resent the last round of trendy subcultures they themselves replaced, especially when future generations start really suffering the ill effects of our last few decades of reckless consumption and moral bankruptcy.

Moore's law is actually a legal law. If the chip-making companies don't abide by it, Moore himself comes and takes the CEO's firstborn.

"geek culture" never died.
you can still see it in places like i2p eesites, .onion domains, gnunet, and freenet.
arguably any piracy related stuff counts, but when everything is mainstream netflixime, anvengers endgame capeshit, and AAA gayms, its hard to lend the idea any credence.

100 GHz graphene processors