What could cause a tech revolution like the one this buddy did?

Attached: trans.jpg (1200x1200, 64K)


real world memristors

Quantum computing. Which means we have to throw away all our current understanding of how logic gates work

sounds like a huge meme

that's cause it is
normies seem to think it's magical, when in reality it's worthless

Good developers

Attached: bait.jpg (512x391, 70K)

Death of the anime.

>Quantum computing
if you thought incompatibility was bad now imagine if hardware had as many variables as fuckin software
>2045, can't use my ddr8 kit because it only has 9 states while new mobo chipset has 10 states

>we have to throw away all our current understanding of how logic gates works
Why?
Quantum logic gates are basically the same; the only real difference is that they're all reversible.
Even the truth tables look oddly familiar
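
to illustrate (a quick sketch in plain Python, classical basis states only, so no amplitudes involved): CNOT flips the target bit iff the control bit is 1, i.e. it's just XOR on the target, and applying it twice undoes it.

[code]
# sketch: the CNOT gate acting on classical basis states
def cnot(control, target):
    return control, target ^ control  # flip target iff control is 1

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    out = cnot(*state)
    assert cnot(*out) == state  # CNOT is its own inverse: reversible
    print(state, "->", out)
[/code]

the printed table is a permutation of the inputs, which is exactly what reversibility means: no two inputs collide on the same output.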

>2045
Nice meme.
You forgot that, to operate, quantum chips need to be cooled to near absolute zero.

>Divide both sides by zero

Attached: 1383010371117.jpg (208x199, 13K)

5 / 0 = 0 r5

MOORES LAW IS DEAD.. CLASSICAL COMPVTING IS DEAD.. THE INTERNET IS DEAD

Attached: MVRA.png (3286x2432, 749K)

brainlet here - where's the mistake?
I feel like it's at "Take b^2 from both sides", since you can't add/subtract from numbers with different exponents, or something like that

Attached: ok maybe.jpg (426x282, 61K)

It's right here: if a = b, then dividing both sides by (a - b) means dividing both sides by 0.
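
for anyone who can't see the pic, the usual version of the proof goes like this (reconstructed from the steps quoted in this thread; presumably the image matches):

\begin{aligned}
a &= b && \text{given} \\
a^2 &= ab && \text{multiply both sides by } a \\
a^2 - b^2 &= ab - b^2 && \text{take } b^2 \text{ from both sides} \\
(a+b)(a-b) &= b(a-b) && \text{factor both sides} \\
a + b &= b && \text{divide both sides by } (a-b) \quad\leftarrow\text{ here} \\
a + a &= a && \text{substitute } b = a \\
2 &= 1 && \text{divide both sides by } a
\end{aligned}

every step is fine except the marked one: the premise a = b forces a - b = 0, so that line divides by zero.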


8ch /qc

>bait.jpg

sooo is it valid/logical/correct then, as long as (a-b) does not equal zero?

You should never have left Africa white boy

Which it always does, based on the premise a = b; and if you change that, the rest of the argument is useless anyway.

If quantum computing ever arrives, it will be severely limited to research applications. When the transistor came onto the scene it was immediately adopted into consumer electronics, because it took a really expensive and fragile piece of equipment and replaced it with an absurdly cheap and durable one. Quantum computing is NOT the next transistor. It will be vastly more expensive than conventional computers forever, which disqualifies it right off the bat from revolutionizing anything but multi-million dollar research endeavors. The transistor changed the entire world almost immediately; quantum computing won't even remotely do that. It is too complex and too expensive.

just add more legs

It won't revolutionize shit because most workloads don't benefit from quantum speedups.

Quantum computers have already proved their superiority in cryptography and communication, and once we surpass 128 qubits, new applications will surface that we can't even imagine now due to the black magic weirdness of quantum mechanics.

>it always does [b != 0 && a != 0]

wait why do they never equal zero? I understand the part about the rest of the argument being useless if either of them are 0, that's pretty intuitive, but why do they never equal zero?

All for the low, low price of millions of dollars. When transistors first became available they were immediately cheaper than vacuum tubes. Let me know when running a quantum computer is cheaper than a conventional one and I'll start to believe they'll revolutionize computing. Until then you'll have maybe three or four of these machines operational worldwide, in the world's best-funded research institutes, and they'll fold some proteins or something. That's it. You can screencap this if you want; in 80 years you'll see I'm still right.

As in, the line a = a + a only follows for the case a = b; if you change that, the rest of the picture doesn't follow anymore.

Meant for

Oh, of course. seems obvious now that you explain it that way. Thanks user!

I'm already using 19 qubit quantum computers over the cloud for free buddy ¯\_(ツ)_/¯ and with innovations in qubit isolation technology quantum computers will become a lot easier to own
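
here's roughly what that looks like, for the curious (Qiskit sketch; exact imports and API move around between versions, and I'm using the local simulator since the real cloud backend needs an account):

[code]
# Bell pair on 2 qubits; a cloud backend would swap in for AerSimulator
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)                # roughly 50/50 between '00' and '11'
[/code]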

this is the closest anyone in this thread has been to "correct" i swear

in my opinion, the game changer for COMPUTERS is either gonna be MEMS relays or non-vacuum "vacuum tube" chips that use lasers to increase electron emission from the cathode
both of these are poised to drastically decrease complexity for most computers. we just need to get out of the rut of computer engineers treating the speed of light (something we can't fucking change) as the limiting factor. if you could get an entire wafer worth of silicon, even on a couple-decades-old process, for cheap (as in almost an order of magnitude cheaper), it would be revolutionary. it doesn't matter if we're limited to a GHz if we can have a couple thousand cores, rough numbers below. this is essentially what GPUs already do, but this would be physically much larger. obviously distributing tasks is a problem, but it's not as bad as people make it out to be, especially because it could be general purpose. this would also be helped by 3D stacked memory, which is coming sooner or later, but likely won't be a "game changer" by itself.

on the other hand the future of electronics is probably gonna be cheap reliable memristors but i have no idea how it's gonna go. artificial learning is scary, and it's gonna need a "killer app".
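
rough numbers for the wafer idea above (everything here is made up, just to show the scaling argument):

[code]
# back-of-the-envelope: many slow cores vs few fast ones
# (core counts and clock speeds are assumed for illustration)
wafer_cores, wafer_hz = 2000, 1e9      # hypothetical wafer-scale part
desktop_cores, desktop_hz = 8, 5e9     # typical desktop CPU

# crude peak ops/sec, ignoring IPC, memory bandwidth and task delivery
print(f"wafer:   {wafer_cores * wafer_hz:.1e} ops/s")      # 2.0e+12
print(f"desktop: {desktop_cores * desktop_hz:.1e} ops/s")  # 4.0e+10
[/code]

peak numbers mean nothing if you can't feed the cores, which is the task delivery problem, but it shows why raw clock speed isn't the whole story.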

Teaching kids how to use computers.

"artificial learning is scary"
thumb-sucking brainlet detected

Attached: b.jpg (480x360, 14K)

>I'm already using 19 qubit quantum computers over the cloud
Because you're a hobbyist. Your personal computing habits don't reflect the global computing industry.

memristors
ternary computers
efficient microwave power beaming
far field magnetic forcefields

What does that have to do with the post I was replying to? But to answer your question, congress.gov/115/bills/hr6227/BILLS-115hr6227rfs.pdf. And do some research before you make claims like these.

If a = b
then necessarily a - b = 0

what is that even?

a transistor

Babbage's analytical engine actually being built.

These days maybe things like optical interconnect or memristors

Batteries

you just divided by zero you faggot
of course 0 equals 0.
Math still works, you're just shit at algebra.

room temperature superconductivity

Screencapped this. We will all point at you and laugh at you in 30 years time.

Binary is for fags
Ternary now

>ternary
>not pentanary

more like memeristors amirite?

May I ask what is fucking holding back the graphene revolution?

Attached: 1488422137545.jpg (157x179, 4K)

Organic batteries

conductive polymers

already exist, lithium batteries are coming back.

Brain mapping. Imagine being able to install programs into your brain.

Brain mapping.
Just imagine if you could install programs into your brain.
>optical sensors are malfunctioning
>just update the drivers

Either fusion energy or nanobots

Attached: 1515526934677.jpg (1024x768, 170K)

a = b ⇔ a - b = 0

They fuck up at "Divide both sides by (a-b)". They make it seem like 0/0 = 1, which it is not.

Attached: 1538754404200.jpg (565x487, 73K)

Room temperature superconductors

that's what he said

Optical versions of transistors

Ahh, the good old "hide the division by zero from the brainlet" sleight of hand.

That picture made me laugh and feel creeped out at the same time.

optical or quantum computing

it gets me every time

Attached: Untitled.png (390x322, 7K)

The next step is a combination of quantum computing, cloud technology and 5G or even 6G networking.

>huge quantum computing infrastructure for an entire city
>OS is stored in a database
>quantum computer delivers computational power over the internet to phones that have almost no storage, only a battery, a screen and a few ultra-efficient processors
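
phone-side sketch of that idea (the endpoint and job format are invented, nothing like this exists yet):

[code]
# hypothetical thin client: ship the work out, render the result
# (the URL and job schema below are made up for illustration)
import json
import urllib.request

def offload(job):
    req = urllib.request.Request(
        "https://qc.example.city/compute",  # imaginary city-scale service
        data=json.dumps(job).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # heavy lifting is remote
        return json.load(resp)

result = offload({"op": "render", "scene": "home_screen"})
[/code]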

Why would any sane being want every device's OS stored in a single database?

TO-92 packaging didn't cause a revolution.

installing gentoo


...

think about it

What would you need a quantum computer for on a phone?

Surveillance cameras, the KILLER APP of AI

Internet of Things