Will low level assembly languages always exist?

will low level assembly languages always exist?

Attached: MIPS-ASSEMBLY-LANGUAGE-PROGRAMMING.jpg (1678x980, 68K)

here's your (you)

yes

no, they will eventually be replaced with javascript and go

Yes, although soon computers won't be binary

>the javascript architecture
>ternary instead of binary
>true, false and undefined

will assembly like languages exist on quantum chips?

This, ternary is in the pipeline in two or three years and quantum mainstream in about 10

they already exist.

>true, false, tralse

>Javascript drivers
>Javascript UEFI / boot
>Javascript firmwares for everything

so assembly like languages will indeed always exist?

assembly language is a representation of what is going to be executed by the computer, they're just a text representation of the instructions with a few goodies like not using absolute addresses for everything and such
even if computers become based on fucking Javascript, there'd be assembly languages
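to make that concrete, here's a rough python sketch of that 1:1 text-to-machine-word mapping for one MIPS instruction. field layout is the standard MIPS R-type format; register numbers follow the usual convention ($t0=8, $t1=9, $t2=10):

```python
# Sketch: assembly is just a readable spelling of the machine word.
# Encodes one MIPS R-type instruction by hand.

def encode_rtype(rs, rt, rd, shamt, funct, opcode=0):
    """Pack the six R-type fields into one 32-bit MIPS machine word."""
    return ((opcode << 26) | (rs << 21) | (rt << 16) |
            (rd << 11) | (shamt << 6) | funct)

# addu $t0, $t1, $t2  ->  rd=$t0(8), rs=$t1(9), rt=$t2(10), funct=0x21
word = encode_rtype(rs=9, rt=10, rd=8, shamt=0, funct=0x21)
print(hex(word))  # 0x12a4021
```

the assembler does exactly this (plus labels and relocation), and a disassembler is just the inverse, which is why some assembly-like view will exist for any instruction encoding.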

I feel like this is someone trying to figure out if Assembly is worth learning

yes

Would this require actual language changes? Bits just suddenly have 3 possible values

Assembly languages for quantum computers will simultaneously both exist and not exist.

>Bits just suddenly have 3 possible values
are you sure it's 3 possible values user?

how is it not? he's talking about a bit, not 2, 4 or 8. unless I'm fucking dumb, I don't see how ternary can have more than 3 possible values for a bit

Interpreted that as ternary for some reason, but even with an n-state digit, what language change would you require?

You do realize quantum is only faster for some problems... right, user?

statically typed Javascript would be GOAT desu

probably none as you could just emulate the current two values, but wouldnt be quite the progress would it

You'll have a shitload more logical and bitwise operations.
And the "ternary" operator will have an extra expression.
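for the curious, here's roughly what some of those extra logic ops look like under Kleene's strong three-valued logic (one common scheme for ternary machines; encoding the trits as balanced-ternary -1/0/+1 is my assumption, not anything standardized):

```python
# Kleene strong three-valued logic on balanced-ternary values:
# -1 = false, 0 = unknown, +1 = true (just one possible convention).

def t_not(a):    return -a          # negation flips true/false, keeps unknown
def t_and(a, b): return min(a, b)   # AND is the minimum of the two trits
def t_or(a, b):  return max(a, b)   # OR is the maximum of the two trits

print(t_and(0, -1))  # -1: unknown AND false is definitely false
print(t_or(0, -1))   #  0: unknown OR false is still unknown
print(t_not(0))      #  0: NOT unknown is still unknown
```

and that's before you get to the tritwise shifts, rotates and the extra "consensus"/"any" style operators a real ternary ISA would want.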

As stated many times, quantum computers won't necessarily replace standard computers, at least for an extremely long time. Quantum will only be used in research and cracking other nations' cryptography.

>true, false and undefined
>3 states, any of which could be meaningful in the right context ("definitely yes", "definitely no", "idk")
Those are rookie numbers, you are like a little baby. Let me present this masterpiece from MSDN: A tri-value boolean, which can have five values, but only two are supported.

Attached: 2018-08-15 22_48_50-MsoTriState Enumeration [Object Library Reference for the 2007 Microsoft Office (716x492, 17K)
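for anyone who can't read the screenshot, the enum looks like this (values as listed on the MSDN page; treat the exact numbers as from-memory rather than gospel):

```python
# The MsoTriState "tri-state" boolean from the Office object model:
# five values, of which only msoTrue and msoFalse are supported.
from enum import IntEnum

class MsoTriState(IntEnum):
    msoTrue = -1            # supported
    msoFalse = 0            # supported
    msoCTrue = 1            # "not supported"
    msoTriStateMixed = -2   # "not supported"
    msoTriStateToggle = -3  # "not supported"

print(len(MsoTriState))  # 5 values for a "tri"-state boolean
```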

I love Microsoft
I abso fucking lutely do
fuck me what is this fucking shit
this is orders of magnitude worse than
escape_utf8_real or some shit
thank you, you made my week
holy fucking shit

Attached: niceoneOP.jpg (317x267, 12K)

lmao what is this lol, were they drunk

but for what reason?

Of course not. Next gen CPUs will know how to read C code.

Considering assembly is a direct translation of machine code into somewhat readable "words" and symbols, yes, it will always exist.

fuck that

what about null

Attached: poo.png (629x454, 448K)

every time

Attached: WHAT2.jpg (215x231, 36K)

no.
cpus are transitioning towards running only python.
You know, there was a problem with brackets, which you couldn't represent in 0 and 1, but now that python is famous and cpu design is very mature, we are ready to make that huge leap.

>simultaneously both exist and not exist.
maybe

It's all 0s for false and all 1s for true (the two's complement of 1111 1111 being -1), with everything in between being not supported.

You're not talking to real programmers - they won't understand.

That is assuming they ever figure out how to make more than ~15 qubits stick together.

Is there a benefit to that as opposed to just, y'know, a single bit?

Well what are you going to do with it is the question. It is useful if you're doing something that uses it. If you code c++, it could be useful.
not for a web dev though.

en.wikipedia.org/wiki/WebAssembly

Is WebAssembly the future of the internet or just another meme?

>bit
>boolean
>null

it's hard to store/operate on a single bit. that type is probably 8/16 bits.

Sorry, I meant using a single bit to store the truth value. Why 00000000/11111111 instead of the normal 00000000/00000001? Especially when it means defining this clusterfuck of an enum?

I guess it was once intended as a tristate and then by convention only got used as a boolean so they deprecated the other states? Maybe?

yeah that is a good guess. anything prefixed with 'mso' in the office codebase is a sign of mega-legacy. given the name is 'triState' and there's only 2 supported states, i'd bet no code is actually using it anymore but it's remained for file load compatibility

>Why 00000000/11111111 instead of the normal 00000000/00000001?
So that one is a bitwise NOT of the other, probably.
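easy to check in python, where ints are two's complement for bitwise ops:

```python
# Why -1 (all ones) for true: it's the bitwise NOT of 0 (all zeros),
# so true <-> false flips with a single ~ in two's complement.
msoFalse, msoTrue = 0, -1

print(~msoFalse == msoTrue)  # True
print(~msoTrue == msoFalse)  # True
print(msoTrue & 0xFF)        # 255, i.e. 0b11111111 in the low byte
```

it's also the VB/COM convention, where True has always been -1.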

computers that use high-level languages as instruction sets are and always will be terrible

Yes, it's a meme. It literally states it's not for direct use. It's meant to be compiled to from C/C++/etc and no one supports that.

that's sad, assembly knowledge should be mandatory for webdev

>null
>undefined
>nan