This CPU can do anything yours can, only slower

Attached: 146957-8086-chip_b.jpg (600x394, 40K)

Then it can't do the same thing

Yes it can, only it takes more time. The exact same thing is done but it takes longer

Can it address more than 4 MB RAM? Use virtual memory paging? Do memory mapped I/O? Do floating point arithmetic? Do SIMD instructions?

it can't even calculate 1.5+1.5

>can it do X
It can compute anything that is computable. So that means it can emulate any of your fancy technologies. The only drawback is that it will do so at a snail's pace

Can it fit my CPU socket on my motherboard

can it suck my dick?

No, it physically doesn't support 32/64-bit instructions

>It can compute anything that is computable.
>So that means it can emulate any of your fancy technologies
Then it's not the CPU doing it, it's the software doing it. And whatever additional Integrated Circuits you'd create in order to assist the software and port I/O.

In other words, it doesn't do that, not even slower.

Hmmm, not really.
The 8086 was 16 bit, it didn't have all the 32 bit and 64 bit extensions, not to mention MMX or SSE, which are used everywhere nowadays.
It also could not do preemptive multitasking, and it couldn't address more than 1MB of RAM. So no, it can't do what a modern CPU can. It would be nice if it could though, I'd love to see Windows 10 on an 8 MHz CPU.

Attached: 39396464_652530181796429_3244211643046952960_n.jpg (369x496, 25K)

That's not the same as "can do anything", is it? It clearly can't do even most basic virtual memory.

Actually, now that I think about it, it probably won't have enough memory available to run some of the bigger modern neural networks, even at a snail's pace.

No it cannot. It cannot enter protected mode. It does not support paging, and it does not have any fancy vectorization instructions.

It can't do the same thing modern processors do. But, modern processors can do the same thing 8086 processors can do, just look up virtual 8086 mode.

>This guy can do anything you can, only slower.

Attached: 507141244-612x612.jpg (612x408, 44K)

Well my Computer Architecture professor said what I wrote so clearly you're all wrong

No, kiddo, you just honestly thought that what you said was right and the replies you've received so far have shown you exactly how retarded you are and how little you know about the subject. Trying to play it off on some fictional college professor is just lame and sad.

it run at the speed of a modern CPU, only slowlyer

can it address a shit load of ram?

He said and I quote
>any processor can compute anything computable, the only thing that varies is time

>any processor can compute anything computable
That's not the same thing you said.

You said: This CPU can do anything yours can, only slower

CPUs do more than just compute.

Now you're not making any sense at all

No, that's just something you thought was true due to your complete lack of knowledge on the subject. Also, this is a bit different from "this CPU can do anything yours can, only slower" as you claimed earlier. You're not in college, kiddo, and there isn't a professor on any college payroll anywhere this fundamentally fucking stupid.

>Now you're not making any sense at all
Virtual memory addressing, 32 and 64 bit addressing, preemption, I/O, etc. are not computation.

>No VT-X
Yep it can't.
Let's not even start on certain niche instruction sets here OP.

For fucks sake.

You can emulate 32bit on an 8bit machine
You can emulate 64bit on an 8bit machine too
You can emulate an entire modern architecture provided you have enough memory

The absolute level of Jow Forums

yeah it's so based i can count the transistors with my bare eye

>You can emulate 32bit on an 8bit machine
That's software emulation, literally the software doing the job.

>You can emulate 64bit on an 8bit machine too
That's software emulation, literally the software doing the job.

>You can emulate an entire modern architecture provided you have enough memory
You don't have enough memory, your toy microprocessor can't address more than a few KB, which means that your microprocessor isn't a true Turing machine but a limited imitation of one. Ergo, it's not Turing complete.

Okay. How will your 32-bit processor support, say, 8GB of RAM? It's obvious you just finished your first class on this subject and want to sound smart.

>provided you have enough memory
So how will you get that memory on an 8-bit processor limited to fucking 64KB of RAM?

you can calculate things in your head as well. it doesn't have any extensions or even basic modern instruction sets, but neither does your cpu. touch rj45 with your tongue and calculate signals for networking support, send back hardware response signals with a piece of wire and battery

some madman ran emulated x86 linux on atmega8. cool stuff, took him half an hour to boot the os, and tty needed around 30s just to type a character

My time died for this thread.

But it ends up doing the same fucking thing, you paper cut in the form of a person.

Let me explain it in a language you can understand. With enough pain and effort it's possible to run Crysis on a 8086k. You will probably end up waiting about 6 months for a single frame to render but in principle it's possible. So it ends up doing the same job, only executing it a million times slower. Yes, even multicore because you can just time slice your threads.
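
The time-slicing claim above can be sketched in C: run each emulated "core" for one step per turn on the single real CPU. Everything here (the core struct, the step function) is invented for illustration; it just shows why serializing the cores loses speed but not capability.

```c
#include <assert.h>

/* Toy round-robin time slicing: each emulated "core" gets one
 * step per turn on the one real CPU. */
typedef struct { int pc; long work_done; } core_t;

/* Stand-in for "execute one emulated instruction on this core". */
static void step(core_t *c) { c->pc++; c->work_done++; }

static void run_time_sliced(core_t *cores, int ncores, int rounds)
{
    for (int r = 0; r < rounds; r++)
        for (int i = 0; i < ncores; i++)
            step(&cores[i]);   /* one slice per core per round */
}
```

Both fake cores make identical progress, just serialized, which is the whole point of the argument.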

>You don't have enough memory, your toy microprocessor can't address more than a few KB
Not a problem, just an annoyance.

>8086k
Sorry, I meant 8086

Clearly I'm brainwashed by Intel marketing

>But it ends up doing the same fucking thing,
No. Your microprocessor isn't a Turing machine and it can not emulate physical properties of hardware.

>With enough pain and effort it's possible to run Crysis on a 8086k.
Wrong.

>You will probably end up waiting about 6 months for a single frame to render but in principle it's possible.
It's not possible in principle either, you wouldn't be able to hold enough in memory to emulate anything.

>Yes, even multicore because you can just time slice your threads.
How would you timeslice on a microprocessor that doesn't support clocked interrupts? You'd literally have to implement a software emulated clock as well, and you'd use up your maximum 4 KB memory before even managing to load a single instruction you're going to emulate.

I need it to do modern things quickly. Can it do that? My current one can.

>This CPU can do anything yours can, only slower.
Can it do it without emulation? No? That's what I thought.

You can treat a disk as a memory though. Problem solved

Can it do GPU pass through to a VM? Hmm can it even run a VM without emulation? USB 3? PCIe 3.0? FPU? How about my new gaymes?

No?

It can emulate emulation, so.. yes?

>You can treat a disk as a memory though.
Again, you wouldn't be able to hold enough simultaneously in memory to emulate a single instruction. If you're going to have code that swaps stuff in and out of disk in addition, and keeping track of more sectors than you can address, you're going to use even more of the little precious memory there is. There's literally not enough memory to emulate a modern processor.
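
For what it's worth, the swap code being argued about would look roughly like this toy sketch (one resident page, write-back on eviction; all sizes and names invented). Note that even this minimal version costs a page buffer plus bookkeeping, which is exactly the memory overhead the post above is complaining about.

```c
#include <stdint.h>
#include <string.h>

/* Toy software paging: a tiny "RAM" holding one page at a time,
 * backed by a larger "disk". Sizes are illustrative only. */
#define PAGE   64
#define NPAGES 16

static uint8_t disk[NPAGES][PAGE]; /* backing store */
static uint8_t ram[PAGE];          /* the one resident page */
static int loaded = -1;            /* which page is resident */

/* Make the page containing this address resident, return the buffer. */
static uint8_t *touch(int page)
{
    if (page != loaded) {
        if (loaded >= 0)
            memcpy(disk[loaded], ram, PAGE); /* write back old page */
        memcpy(ram, disk[page], PAGE);       /* page in new one */
        loaded = page;
    }
    return ram;
}

static void mem_write(int addr, uint8_t v) { touch(addr / PAGE)[addr % PAGE] = v; }
static uint8_t mem_read(int addr)          { return touch(addr / PAGE)[addr % PAGE]; }
```

Every access that misses the resident page costs two full page copies on top of the actual read or write.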

Surprisingly, it can do all of these things, yes. I don't see a problem with any of those. The only thing that will really bite your ass is how hard you're going to have to work around limitations and how much time you'll be waiting for it to finish.

>you just have to wait an exayear before it to complete, bro!
The universe will be gone before that. Also, this

In principle it's possible

>emulating 64-bit on a 64-bit CPU
ok now this is epic

No, not in principle either. Software instructions don't take zero memory in principle, which is what you need them to do in order to fit the emulator in memory.

But user, that IS my CPU.

Attached: IMG_20180823_162805.jpg (2560x1920, 2.67M)

It's problematic but probably not impossible

Based and beigepilled

VMs without emulation? Source or GTFO. The VM must be as secure as on modern systems too since this is a VM.

>Not a problem, just an annoyance.

>It's problematic but probably not impossible

It's slowly sinking in, I see. It is impossible; your solution is "just use a disk bro", which means that you're going to consume even more memory for writing a disk driver, leaving even less space for the emulator, the code you're going to emulate, and the data being used.

Your solution is only possible in "principle" if you assume that software has zero or even negative memory footprint, which is fucking silly.

It can't do SSE4

Well, it's probably more secure given that 8086 isn't susceptible to spectre or meltdown

Ok you convinced me

>You can treat a disk as a memory though. Problem solved
Then you are relying on additional hardware, aren't you? That pretty much disproves the original claim:
>This CPU can do anything yours can, only slower.
because you need additional hardware to do it.

obligatory xkcd

Attached: a_bunch_of_rocks.png (675x1603, 199K)

Okay, can it run AVX512 extensions natively? Uh oh...

No, you can't. For that you need a PMMU. Which the 8086 doesn't have.

You can emulate one. Problem solved.

>You can emulate an entire modern architecture provided you have enough memory

*time and memory

You can't install Windows 10 on a PC with an 8086 and then play Crysis 3 RIGHT NOW with no changes. You keep saying you can emulate everything. Okay when you come back in 30 years with it working then we can say it can do everything the same, just slower.

Time isn't really an issue as far as processing goes. It's an issue for humans because we don't live forever but it certainly isn't an issue for a lone machine running in a cave for thousands of years powered by a nuclear reactor

The idea behind what OP is saying is technically true: you can emulate anything on any processor that is Turing complete. But honestly, what is even the point of this thread?

How does it change to THUMB mode?

>just look up virtual 8086 mode
No longer supported on x64.

>Time isn't really an issue as far as processing goes
The universe has a finite time limit.

>it certainly isn't an issue for a lone machine running in a cave for thousands of years powered by a nuclear reactor
Thousands of years wouldn't be a problem, but we're talking about workloads whose resource demands are factors of millions to billions beyond what's available. If you have to wait an exayear for your emulation to complete, the universe will be long gone by then.

user, we've been over this before. You're just using that thing as a dumb terminal and it's not really online.

The only sensible response in this thread.

Just emulate a universe with a longer timespan then.

Forgot that using 64k memory segments really sucks.

The 8086 didn't even have proper hardware multiply/divide; it implemented them in horrendously slow microcode.

And yet it's the processor that pioneered the architecture we're currently using

Really says something about how crap the state of modern computing is.

And it uses an NEC V20 anyways. This is a troll thread, I was just going with the flow.
But it IS online, in fact it pulls the current time via NTP every time it boots up.
8087 FPU instructions were executed in a separate chip. That is the chip that pioneered "modern" FPUs, not the 8086 itself.

Attached: IMG_20180823_165833.jpg (2560x1920, 1.93M)

Except phoneposters, hilariously enough. Unless you're using one of those rare, 386-based BlackBerries.

Attached: 1517387555245.gif (390x319, 1007K)

No, emulation is not more secure. The processor is able to do tricks that are impossible for emulation. VMs are actually extremely lightweight since they use almost entirely pre-existing silicon. That is, unless you emulate. Emulation has many times the attack surface. Easily 100x. Stick to talking about something you remotely understand.

Spectre has very little effect on VMs.
>inb4 muh Intel bugs
Most of those bugs only affect Intel because they couldn't even stick to the spec. If you look at AMD, VMs are still very secure and are not meaningfully affected by Spectre. Those VMs are no more affected than bare-metal systems.

GTFO faggot. Or come with an actual source to restore your credibility.

It isn't though because recent Intel CPUs are RISC and they just use a hardware emulation layer to execute x86 instructions for backwards compatibility purposes.

You can emulate anything a processor does not physically support; the downside is that you will use considerably more cycles to accomplish the same task. An 8-bit processor can't hold 16 bits in one register, so it takes a minimum of two operations to store 16 bits of information. 32 bits would take four and 64 bits would take eight. You effectively lose half, 3/4 or 7/8 of your throughput to overhead.
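
As a concrete sketch of that overhead, here's a 16-bit add done purely with 8-bit quantities plus a carry, the way an 8-bit CPU's ADD/ADC instruction pair works (written in C for readability; a real routine would be assembly):

```c
#include <stdint.h>

/* Emulate a 16-bit add using only 8-bit operations plus a carry flag,
 * mirroring an 8-bit CPU's ADD (low bytes) then ADC (high bytes). */
static uint16_t add16_via_8bit(uint16_t a, uint16_t b)
{
    uint8_t a_lo = a & 0xFF, a_hi = a >> 8;
    uint8_t b_lo = b & 0xFF, b_hi = b >> 8;

    uint16_t lo_sum = (uint16_t)a_lo + b_lo;       /* 8-bit ADD */
    uint8_t carry = lo_sum > 0xFF;                 /* carry flag */
    uint8_t r_lo = (uint8_t)lo_sum;
    uint8_t r_hi = (uint8_t)(a_hi + b_hi + carry); /* 8-bit ADC */

    return (uint16_t)r_hi << 8 | r_lo;
}
```

One 16-bit add becomes two 8-bit adds plus the byte shuffling around them, which is where the "half your clock speed" figure comes from.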

>It isn't though because recent Intel CPUs are RISC and they just use a hardware emulation layer to execute x86 instructions for backwards compatibility purposes.
Yeah, and they got all sorts of performance issues because they're desperately trying to patch security vulnerabilities because of this fucked up design.

>And it uses an NEC V20 anyways

The V20 did have hardware multiply and divide so this fixed one of the big deficiencies of the 8086. On 8086s, you were advised to just use bit shift instructions for multiply and divide like you'd do on an 8-bit CPU.
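
The shift-based technique being described looks roughly like this in C (a real 8086 routine would be assembly, and compilers still apply the constant-multiplier version of this as strength reduction):

```c
#include <stdint.h>

/* Classic shift-and-add multiply, the trick used when hardware MUL is
 * missing or painfully slow: for each set bit of the multiplier, add
 * the correspondingly shifted copy of the multiplicand. */
static uint16_t mul_shift_add(uint16_t a, uint16_t b)
{
    uint16_t result = 0;
    while (b) {
        if (b & 1)
            result += a; /* this bit contributes a << position */
        a <<= 1;         /* shift multiplicand left */
        b >>= 1;         /* examine next multiplier bit */
    }
    return result;
}
```

At most 16 shift/add iterations, versus well over 100 cycles for the 8086's microcoded MUL.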

So it can compute Pi to 5 million digits?

>An 8bit processor does not have the ability to store 16bits of information so it takes a minimum of two cycles to store 16bits of information
The Z80 can do at least partial 16-bit arithmetic.

You're limited by the amount of memory you can address though. At some point you're going to run out of memory.

en.wikipedia.org/wiki/Bailey–Borwein–Plouffe_formula
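
The linked BBP series converges at roughly 1.2 decimal digits per term, but a straightforward double-precision sum like the sketch below tops out around 15 digits; 5 million digits would need arbitrary-precision arithmetic (or BBP's integer digit-extraction trick). The function name and term count are illustrative.

```c
#include <math.h>

/* Sum the first n terms of the Bailey–Borwein–Plouffe series:
 *   pi = sum_{k>=0} 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))
 * Double precision caps the accuracy at about 15 digits. */
static double bbp_pi(int n)
{
    double sum = 0.0, p = 1.0; /* p tracks 16^-k */
    for (int k = 0; k < n; k++) {
        double d = 8.0 * k;
        sum += p * (4.0 / (d + 1) - 2.0 / (d + 4)
                  - 1.0 / (d + 5) - 1.0 / (d + 6));
        p /= 16.0;
    }
    return sum;
}
```

A dozen or so terms already saturate a double, which is why the formula is famous for extracting individual hex digits rather than for bulk computation.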

Copypasta AGAIN because you won't learn:

"RISC was guaranteed to fail from the beginning. The idea of 'twice as many instructions, where every instruction does half as much work' hits clock frequency, memory access, instruction fetch, and thermal walls much sooner than CISC. And before anyone says 'but modern x86 are RISC inside', no, they're not: they're microcoded and micro-operationed, which aren't RISC ideas. The reality is that both CISC and RISC processor groups ended up converging onto the same microarchitectural principles - things like pipelining for superscalar operation, out-of-order execution, speculative execution, branch prediction, simultaneous multithreading, etc. - to improve performance. All modern RISC architectures (including PowerPC and ARM designs) break their 'RISC' instructions into simpler micro-ops internally too - just like a modern x86 does. Remember, RISC basically means only two things: fixed instruction length, and register-to-register (load-and-store) operation. A lot of other stuff - like the things listed previously - has been erroneously assigned as being 'RISC' because most people don't know enough about microprocessor design not to confuse ISA with microarchitecture."

Attached: 1523310286211.jpg (350x276, 26K)

Doesn't mean that the emulation is as secure. Most of the time. It will be less secure due to increased attack surface. In other words, no my CPU can do things yours can't. Security is non-negotiable.

>On 8086s, you were advised to just use bit shift instructions for multiply and divide like you'd do on an 8-bit CPU

The 6809 had hardware multiply/divide. It's rumored that Motorola did this as a test bed for the 68000.

The addressable memory is a hard limit but you can use memory mapping to hack your way around the limit like what old video game consoles used to do. It's a really shitty solution though.
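
The console-style trick mentioned here looks roughly like this: a small address window into larger storage, selected by writing a bank number to a latch. Everything below (sizes, names, the simulated latch) is invented for illustration; on real hardware the latch is an I/O port or mapper chip, not a variable.

```c
#include <stdint.h>

/* Toy bank switching: 64 KB of "physical" storage exposed through a
 * 16 KB window, selected by a bank latch. */
#define BANK_SIZE 0x4000 /* 16 KB window */
#define NUM_BANKS 4

static uint8_t storage[NUM_BANKS][BANK_SIZE]; /* all the RAM */
static uint8_t current_bank;                  /* the "latch" */

/* On real hardware this would be a write to an I/O port. */
static void select_bank(uint8_t bank) { current_bank = bank % NUM_BANKS; }

static uint8_t window_read(uint16_t addr)
{
    return storage[current_bank][addr % BANK_SIZE];
}

static void window_write(uint16_t addr, uint8_t v)
{
    storage[current_bank][addr % BANK_SIZE] = v;
}
```

The CPU's address space never grows; the software just has to remember which bank every piece of data lives in, which is exactly why it's called a shitty solution.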

>The addressable memory is a hard limit but you can use memory mapping
Not without a PMMU you can't.

So is like bank switching.

What's your point/how's that relevant here?

Attached: ie2.jpg (1881x4147, 805K)

It explains how "modern x86 is RISC inside" is a lie - in very ELI5 terms, might I add. If you didn't understand that, the implication is obvious.

Oh God my sides

1MB was huge back when you had 8-bit chips with 64k. 64k is enough to hold about 80 pages of text assuming average spacing.

>With enough pain and effort it's possible to run Crysis on a 8086k. You will probably end up waiting about 6 months for a single frame to render
No it wouldn't, because any 80s-era graphics card would have like 16 colors or something.

You can just emulate a modern GPU, it will work exactly the same, only slower.

BTW, someone on VCFED tested Commander Keen on a real EGA card and found it ran like dogshit. I'm not sure John Romero ever actually tested the game on a real EGA card, likely not since it came out in 1992.

Windows actually does emulate a modern GPU internally when the end user doesn't have a Direct3D 10 compatible GPU

en.wikipedia.org/wiki/Windows_Advanced_Rasterization_Platform

It can't run DES ops.

This can do anything your car can, only slower

Attached: wagon.png (1000x751, 953K)

Is this bait, or are you a fucking /v/toddler who doesn't understand how CISC architectures work? You do know that backwards compatibility != forwards compatibility, right retard?

Attached: 1527161698849.png (614x614, 31K)