Intel 8088 IC

I have a bunch of these I yanked from old microcontrollers at work. I didn't want to chuck them, so I'm looking for some ideas to make something with them.
>inb4 build an x86 processor, not interested
Anyone have any experience with these? Thinking of maybe a synth/waveform-type project

Attached: 270px-KL_Intel_TD8088.jpg (270x143, 4K)

Other urls found in this thread:

ndr-nkc.de/download/datenbl/i8088.pdf
geek.com/chips/nasa-needs-8086-chips-549867/
en.wikipedia.org/wiki/Intel_8086

Put them all together, take a pic, make a Twitter account and post something tagging Intel. If you're lucky they'll give you an 8086K

>I yanked from old microcontrollers
You have no idea what you're talking about
>build an x86 processor
They ARE x86 processors, again, you don't know shit about this
>synth/waveform
They have no FPU, and 8088s were (and are) known for being pieces of shit. Besides, you need a shitload of hardware to interface them with anything.
Sell them on ebay for people who actually might have a purpose for them.
Or this. And then exchange it for a 1950X.

put them up your ass

Sorry to burst your bubble Comic Book Guy, yes - they were in old controllers. I know they're x86 processors, but I have no interest in wasting my time building something that's going to need a metric fuckton of other shit when I can just buy a shit-tier '80s computer.

Xi 8088, aka an IBM PC/XT (on steroids) as an expansion card

>yes - they were in old controllers.
Indeed, they were in old controllers. A controller isn't a microcontroller, though.
>I have no interest in wasting...
Well then forget about doing anything with them. They're processors, and strictly that. They have no I/O other than the data and address buses, which means you'll have to add:
Clock oscillator
Memory
I/O controller
Some sort of storage medium, be it ROM, SRAM, or whatever
PIC (programmable interrupt controller) if you want to do anything in real time (aka your "wave/synth")
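
For a sense of scale, here's the kind of address map a bare-minimum board ends up with. Everything below is hypothetical except the reset behavior: the 8088 starts fetching at physical FFFF0h (CS=FFFFh, IP=0000h), so your ROM has to sit at the top of the 1MB space.

/* Hypothetical memory map for a minimal 8088 board */
#define RAM_BASE   0x00000UL  /* interrupt vector table + work RAM */
#define ROM_BASE   0xF0000UL  /* boot ROM; must cover the reset vector */
#define RESET_VEC  0xFFFF0UL  /* first instruction fetched after reset */
#define UART_IO    0x3F8      /* example I/O-mapped serial port */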

Here, have an example of the most basic implementation. An IBM PC motherboard.
Think you can make one of those? Then go ahead. No? Go buy an AVR or a Raspberry Pi and blink LEDs until you get bored of it five minutes later.

Attached: 5150_early_motherboard_2048x1489.jpg (2048x1489, 1.43M)

Came here to write this, thanks for saving me the time. Also OP's a faget.

>They have no FPU

Attached: intel 8087.jpg (1024x589, 168K)

Yes. Thanks for proving my point.

Those things get hot as a motherfucker, which is why they have a ceramic rather than a plastic shell.

My dad could probably do it. He knows a bit about hardware design.

>They have no FPU

Not even a hardware multiply/divide.

At the start of the 80s everyone was drooling over the 68000 and Z8000, and the 8086 was treated as something of a joke. Basically a slightly enhanced Z80 that could access more memory. IBM really did save the chip from an early demise.

A lot of the 8086's handicaps happened because Intel decided to maintain backwards compatibility with the 8080 just to make electronics hobbyists happy, whereas Motorola didn't retain any 6800 -> 68000 compatibility.

>Not even a hardware multiply/divide.
Yes they do:
ndr-nkc.de/download/datenbl/i8088.pdf
Page 27

The 8086 is not program-compatible with the 8080, if that's what you're trying to say.

The multiply/divide is done in microcode; it's not true hardware multiply/divide and it's incredibly slow. Programmers were advised not to bother with the MUL, DIV, IMUL, and IDIV instructions and to just use bit shifts for multiplication.
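
The classic trick, for anyone who hasn't seen it (shown in C for readability; in practice it was done in asm):

/* Multiply by 10 without MUL: x*10 = x*8 + x*2 = (x << 3) + (x << 1) */
unsigned mul10(unsigned x) {
    return (x << 3) + (x << 1);
}

A 16-bit register MUL on the 8088 ran well over 100 cycles, while shifts and adds cost just a few cycles each.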

It was close enough that 8080 assembly source could be mechanically translated to run on it. That was its main selling point.

You're late to the game, NASA needed those chips back in 2002

geek.com/chips/nasa-needs-8086-chips-549867/

can it run freedos

>the government has unlimited resources and they're reduced to scouring Ebay to find parts when they could just pay Intel or somebody to mfgr new ones
...

This guy argues it wouldn't be economical to do so. I don't quite understand everything he's saying, but that's the general gist I get.

Attached: clr99r.png (693x407, 66K)

Apply voltage until it starts smoking. Then apply double the voltage and tell us what happens.

He's saying it would be too expensive to recreate the old HMOS process used to manufacture the 8086, but using modern processes would result in something less radiation-proof, and you'd need to sell at least a couple million chips a year for several years to pay off the tooling costs.

Doesn't WDC still produce new 6502s?

Sounds like a plan - mebbe just go whole-hog & throw 600vac through it.

Yes

Make an array of SID chips that you can write to from a shared bus.
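
If you went that route, the software side boils down to decoding each SID into its own little address window. A hypothetical sketch in C -- bus_write() and the 32-byte-per-chip mapping are assumptions, not any real board:

extern void bus_write(unsigned addr, unsigned char val);  /* assumed bus driver */

/* A 6581 SID has 29 registers, so a 0x20-byte window per chip keeps decoding simple */
void sid_write(unsigned chip, unsigned reg, unsigned char val)
{
    bus_write(chip * 0x20u + (reg & 0x1Fu), val);
}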

Yes, as well as new 65816s and 6522s on modern processes. The problem is that those chips were never discontinued, whereas Intel at some point stopped producing 8086s and probably lost or scrapped the masks.

The vast majority of expenses in tech manufacturing are the R&D and startup costs--the actual per-unit manufacturing cost is relatively negligible. TI also still produces an 8-bit chip from the 70s to power calculators that costs about 50 cents to manufacture, because they never discontinued it.

So while it costs WDC nothing to make new 6502s, there haven't been new 8086s produced since the early 90s and restarting production of them would cost $$$.

It's too bad the 8086 was never used much outside of PCs, so there was no real reason to keep it in production, while Z80s and 6502s were heavily used in embedded applications. NASA probably should have picked different CPUs.

You think NASA thought about that in 1982? I doubt they sat down and thought "Hey, 30 years from now we won't be able to get new 8086 chips."

Attached: 9b1076d1d49df8bcfb9b8282a9ba68033bf8caaf_hq.jpg (600x315, 28K)

I should add that the NES 2A03 CPU and PPU have been produced and used in Chinese Famicom clones since the early 90s. Ricoh may have leaked the schematics, since that was around when NES production was winding down and they probably didn't care anymore.

>making the 8086 on 0.13 micron still wouldn't happen because diverting engineering efforts away from p4 northwood (as primary focus at intel) to make only one to two or three 25 wafer lots of 8086 wouldn't make sense unless nasa wanted to pay >>$5 million (like $100 million) for the startup costs.

Ok, but I don't see how it would have diverted any engineering effort. The design work on the chip was done decades ago. I'm not sure what engineering talent would be taken up simply making a chip from a schematic diagram they pull out of an old file cabinet. If you were designing something completely new, then yes.

The way he phrased it wasn't very good, but I think he meant it wouldn't be cost-effective for the relatively small lot of chips NASA would need. As mentioned above, several million units would have to be sold over about 4-5 years to pay off the tooling costs.
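
Back-of-envelope with the figures floating around this thread -- the ~$100M startup number from the screencap, plus an assumed ~$10 margin per chip (purely illustrative):

$100M tooling / $10 per chip = 10M chips to break even
10M chips / 5 years = 2M chips per year

Which is exactly the "couple million a year for several years" ballpark from earlier.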

en.wikipedia.org/wiki/Intel_8086

>Compatible—and, in many cases, enhanced—versions were manufactured by Fujitsu, Harris/Intersil, OKI, Siemens AG, Texas Instruments, NEC, Mitsubishi, and AMD. For example, the NEC V20 and NEC V30 pair were hardware-compatible with the 8088 and 8086 even though NEC made original Intel clones μPD8088D and μPD8086D respectively, but incorporated the instruction set of the 80186 along with some (but not all) of the 80186 speed enhancements, providing a drop-in capability to upgrade both instruction set and processing speed without manufacturers having to modify their designs. Such relatively simple and low-power 8086-compatible processors in CMOS are still used in embedded systems.

This claims there are still 8086 clones being made.

I imagine there's some Chinese sweatshop cranking them out; of course US government agencies aren't allowed to purchase stuff from China.

The much-hated segmented memory on the 8086 was done in the interest of 8080 compatibility.

Why do I keep thinking that reads "much-hatched"?

The clones often haven't gone through radiation hardening validation.

You can't just throw some random 8086 chips into space, you know; millions of dollars of hardware and human lives are at stake.

As I understand it, it was more a function of the manufacturing processes used in the 80s having much larger feature sizes, which made the chips less vulnerable to radiation. What that means is that if you made an 8086 with modern fabrication methods, it would get zapped by cosmic radiation.

It would require additional radiation shielding that wasn't necessary with the original 3 µm manufacturing process.

I'm surprised anyone would still use this ancient shit instead of upgrading to an ARM or something.

Like we've been trying to explain in this thread, they're more rugged and able to handle harsh conditions because of their large, simple dies. The 6502 has all of 3,500 transistors in it; the schematic for it was literally sketched on paper when it was first designed in 1975.

An interesting thing you might not know is that the 6502's instruction decoder has room for 256 opcodes, but only 151 of them are officially defined. The undefined slots led to the infamous illegal opcodes that Apple II and C64 games often used in copy-protection routines; the newer CMOS variants produced by WDC don't have the illegal opcodes anymore.
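
A few of the better-known stable ones, with opcode values as listed in the usual NMOS 6502 references:

/* Some well-known NMOS 6502 "illegal" opcodes (zero-page forms) */
enum {
    LAX_ZP = 0xA7,  /* load A and X at once */
    SAX_ZP = 0x87,  /* store A AND X to memory */
    DCP_ZP = 0xC7   /* DEC memory, then CMP against A */
};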

Those CMOS 6522s aren't exactly like the old NMOS ones in the VIC-20 and 1541 anyway; they have some timing differences.

Getting the actual manufacturing process right isn't as simple and straightforward as you seem to think.

You dolt, there were literally hundreds of different 8088 computers designed and built by hobbyists and companies alike, with all sorts of unique architectures. An IBM PC is far from “the most basic” implementation.

You’re a smelly old Jew sent here to poison the well, you don’t even know a lot about computers, just enough to be a dolt.

Read carefully. Recreating the original HMOS process would be ruinously expensive and take a good 2 years to get right, so nobody would seriously consider doing that.

Also see here. A lot of early-model SNESes have dead CPUs because the original revision chips had a manufacturing flaw--this was a pretty advanced IC by early-90s standards, and Ricoh took a while to get the fabrication process down pat.

Attached: Stack of dead SNES CPUs.jpg (1024x574, 122K)

The IBM PC architecture was pretty vanilla; there's not a lot to it, whereas some x86 machines like the Victor 9000 were much more sophisticated. In the beginning it was joked that PC stood for "Pretty Conservative".

hey OP,
can you send some to me?
I collect old CPUs like these and build small computers around them

It was a little group of engineers down in Florida who had a year and not a huge budget to work with, so they designed the PC to be as basic as possible and avoided proprietary components. IBM were not willing to spend a huge amount of money because they doubted the thing would even be a big success--the original sales target was 100,000 units total.

In the end, demand for PCs was so huge that European sales couldn't begin until 1984 because IBM were backlogged with orders. I don't think IBM have ever really understood the consumer market, which is why they eventually gave up on it and decided to spend the rest of eternity making shitty servers.

Most of the software NASA runs on these chips has hard real-time requirements. The engineers who wrote software for these things could make strong assumptions about when each line of code would actually run and how long it would take.

There are no such guarantees on a modern ARM processor.
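
To make that concrete: with a fixed clock and published per-instruction cycle counts -- no caches, no branch prediction -- you can compute timing on paper. A toy sketch in C; the clock and per-iteration cost are made-up illustration numbers, and on real hardware this would be hand-counted asm:

#define CPU_HZ          4770000UL  /* assumed 4.77 MHz clock */
#define CYCLES_PER_ITER 20UL       /* assumed cost of one loop pass, per the manual */

void delay_cycles(unsigned long cycles)
{
    volatile unsigned long iters = cycles / CYCLES_PER_ITER;
    while (iters--)
        ;  /* each pass costs a fixed, known number of cycles */
}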

I'm sure the 8086 could "emulate" an 8080 without much trouble, but their machine languages really aren't particularly similar. There are even common 8080 instructions that have no direct counterpart in 8086, like conditional call/return.
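
For example, the 8080's conditional calls have to be mechanically expanded into an inverted branch around a plain CALL. A hypothetical sketch of that code-gen step in C (emitting mnemonic strings, not a real translator):

#include <stdio.h>

/* Expand 8080 "CZ target" (call if zero), which has no 8086 equivalent */
void emit_call_if_zero(const char *target)
{
    static int label = 0;
    printf("        jnz L%d\n", label);  /* skip the call if Z is clear */
    printf("        call %s\n", target);
    printf("L%d:\n", label);
    label++;
}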

Why would the segmentation model be particularly 8080-compatible? It's just there to let the chip address more memory than its 16-bit word size can reach directly.
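
For reference, the segment math: a 16-bit segment shifted left 4 bits plus a 16-bit offset gives a 20-bit physical address, hence the 1MB limit.

/* 8086/8088: physical address = (segment << 4) + offset */
unsigned long phys_addr(unsigned seg, unsigned off)
{
    return ((unsigned long)seg << 4) + (unsigned long)off;
    /* e.g. F000:FFF0 -> 0xFFFF0, the reset vector */
}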

Imagine a 6502 manufactured on a bleeding-edge process node. It wouldn't even be a speck of dust.

It might not be the absolutely bare minimum solution, but even the actual bare minimum solution would certainly be something that OP would consider "a metric fuckton of other shit".

>microcontrollers
Those are CPUs.

Also...
>tfw no V20
What a pleb!

>Sorry to burst your bubble Comic Book Guy, yes - they were in old controllers.
No, a microcontroller is an IC either designed for a specific job or a programmable one that HAS a CPU in it.
The 8088 is a CPU, a shitty one but still a CPU. Applications that used it instead of a microcontroller used it as a plain CPU for simple tasks, just like Z80s and M6800/M68000s were used as embedded controllers.

OP confirmed for faggot.

OP does not understand how these things work. I agree that he probably thinks adding an IC or two to make it do something would be a "metric fuckton of other shit".

>Intel has been making housefires since the beginning of time itself

Attached: 1529649444416.jpg (252x200, 18K)

>The IBM PC architecture was pretty vanilla;
I'm sure it was very "vanilla", but that's not necessarily the same thing as "simple". It would certainly be a lot simpler if you used an SRAM chip or two instead of DRAM, and in a hobby project there's a lot you could leave out, like general I/O, the ISA bus circuitry, the DMA chip, the keyboard controller, &c&c.

All that being said, it's still of course a lot more work than just plopping down a PIC on a breadboard.