What does "8 bit" or "16 bit" actually refer to in old school consoles? The CPU?

i think it's the color range or something

Color depth. 8-bit = 256 colors
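To put numbers on that (a quick C sketch of indexed color; the grayscale palette is made up purely for illustration): an 8-bit pixel is just an index, so it can only pick from 256 palette entries.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t palette[256];                     /* 2^8 = 256 possible colors */
    for (int i = 0; i < 256; i++)
        palette[i] = (uint32_t)i * 0x010101u;  /* demo grayscale ramp */

    uint8_t pixel = 200;                       /* an 8-bit pixel is a palette index */
    printf("pixel %u -> color #%06X\n", pixel, (unsigned)palette[pixel]);
    return 0;
}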

It's a marketing term that became prevalent with the Megadrive and the SNES. Since the NES and the Master System had 8-bit CPUs, their successors were marketed as 16-bit.
Funnily enough, the SNES's 65C816 gets called an 8-bit CPU because of its 8-bit data bus, and the Megadrive's 68000 a 32-bit one because of its 32-bit registers. The only unambiguously 16-bit parts are the Megadrive's data bus and the SNES's ALU.
Imagine being this fucking retarded.

Also

color gamut headroom range

megapixels per hour

It turns out to mean very little.
There are very few "pure" 8-bit CPUs out there because 256 bytes of RAM isn't enough to be useful. Even the NES had a 16-bit address bus, as was common for 8-bit CPUs of the time, while the data bus was left at 8 bits.
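To illustrate (a rough C sketch of the idea, not real 6502 internals): with an 8-bit data bus, a 16-bit address has to be assembled from two separate byte fetches, low byte first on the 6502.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    static uint8_t memory[65536];  /* 16-bit address bus -> 64K addressable */
    memory[0x10] = 0x34;           /* low byte of a stored pointer  */
    memory[0x11] = 0x12;           /* high byte of a stored pointer */

    /* two 8-bit fetches combine into one 16-bit address */
    uint16_t addr = (uint16_t)(memory[0x10] | (memory[0x11] << 8));
    printf("assembled address: $%04X\n", addr);  /* prints $1234 */
    return 0;
}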

There are examples of all sorts of computers breaking the mould through the 70s and 80s, with different combinations of hardware components allowing all sorts of vague labels to be thrown around.

64-bit computers these days have countless components with massively wider bus widths, making the whole point kind of moot. There's not really any such thing as an 8-bit computer or a 64-bit one; it's all mostly marketing.

complete horseshit, go try to address 8 GB of RAM with a 32-bit computer, then get back to me on how everything is just marketing

couldn't you do this with a PAE implementation?

Reading comprehension

8-bit systems usually used 16-bit components anyway; it's just marketing

maximum word length

>64-bit computers these days have countless components with massively wider bus widths, making the whole point kind of moot.
Not really.
Modern memory buses are 64 bits wide per module (72 on ECC DIMMs, where the extra 8 bits carry the error-correcting code). PCIe, SATA and USB are serial buses, effectively 1 bit wide; you can have multiple lanes, in which case PCIe x16 amounts to a 32-bit bus (16 TX + 16 RX).
Even in software, PCIe presents itself as 64-bit PCI, and SATA presents itself as 16-bit IDE (in IDE mode, of course).

Paging/bank switching: there are ways to address more than 64K on a 16-bit address bus by just swapping which memory bank is mapped in.
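Roughly like this (a C sketch of the general idea, not any particular mapper):

#include <stdint.h>
#include <stdio.h>

#define BANK_SIZE 0x4000                  /* 16K window in the CPU's address space */
#define NUM_BANKS 8                       /* 128K of ROM behind a 16-bit bus */

static uint8_t rom[NUM_BANKS][BANK_SIZE]; /* more memory than the CPU can "see" */
static int current_bank = 0;              /* latch set by poking a mapper register */

/* The CPU still only issues 16-bit addresses; the cartridge hardware
   decides which physical bank the window currently points at. */
uint8_t read_banked(uint16_t addr) {
    return rom[current_bank][addr % BANK_SIZE];
}

int main(void) {
    rom[0][0x123] = 0xAA;
    rom[5][0x123] = 0xBB;

    current_bank = 0;
    printf("bank 0: %02X\n", read_banked(0x123));
    current_bank = 5;                     /* bank switch: same address, new data */
    printf("bank 5: %02X\n", read_banked(0x123));
    return 0;
}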

There are 32-bit systems that address far beyond 8 GB, but it didn't catch on. Learn to read, btw.
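The math, as a quick C sketch (36 bits is what classic x86 PAE gives you; pointers stay 32-bit, only physical addresses get wider):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint64_t plain32 = 1ULL << 32;  /* 4 GB: why 8 GB doesn't fit */
    uint64_t pae36   = 1ULL << 36;  /* 64 GB of physical space under PAE */
    printf("32-bit addressing: %llu GB\n", (unsigned long long)(plain32 >> 30));
    printf("PAE (36-bit):      %llu GB\n", (unsigned long long)(pae36 >> 30));
    return 0;
}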

x86 memory segmentation
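i.e. physical = segment * 16 + offset, which stretches 16-bit registers into a 20-bit (1 MB) address space. Quick C sketch:

#include <stdint.h>
#include <stdio.h>

/* 8086 real-mode address calculation: two 16-bit values combine into a
   20-bit physical address, so a "16-bit" CPU reaches a full megabyte. */
uint32_t real_mode_addr(uint16_t segment, uint16_t offset) {
    return ((uint32_t)segment << 4) + offset;
}

int main(void) {
    /* the classic VGA text buffer lives at B800:0000 */
    printf("B800:0000 -> %05X\n", (unsigned)real_mode_addr(0xB800, 0x0000));
    /* FFFF:FFFF reaches just past 1 MB (the A20 gate story) */
    printf("FFFF:FFFF -> %06X\n", (unsigned)real_mode_addr(0xFFFF, 0xFFFF));
    return 0;
}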

Address space, which, up until the PS3/Wii/X360, was directly tied to the CPU. The PS2, Dreamcast, Gamecube, and the original Xbox were the very last of the "bit" era consoles, all of them being 128-bit. With the PS3, Wii, and Xbox 360, the approach to console design changed drastically, so the need for an "x-bit" label was completely abolished.

>The PS2, Dreamcast, Gamecube, and the original Xbox were the very last of the "bit" era consoles, all of them being 128-bit
The PS2 had a 64-bit MIPS CPU
The Dreamcast had a 32-bit SuperH CPU
The Gamecube had a 32-bit PowerPC CPU
The Xbox had a 32-bit x86 CPU
Go back to where you belong.

The PSOne, Saturn, and N64 were 32-bit, you dumb fuck. The Jaguar was pseudo 64-bit (actually two 32-bit processors glued together). The Dreamcast was the very first console to be 128-bit, then the PS2 and Xbox followed, then the Gamecube. It's not about the "bitness" of the CPUs themselves, it's about the ADDRESS SPACE available to developers. The Sega Master System, Famicom, Super Famicom (without the FX), and Genesis/Megadrive literally had 8 bits and 16 bits available to devs. You are a dumb fuck, holy shit.

It means the CPU works with 8/16 binaries, e.g. 0 1 2 3 4 5 6 7 instead of 0s and 1s binaries

I see, chess with a pigeon it is.
Here's your (You).

The 6502 and the modified SNES 65C816 or whatever it was called weren't 16-bit. The 68000 in the Genesis and the Macintosh were the first 16-bit.

It's brutal that you refer to chipsets as CPUs.

Data bus width

Stop.