Why did old computers have such tiny monitors?

Why didn't programmers connect their computers to 19" or 25" televisions? I have seen TVs from the 80s with composite video btw.

Attached: DdQRVHVVAAI0wXC[1].jpg (1200x832, 184K)

>I have seen TVs from the 80s with composite video btw

Attached: u.gif (331x304, 1.31M)

what anime is that

Text is sharper on a CGA connection as opposed to an RCA connection.

Cost

80 column mode

Shinryaku! Ika Musume (Squid Girl)

Ever try hooking a computer into a large TV?

The resolution and pixel size are so shit that text is practically unreadable.
It's really only good for video.

Computers were pretty low resolution back then. Pic is the 5K iMac screen (scaled down, the original was too big to post) ... in the corner is the high-resolution (for the time) original Macintosh screen. And CRTs are huge. A 19" TV would take up the entire depth of that guy's desk.

Attached: 5A296833-7C37-4D85-B334-B59C1739C366.jpg (2560x1440, 894K)

?

They're both just composite video signals.

CGA most certainly is not.

I don't actually know how the displays work, but I assume at some point you need a buffer holding a row or column of the displayed values.
Creating a big enough buffer was probably a constraint on the display resolution.
TV was fully analog back then, so it didn't have this problem.
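
The napkin math backs that up: framebuffer size is just width x height x bits per pixel (fb_bytes below is a made-up helper for illustration, not any real card's memory layout):

/* Framebuffer size = width * height * bits per pixel / 8.
   Illustrative only; real cards have their own layout quirks. */
#include <stdio.h>

static unsigned long fb_bytes(unsigned w, unsigned h, unsigned bpp)
{
    return (unsigned long)w * h * bpp / 8;
}

int main(void)
{
    printf("CGA 320x200 @ 2bpp: %lu bytes\n", fb_bytes(320, 200, 2)); /* 16000, fits 16 KB of VRAM */
    printf("CGA 640x200 @ 1bpp: %lu bytes\n", fb_bytes(640, 200, 1)); /* 16000 */
    printf("VGA 640x480 @ 8bpp: %lu bytes\n", fb_bytes(640, 480, 8)); /* 307200 */
    return 0;
}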

You bunch of eedyots, a TV set is designed for PAL/NTSC only; the "resolution" is fixed no matter how big or small the screen is.

When there is no GUI, screen size doesn't matter much.

>CGA connection
h-h-how lewd!

Attached: cga.jpg (359x140, 9K)

They only displayed text; that's why they are book-sized.

neat graphics and UI didn't come until later

>Computers were pretty low resolution back then.
The best they could do was the IBM T221, which was crazy expensive and required a dedicated GPU to run it. It took 15 years, but Apple was responsible for starting the PPI wars, and now UHD is affordable provided the bandwidth is there to drive that many pixels. DisplayPort handles it pretty well, but they're always playing catch-up with higher resolutions. Futureproofing a display standard appears to be a long way away.
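
For a sense of scale, here's the raw pixel rate alone (active pixels x refresh x bits per pixel; gbit_per_s is a made-up helper, and real links like DisplayPort need extra on top for blanking and line encoding):

/* Raw uncompressed pixel bandwidth, ignoring blanking and link
   encoding overhead, so treat these numbers as lower bounds. */
#include <stdio.h>

static double gbit_per_s(unsigned w, unsigned h, unsigned hz, unsigned bpp)
{
    return (double)w * h * hz * bpp / 1e9;
}

int main(void)
{
    printf("1920x1080 @ 60 Hz, 24 bpp: %5.1f Gbit/s\n", gbit_per_s(1920, 1080, 60, 24));
    printf("3840x2160 @ 60 Hz, 24 bpp: %5.1f Gbit/s\n", gbit_per_s(3840, 2160, 60, 24));
    printf("7680x4320 @ 60 Hz, 24 bpp: %5.1f Gbit/s\n", gbit_per_s(7680, 4320, 60, 24));
    return 0;
}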

Are you confusing composite and component? Component didn't exist until ~2000, and s-video with its combined hue/saturation signal line was a lot more common until HD/ATSC was standardized even later. Composite, aka NTSC heterodyned down from VHF, was a fucking mess, and shit looked awful on it. Only thing worse was signals mixed back onto VHF channels, like on the NES and the most ancient of VCRs.

It was common for CGA cards to have baseband TV outputs (i.e., RCA jacks, compatible with those NES-style RF modulators), but actual CGA was a 4-bit digital TTL signal for a whopping, but perfectly distinct, 16-color palette.
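
If anyone's curious how those 4 TTL lines become colors, here's a sketch (rgbi_to_rgb is made up for illustration; the intensity math and the brown tweak for color 6 mimic what a typical IBM 5153-style RGBI monitor did, not something the CGA card itself does):

/* Sketch: 4-bit IRGB -> 8-bit RGB. Each color line contributes 2/3
   brightness, the intensity bit adds 1/3; a 5153-style monitor also
   halves green for color 6 to turn dark yellow into brown. */
#include <stdio.h>

struct rgb { unsigned char r, g, b; };

static struct rgb rgbi_to_rgb(unsigned c)          /* c = IRGB, 0..15 */
{
    unsigned i = (c >> 3) & 1, r = (c >> 2) & 1, g = (c >> 1) & 1, b = c & 1;
    struct rgb out = {
        (unsigned char)(r * 0xAA + i * 0x55),
        (unsigned char)(g * 0xAA + i * 0x55),
        (unsigned char)(b * 0xAA + i * 0x55)
    };
    if (c == 6)
        out.g = 0x55;                              /* dark yellow -> brown */
    return out;
}

int main(void)
{
    for (unsigned c = 0; c < 16; c++) {
        struct rgb p = rgbi_to_rgb(c);
        printf("color %2u: #%02X%02X%02X\n", c, (unsigned)p.r, (unsigned)p.g, (unsigned)p.b);
    }
    return 0;
}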

Text looks like ass on a color composite monitor.

Attached: CGA_CompVsRGB_Text.png (640x800, 206K)

The memes have gotten to me. When I saw the word "outlet", I briefly interpreted it along the lines of manlet, chinlet, wristlet, etc.: someone possessing a weak out.

Attached: 1491640803025.jpg (673x432, 33K)

320x200 resolution max

John Carmack coding Quake on a 16:9 28-inch CRT monitor. 2048 x 1152.

Attached: john_Carmack_working.jpg (468x332, 18K)

Why did they use 16:9?

That Mac had a 512x342 resolution. Standard TVs could only get to 240p without interlace flicker, which is roughly 320x240. Professional CRTs that could sync to higher resolutions in the 80s were incredibly rare and expensive. The first consumer HD CRT was only released in Japan in 1990.

Because the quality was completely arse.
You would not be able to read 80 text columns on a TV.

People used TVs all the time for hobby computing, but if you could afford it you would definitely upgrade to a dedicated computer screen.

as they say
and as I will say: when I was a kid in the early 80s (81/82), dad bought a remote-controlled big-screen TV, big for that time. It caught on fire one morning while I was watching it with my grandma. I remember it as if it was yesterday.

Display standards are never going to be future-proofed because of planned obsolescence: no TV model is sold for more than a year now.
At least 4K already has so many pixels that on 40" displays or smaller there's literally no point whatsoever in even trying to have a higher resolution.
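
For scale, pixel density is just diagonal pixels over diagonal inches; the sizes below are generic examples, not specific sets:

/* PPI = sqrt(w^2 + h^2) / diagonal inches. Compile with -lm. */
#include <math.h>
#include <stdio.h>

static double ppi(unsigned w, unsigned h, double diag_in)
{
    return sqrt((double)w * w + (double)h * h) / diag_in;
}

int main(void)
{
    printf("3840x2160 on 40\": %6.1f PPI\n", ppi(3840, 2160, 40.0));
    printf("3840x2160 on 32\": %6.1f PPI\n", ppi(3840, 2160, 32.0));
    printf("3840x2160 on 27\": %6.1f PPI\n", ppi(3840, 2160, 27.0));
    return 0;
}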

This. Writing stuff like C works fairly well in an 80x24 terminal. It's when you get into more verbose languages and need IDEs that you start needing more screen real estate.

TVs didn't have VGA. Also, composite signal and consumer CRT TVs in general have shit quality. Not good for computers but good enough for watching sports.

what keyboard is that on his knees?

Can you fit a washing-machine-sized TV on your desk?

The original Mac keyboard; it plugs into the front with a phone/network-type connector.

It could fit, the hard part is lifting a CRT that gigantic up there.

if you took it off 16:9 would it die?

>Apple was responsible for starting the PPI wars
Nope. Dell was one of the early pushers of high-resolution screens, though not the first. The problems with Windows scaling made it really hard to use 1920x1200+ screens on a 15" laptop, so PCs degenerated back to lower resolutions.

it would be extremely painful

> Assuming TVs of that era had VGA inputs.
> Graphics cards were too low resolution.
> Some companies made 27" monitors (Sun), but they required proprietary video cards which were very expensive.

It's a big monitor

for you

Because the cost of a CRT display increases exponentially with its size.