Remember when you could upgrade the memory on video cards?

Attached: video card memory slot.jpg (4452x2792, 1.99M)

No

No

No

No

No

Yep. I think I might even still have a video card or two from that era lying around.

Yes. My S3 Trio had additional beds for RAM.

no

Lying is a sin.

No, really, I had a 97-98-era PC with a PCI S3 Trio that had beds for a RAM chip, which I never used, since it was impossible to get one.

Attached: x.jpg (1600x970, 305K)

No seriously, look at the RAM expansion board on my old Matrox card.

Attached: matrox.jpg (2000x1500, 794K)

Attached: peperedge farms remembers.jpg (600x600, 42K)

Well in light of , there's no solution other than to kill yourself.

Genuinely jelly.

This.

Jelly of what? If you want to "experience" that shit just buy a card off eBay for 5 bucks.

soon enough we will have modular GPUs

a base "graphics motherboard" that attaches to PCIe, which can hold a graphics processor, a physics processor, and memory modules

Yes. I even upgraded the memory on the VGA card in my first PC (SiS 6201, I believe) from 1MB to 2MB.

...

I didn't know that IBM made DACs

Attached: 1523590278965.png (485x443, 26K)

getting told this hard is a sin

Attached: DSCF1441.jpg (2304x1728, 1.79M)

i remember i pried the chips off another card to put in those sockets. i don't remember it making a difference to anything.

4th degree burn

>Matrox
>ATI Rage Pro
>Dell OptiPlex onboard videocards


yup...

adding ram wouldn't speed it up, it would enable deeper bit depths and higher resolutions. and there wasn't really anything in dos or 3.1 that would've been able to use 2MB.
1MB could do 24-bit colour at 640x480 or 256 colours at 1024x768, which was fukken nuts on a 14" CRT.
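the math is easy to check if you don't believe it (quick Python sketch, resolutions straight from this thread):

    # framebuffer size = width * height * bytes per pixel
    def vram_bytes(width, height, bits_per_pixel):
        return width * height * bits_per_pixel // 8

    for w, h, bpp in [(640, 480, 24), (1024, 768, 8), (1600, 1200, 32)]:
        print(f"{w}x{h} @ {bpp}-bit needs {vram_bytes(w, h, bpp) / 2**20:.2f} MB")
    # 640x480 @ 24-bit   -> 0.88 MB (fits in 1MB)
    # 1024x768 @ 8-bit   -> 0.75 MB (fits in 1MB)
    # 1600x1200 @ 32-bit -> 7.32 MB (way past these cards)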

It enabled higher color depths at higher resolutions. Kinda pointless on a 14" monitor, but if you had a 19-20" it became relevant.

>not wanting 1024x768 at 24/32bit
>not wanting 1600x1200 at 32bit

Not only would the pixels get tiny on a small CRT, the CRT couldn't keep up with drawing all that shit.
I remember writing a modeline for X that gave me 960x720 on a monitor that didn't want to deal with 1024x768 at nonterrible frequencies. At least it was better than 800x600.
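The arithmetic behind a modeline is just dot clock = htotal x vtotal x refresh (Python sketch; the blanking numbers here are illustrative guesses, not my actual modeline):

    # dot clock = total pixels per line * total lines * refresh rate
    # blanking intervals below are made-up examples, not a real modeline
    hdisp, htotal = 960, 1200    # visible pixels, total incl. blanking
    vdisp, vtotal = 720, 750
    refresh = 72                 # Hz, vertical refresh
    dotclock = htotal * vtotal * refresh     # pixels per second
    hfreq = dotclock / htotal                # lines per second
    print(f"dot clock {dotclock/1e6:.1f} MHz, hsync {hfreq/1e3:.1f} kHz")
    # -> dot clock 64.8 MHz, hsync 54.0 kHz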

> the CRT couldn't keep up with drawing all that shit.
Braindead.

got me wonderin'

Attached: 28685986_1650493745005192_6242787112822129873_n.jpg (295x364, 10K)

>1600x1200 at 32bit
holy fuck yes i wanted that. but that meant asking santa for a maxed-out SGI octane with a 20" trinitron, and where is a little kid gonna get forty thousand good boy points?

Yes.
I don't remember the brand.
But I could upgrade the 512KB of VRAM to 1MB.
Anyone know what the brand could be?

Lol wut?
I had a Mitsubishi monitor around that time that did 2048x1536@85Hz. I still only ran it at 1920x1440.

Cost less than my current 4K display.

no, that was actually a thing. television/CGA had 240(ish) lines and ran at ~15kHz horizontal. at 1600x1200@85Hz you're scanning at ~100kHz. the circuitry that swept the beam in pleb CRTs just couldn't deal.

and now we get fucked on the backstroke: the newer CRTs can't slow down enough to display TV, so if you've got an early computer you need to chase down a multisync/multiscan monitor.
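for the doubters, the line-rate math is simple (blanking ignored, so real hardware scans a bit faster than this):

    # horizontal scan rate ~= lines per frame * frames per second
    ntsc = 525 * 30       # interlaced TV: 15750 lines/s, ~15.7 kHz
    hires = 1200 * 85     # 1600x1200@85Hz: 102000 lines/s, ~102 kHz
    print(ntsc, hires)    # 15750 102000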

Attached: sync or swim.jpg (480x360, 9K)

we're talking about the 1990s, OP's card is from 1997, the s3 trio64s are both 1996, i can't read the matrox but it looks like a 1996 mystique.
your mitsubishi that didn't quite work right at 2048x1536 probably means a diamondplus 220 or 230. these came out in 2000 and 2002.
fucking moore's law, man.

Attached: who.jpg (970x545, 105K)

Faggot you're using an apple

Attached: IMG_20180424_0102.jpg (5869x3739, 2.44M)

>Remember when you could upgrade the memory on video cards?
Remember how slow and crappy they were?

Attached: IMG_20180424_0100.jpg (4449x3238, 1.93M)

yes

Attached: s-l1600.jpg (1600x1200, 204K)

That RAMDAC is one hell of an aesthetic chip. I have one of these too OP.

IBM paid others to make it and slapped their brand on it.

...

no

Dug these puppies out of the spares cupboard

Attached: IMG_20180425_110029.jpg (3328x1872, 1.59M)

Yeah, it was only good for increasing resolution and/or color palette, since those were bounded by how much memory your card had.

How would I go about getting an upgrade module for a Matrox AGP board? I haven't got it with me right now; it's back at my dad's house. I found it resting in the back of a cabinet, and I don't ever remember using a PC with it (since 1998 at least). Good thing he kept it, for whatever reason, but I have no idea how he got it, since he wouldn't have been able (nor crazy enough) to afford one, especially since he only played UT98 back at the time.

We could have had removable GPUs, but nooooooooooooo, Nvidia never let it get past the prototype phase when they bought all of their intellectual property

Attached: rampage.jpg (850x588, 91K)

Sorry gramps, memory is too high-speed nowadays to run through a socket instead of being soldered straight down


>Also, no

I had that same Matrox card sans the expansion. I paired it with Voodoo 2 instead.

I'm kind of sad that I trashed all those old PCs I had.

S3TrioV2 was the GTX 750 ti of its day - hung around for ages way past its introduction as a placeholder. Anyone remember the RTL8139? They always worked in even the most broken computer and software situations. I'm guessing the socket was for firmware extensions? Nothing like that today, such a shame.

Attached: rtl8139.jpg (640x448, 91K)

got a few NICs with empty chip sockets (3com, dlink and rt) - always wondered what they were for... The dlink has
>BOOT ROM
on the pcb where the chip would sit(?)

I never paid more than $100 for a CRT.
Lucky we had an Oil company and a few Mining companies in town - so a 3 year old DEC or Sun branded trinitron was usually ~$50.

Never had any SGI stuff come through though - I guess IRIX wasn't for 'real work' (which is shitty, because I always wanted an Indigo)

Lolno. My buddies ran the college computer club in the 90s. When the RTL8139 came about they went "oh, wow, cheap 100mbps" and bought a boxful straight from the distributor.
They burnt a card every week until there were none left and then resorted to buying 3com.

RTL8169 is a different story, that works pretty well.

Post yfw you'll never have 2tb of vram

Attached: download.jpg (650x650, 30K)

It's slow RAM, anyways. The regular FE would be a steal if it wasn't for the miners, 16GB of HBM...

It has HBM too, 16GB of it.
It also has a 2TB SSD.

IRIX saw use in those industries but just for different purposes, usually visualization
my first good SGI box came out of a mine where it was running maptek software probably off of an even bigger SGI system like an onyx or challenge

Who cares about that OP.
Modern hardware is much much worse.
When was the last time you found a cheap laptop with sockets instead of BGA?

>tfw your igpu dies and you get a bluescreen
>no $1000 SMD oven to replace the solder balls and gpu
>"heatgun fix" is only a temp fix and gets worse over time
>no external GPU support on board
Trash.

>came out of a mine
Like a mining site?

a socket would just be an additional failure point that can get dirty or less reliable over time, or even straight up break, just as often as any poorly cooled surface-mounted component

there hasn't been a good argument for socketed VRAM since even entry-level cards could handle 1600x1200x32 without giving a single fuck

believe so yeah

I'd still have a working gpu and cpu though.

That's badass man.

until your socket/edge connectors fail, which is way more probable than an IC or properly surface-mounted component failing

I used them for years as debug tools, a headless wonder like RS232 but better. The software was like a bicycle, it just werked. Must have gotten good examples from the manufacturer (there were lots).

> still into haruhi
Nice.

>Virge
At least it will load all the textures with the expansion, but...

Sort of. I was in elementary school back when these were only in school computers, and I had a 500MHz machine with an Nvidia card at home. I did burn one of these while trying to "service" the jumpers.

Fucking love the graphics on the box

I've never had a socket fail on laptops or desktops, ishmael.

I've had 3 laptops and a dgpu fail because of BGA.
The issue with BGA is that the solder is lead-free now, which is more brittle and cracks under thermal cycling.
Sockets, while lead-free, are a solid contact, not a ball.

It's the same issue that happened to the Xbox 360s.
If it was socketed you could replace the mobo and continue on with your cpu/gpu/ram.

With BGA you pay extra for a replacement mobo because it has the cpu and gpu soldered in.

Don't be a cuck and lie to me.

Mr. Bones discovers new wild ride

No

>BOOT ROM
yep, you put the PXE/etherboot code on a PROM to make the card a bootable device, then you ENABLE LAN BOOT ROM in the BIOS and it'll fetch an OS from a remote server to boot your diskless system.
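the server side is just DHCP plus TFTP; something like this dnsmasq config does it (a sketch only - the address range, boot filename, and path are made-up examples, not from this thread):

    # hand out addresses and point clients at the boot image
    dhcp-range=192.168.1.100,192.168.1.200,12h
    dhcp-boot=pxelinux.0
    # serve the image over TFTP from this directory
    enable-tftp
    tftp-root=/srv/tftp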

Probably he's jelly of the fact that companies these days are so filthy greedy that they don't create modular GPUs, and instead sell slightly-higher-RAM versions of GPUs for a shitload more. Ever thought of that? Or are you content gaping your maw at every shitty practice all the companies pull?

Cool - guess I'm showing my age - most NICs I've used have had PXE onboard.
Thanks for the info.

Yup. Still got the memory boards for some early Matrox PCI cards.

What the fuck are you talking about, bunghole
>what is thermal expansion
>what is tin plague