Oh wow! This card has 256MB of RAM, it must be a lot faster than that 64MB Radeon 8500!

>Oh wow! This card has 256MB of RAM, it must be a lot faster than that 64MB Radeon 8500!

Attached: 970721791387713347.jpg (625x417, 76K)

>tfw technology will never again improve by leaps and bounds like it did in the 90s and early 2000s

On the bright side I don't have to upgrade very often anymore, but a lot of that early "magic" is gone now.

Yep, but this is not exactly "leaps and bounds".
It's a 5200.

I don't know man, a 4x VRAM increase in adjacent generations? That level of relative improvement is unheard of today.

I think most of the early ones, when the 5200 actually was the latest and "greatest", were 64-128MB though.

It was most likely some slow as fuck chip they had too much of in stock, so they made this monstrosity.

Entry-level chipsets like that were still better than shitty IGPs, and were a cheap way to get multiple monitors too.

Indeed, but it's quite obvious that this card was aimed at idiots who buy video cards by the amount of VRAM instead of any actual performance figure.
I wonder what the modern equivalent of it is.

The FX 5200 was the first card I bought with my own money, but it was the 128MB, 64-bit bus version.

Whichever has GeForce on it.

I still think there's some truth to that angle, but I find it easy to imagine cards like that being a simple "why not, it's cheap" kind of decision, with the added benefit of picking up some retard sales too, like what you're talking about.

Not like it matters either way; both groups of users are exactly the kind who would have bought those cards. I just hate assuming the worst of things.

"The chip can do it, so we will do it" kind of deal.

Exactly, you got me.

There is no modern equivalent, because on the consumer level software is nowhere close to hardware yet. People who buy new releases of GPUs/CPUs year after year are actual morons unless they're using them professionally or do video rendering as a hobby. Whereas 15-20 years ago, software was almost always limited by hardware and manufacturers couldn't keep up.

I meant more like a product that abuses a performance number that's mostly useless by itself but that people judge devices on.

I remember sessioning Battlefield 2 on one of those bad boys; it hated it.

It was not a very good card.
Actually, it was slower than the GeForce 4 MX in T&L, because the T&L on the 4 MX was so good it could do CAD as well as the Quadros, so Nvidia crippled it hard on the 5200.

what a neat thread

Attached: 1462903937182.jpg (638x792, 61K)

Good, I like that every major part in my computer is 5 or more years old and I can still gaymen at ultra settings, 60fps, 1440p.

I had an FX 5200, got it when Lineage II came out. Good times.

>ATI RAGE 4MB VRAM PCI
>ATI RAGE Pro 8MB VRAM AGP version comes out
>it's fucking slower somehow

Why was ATI so fucking bad at making drivers?

>was

Cheapest Nvidia card that supports Aero.

Intel's i740 was a bit like that too. For the AGP version, they used a feature of AGP where the card kept its textures in system RAM instead of its own onboard RAM. The PCI version of the card ended up being the better one due to having faster dedicated RAM.

I had the 128MB/128-bit one

Pretty OK

what server?

Megapixels on cameras, GBs of RAM/DRAM/storage (as in "this phone is faster, it has 8GB of RAM"; obviously it does matter in some ways).

tfw used a P4 and FX 5200 until 2011

Attached: chippu.jpg (816x960, 58K)

tfw fell for the FX meme 15 years ago

The megapixel meme; this one will never die, will it?

Nvidia needs to invest in bringing back the boxes with sexy fairies and goddesses and shit on the front.

They're good for Splinter Cell. The game used shadow buffer tech that only Nvidia cards at the time supported (GeForce 3 Ti, 4 Ti, and FX). The game just doesn't look right and the shadows are not as effective as they should be on Radeon graphics cards. I think the later X1000 series from ATI supports those shadows in Splinter Cell tho.

And sexy fairy demos as well.

Early soft shadow tech was cool before deferred lighting became a thing.
Wish real-time raytracing hardware would htfu.

I wasn't on the US or UK server.

i want to fug sakurako

Plus! This card contains some form of 3D acceleration!

Attached: MAXIMUM ACCELERATION.jpg (1200x684, 170K)

It's almost happening again for the first time this decade. Ryzen CPUs forced Intel's hand and now both of them have CPUs about twice as fast as a Skylake i5, which is fucking nuts. Graphics-wise, the jump from Maxwell to Pascal was the first major generational leap in graphics technology this decade, and ray tracing is right around the corner. The difference between 2015 and 2020 will be light-years ahead of the difference between 2010 and 2015.

The dedicated memory made the i740 PCI significantly more expensive, so it wasn't cost-effective.

>S3 ViRGE

Lol no, it had 3D deceleration.

I bought this fucking thing at a Best Buy after learning my Pentium 4 system's integrated graphics could barely play Halo CE, back in like 2005 I think. This wasn't even an improvement.

>Why was ATI so fucking bad at making drivers?

ATI's drivers weren't that bad by late-90s standards; I've dealt with much worse.

Attached: 404_S3_Savage3D_SuperGrace_VA-391_top_hq.jpg (1600x785, 289K)

>YOU HAVE ACTIVATED MY TRAP VIDEO CARD

Not necessarily true. You might not get the leaps in speed but we're going to get other things in return. In the near future we'll probably see more desktops powered by ARM, POWER, RISC-V, and maybe others. We'll see better power management, longer battery life, lower temps to the point of fanless cooling being everywhere in mobile stuff. We might even see a resurgence of SD slots, headphone jacks, and removable batteries. There are lots of ways to improve devices, especially in software. Hopefully new operating systems like Fuchsia take off. You don't have to double the RAM and CPU speed every release. It's less useful than other improvements that can be made, and those improvements aren't that expensive.

Also, way in the future we might see extremely modular mobile devices. Imagine a laptop that's basically just a screen and keyboard, and you plug in 4 or 5 modules for a hard disk, RAM, battery, and main board (with integrated CPU and GPU), with others for USB ports, card readers, HDMI, etc. being optional. You could just flip a hatch open on the side after removing a screw or two and swap these modules any time for upgrades. Companies could offer unique modules with various I/O for specific things like audio, eSATA, or PS/2. It's better than a desktop dock, and could be used with those too. The modules could be used in tablets or even large smartphones.

Attached: bladerunner.jpg (1830x2698, 512K)

It might. In high-end cameras the trade-off is generally worse noise performance, so for some applications (studio work where the lighting is controlled, or any other well-lit scenario) more is pretty much always better. That being said, many lenses aren't actually good enough to deliver the resolution of a pixel-dense sensor, which is what we see in phones with higher megapixel counts, and then they just end up with worse noise performance. It seems phones have been cutting back on trying to one-up each other on megapixels tho, and in the case of the Pixel 2 they increased the sensor size, which improves noise performance.
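Rough numbers to show what I mean, just a back-of-the-napkin sketch: the sensor dimensions below are my own assumption of a typical ~1/2.3" phone sensor (not any specific phone), and the megapixel values are only illustrative.

# Sketch: how pixel pitch shrinks as you cram more megapixels onto the
# same small sensor. Assumed ~1/2.3" sensor dimensions, illustrative only.
import math

SENSOR_W_MM, SENSOR_H_MM = 6.17, 4.55  # assumed sensor size (not from this thread)

def pixel_pitch_um(megapixels: float) -> float:
    """Approximate pixel pitch in micrometres for a given megapixel count."""
    pixels = megapixels * 1_000_000
    aspect = SENSOR_W_MM / SENSOR_H_MM        # roughly 4:3
    height_px = math.sqrt(pixels / aspect)    # pixels = aspect * height_px**2
    return (SENSOR_H_MM / height_px) * 1000.0 # mm -> um

for mp in (8, 12, 16, 24, 48):
    print(f"{mp:>2} MP -> ~{pixel_pitch_um(mp):.2f} um pixel pitch")

Cranking the MP count on the same tiny sensor just shrinks the pitch toward ~1 um, which is exactly where noise gets worse and the lens usually can't deliver that resolution anyway.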

It's mostly dead. For the last few years, it's been 8-16 MP for phones, 16-25 MP for consumer cameras, and 25-50 MP for pro cameras, with barely any increases (and the exceptions generally have *fewer* megapixels than average, like the Sony A7S).