Let's say

...that computers hit a brick wall in the late 80s/early 90s, and that for reasons no one could discover, no further advancement was possible.

Hardware is still made, and software still written, but there's no going forward.

Would you still enjoy computers if they were that limited?

Attached: P1200651.jpg (3584x2394, 2.98M)

Yes, because I had a computer while growing up in the 90s and it was really cool. I don't see why it wouldn't have kept being cool if I'd never known what computers could have become.

they're still productive machines
you might not be able to process as much info, but maybe it's better since that'll take more people to do the work and everyone stays employed

What if in 2019 computers hit a brick wall and no more advancement was possible?

Attached: Gtsjws+wipe+out+right+wing+america+gtflood+america+with+_f9641709909661fc02cc8440656a08be.jpg (979x832, 201K)

Beige monitors/computers/keyboards are aesthetic as fuck.


Also, you're forgetting: Linux wouldn't suffer any hardware compatibility issues anymore. No problems with Wifi, no problems with the latest video cards, no fancy touchpads to configure with synaptics. It would be heaven.

>no botnet
>no bloat
>no ranjeets

Oh no. How terrible. How could I cope?

At least they should have NE1000/2000 support. Otherwise you're right.

this, but I think quality would drop considerably: since they can't launch new products, they'd make them last only a year.

Considering the (comparative) simplicity of these machines, it's likely all that would do is create a vibrant aftermarket for repairs.

I would enjoy them a lot more.

maybe. but I think companies will always find a way to screw the consumer, at least the normal user.

Computer technology did hit a wall in the early 2010s and had stopped meaningfully improving by 2014. You all seem to still enjoy using tech, though.

The idea of free software would probably be a lot further along too, since there would be little to encourage new purchases outside of marginal gains.

I did learn something interesting the other day though: when Lotus 1-2-3 was written in assembly it fit on a floppy disk, but when they switched to C it ballooned to over 4MB.

I don't see where the hypothetical problem would reside? That you can't play "muh flavor of the month TreeDee battle royale XD"?

That's a very good looking setup, everything so clean.

early 90's would be 486/pentium tech, i'd be fine with it. i was using it up to the 2010's because all i had was dialup, so any net pages would time out.

>IIsi
>MFW literally the first computer I bought myself in 1996, $35 at a pawn shop
>Marathon, Photoshop 3.0, System 7.5.3
>I miss that machine

>amd literally announced a 7nm cpu in 2020

DUURR DURRR BRICK WALL IN 2019

Consider the first ever NES game vs some of the last. The difference in quality is off the charts.

Anyone who is actually into games knows that PUBG is the only BR worth playing. Fortnite is for casuals who got on board because Drake was paid to play it.

I hope alt-Jow Forums didn't fall for the color meme.

Attached: 2864.jpg (600x450, 28K)

So let's say Intel announces a 1nm CPU later today. Then what?

I would be fine. Humans went to space and the moon without iPhones.

So, 486? Having so little RAM would suck, but SMP makes desktop usage infinitely more bearable. As long as it has enough hardware features to run a proper operating system with preemptive multitasking and protected memory, it should be more or less fine.
>still using CRTs
>software rendering for 3D
>having to actually optimize code you write
Sounds like fun.

>preemptive multitasking and protected memory
It's very unlikely you'll be doing anything that would require them on hardware of this level as a home user.

>not multitasking
>not wanting a stable environment for productive work

Also some demoscene creations.

Having such severe limitations is often a good thing since it channels everyone's vision. If you're designing a UI and have a screen of 640x400 or something and that is fucking it, you'll get the absolute most out of every pixel.

>>implying any genuine multitasking on these platforms

Having a clock and a word processor open at the same time isn't "multitasking" beyond the simplest definition.

>>not wanting a stable environment for productive work
You really don't have the resources to spare for protected memory. Do you have any idea how much slower that would make a machine with 1MB of RAM and a 10MHz processor?

We'd probably get a lot of computers based on linking multiple machines together; the Risc PC, for example, could have a whole other unit act as a slave to the first.

Most modern technologies and ideas existed in some form by that time, and since the brick wall doesn't carry over to ideas I think we'd still be able to keep software and hardware interesting with the right mindset towards it.

But I guess it really depends on how hardware and software vendors would react to such a thing. Would they just give up and accept their fate as boring appliance makers? Or would they go even crazier with attempts to differentiate themselves from the competition? I think the latter world, combined with continued creativity, would still make for reasonably exciting technology.

>what is an MMU
Plenty of 10 MHz/1MB systems in the 80s were capable of multitasking. Besides, if you stretch OP's "early-90s" upper limit as far as it will go, we'd be talking about 1994/1995 hardware, more than capable of competent multitasking with the right software to match.

you'd see companies offering access to expensive, high-end computers, with home machines used as thin clients (an idea that was already being pushed in the early 90s)
the big loss you get is regarding multimedia, although everyone would probably just pony up the dosh for an MPEG card

more advanced home users would just buy a new machine every once in a while if they needed real computing grunt and use something like GNU parallel
or maybe Plan 9 would actually take off, possibly via someone buying the rights and commercializing it
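For the curious, farming jobs out to a second machine with GNU parallel looks something like this. A minimal sketch, assuming GNU parallel is installed on both ends and you have key-based ssh login to a second box; the hostname `box2` is hypothetical:

```shell
# Run gzip jobs on both the local CPU (":") and on box2.
# --transferfile ships each input file to the remote machine,
# --return fetches the resulting .gz back, and
# --cleanup removes the remote copies afterwards.
parallel --sshlogin :,box2 --transferfile {} --return {}.gz --cleanup \
  gzip -9 {} ::: *.log
```

The same pattern works for any batch job that takes one file in and writes one file out, which is most of what a home user would want the extra grunt for.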

There'd probably be a kind of settling, like how for decades the basic idea of a mechanical typewriter stayed the same (a system of levers causing a hammer to strike the ribbon against the page). While there would probably be a lot of expansion cards, the vast majority of users (probably a lot of hobbyists as well) would decide "fuck it" and just stick with the stock device for the most part.

Which leads to another question: what happens to price? Would it go down with time, or stay the same? Is it just the chips that stagnate, or storage as well? It would be funny to imagine everyone on these old devices but with entire TBs' worth of storage for text files.

Computers would be massively more productive for general users without the array of useless software

Computers would be massively less productive in scientific fields

I know I was never more productive than when I used my old DOS PC for college rather than my (then-new) XP machine with games, music, video, etc. Christ, even to this day I can't believe I wrote so much and worked so hard.

Massive BEOWULF clusters running the internets.

Well, there's not much *to do* with them other than work, right?

I know. It was wonderful.

>PUBG
>BR worth playing
[citation needed]
btw, i'm not implying fortnite is worth playing

we've had this thread before
eye cancer 70% one thing
i'm more interested in how far we could push hardware from that era with actually optimized software instead of today's garbage

My 2007 Sony VAIO is practically the same as 90% of the laptops at Best Buy in 2018, minus the touchscreen gimmick.

Quantum

A lot of software from the 80s-90s wasn't as good as it could have been either, and often didn't utilize the hardware that well; for example, most home computer arcade ports were garbage.

> computer hardware would be standardized and commodifed by now
> free software would have BTFO most proprietary software by now
> CRTs
> dial into BBSes
> play MUDs
> no smartphones
> cars would not be locked-down, planned-obsolescence infotainment clusterfucks

I would probably be happier.

we would have found a way

Attached: lciii.jpg (1024x768, 486K)

>> cars would not be locked-down, planned-obsolescence infotainment clusterfucks
Seriously. I heard an advert for a car the other day where the main selling point was its voice assistant.