Let's say

...that computers hit a brick wall in the late 80s/early 90s, and for reasons no one could discover, no further advancement was possible.

Hardware is still made, and software is still written, but there's no going forward.

Would you still enjoy computers if they were that limited?

Attached: T1i_7474s.jpg (1280x853, 281K)

Other urls found in this thread:

theregister.co.uk/2017/06/05/maxx_interactive_desktop_revives_sgi_irix_and_magic_desktop/
youtu.be/ZXzXFbtOON8
os2museum.com/wp/fast-unaccelerated-vga/
os2museum.com/wp/about-those-trident-vgas/

Limited? They weren't limited, today's software is just bloated.

Also yes, I still do.

>They weren't limited
Watch a 4K video, while playing a AAA game, and run a web browser with 200 tabs open in the background on a DOS machine.

Well, you could not watch high resolution anime on those, but they would still be cool.

Yes.

To live in a world without bloat and normies I'd even pay 1990 prices for just a floppy disk drive.

>implying any of this would be possible without advances in computer technology
are you retarded

Yeah. I use mostly minimal software on GNU/Linux. It only would've changed my childhood. Possibly saving me from the video game addiction I had for so long. Maybe I would've learned more and been in a better spot now.

>Watch a 4K video, while playing a AAA game, and run a web browser with 200 tabs open
This could also be called "limited".

>AAA game
There were AAA games back in the early 90's also. You don't seem to know what you're talking about.

>on a DOS machine.
Obviously you're a retard.

>There were AAA games back in the early 90's also.
No there weren't. AAA games are a particular brand and genre of game that pushes the boundaries of the medium and has a massive budget.

It's likely that the shareware movement and the free software movement would coalesce, and because there'd be a limit to what could be considered worth buying, we'd see the end of proprietary software much quicker: there'd be little else to dedicate all that time to.

I could have made Windows 3.1 or 95 work forever. So yes that sounds rather good. The Apple and Unix options of the day seem fun too. And simple terminal computers could actually be very useful to this day.

It would keep people imaginative, that's for sure.

Watching that guy from Traveller's Tales talking about all the tricks in Sonic 3D has really made me appreciate the ingenuity and skill of these people.

>And simple terminal computers could actually be very useful to this day.
They are useful, sadly just not for anything that's relevant today.

Exactly. What do you think games like Terminator or Doom were?
Fucking retard. AAA is just a marketing term for a category of development.

Well, in the early to mid 90s there were some pretty fucking kickass MIPS workstations that ran Unix with a custom X/Motif UI. These were made by SGI, and I didn't find out much about them until it was far too late, when IRIX on MIPS was starting its slow decline towards death. This was around 2005 or so, when I bought them used for pocket change on eBay. One of the little $100 O2 machines I got came loaded with IRIX 6.5, Adobe and Matador stuff, and best of all a full MIPSPro compiler license and all the software, so I was able to build some cool stuff here and there before freetard compilers got sufficiently ported over.

To answer your question, fuck yeah I'd be more than happy to use Silicon Graphics machines today. Their OS, IRIX, is a bitch and a half to install, but it's still to this day the most pleasant and elegant UI I have ever used. Literally nothing can compare to it. There is no place I'd rather be than in front of a nice SGI workstation for the rest of my days. The IRIX login screen is my happy place.

I think what hits me hardest in my feels is that most of the IRIX source code is probably lost or literally rotting away on tapes in a box somewhere, never to be open sourced or even used again due to the licensing clusterfuck. Besides, even if it were open sourced, all of the brilliant people who made it are gone, and they took all the magic with them.

I also miss the old Apple PowerPC machines and Mac OS X 10.4 Tiger. Those were also some nice machines. A close second favorite of mine for hardware and software. Unlike IRIX, at least I can boot this in QEMU when I wanna be comfy.

Attached: 35f6eea5648717cbc0299481e0f35a85--wallpapers-phones.jpg (540x960, 40K)

I think what's missing from modern computing is the importance of elegance. And the stack has gotten so tall that we'd never be able to start from scratch ever again.

nice bait

>Would you still enjoy computers if they were that limited?
Well, software, protocols etc. would all get optimized to the level of demoscene programming, so it'd still all be very different to late 80s/early 90s computing we remember.

Yeah, the level of efficiency we'd be at by now would be totally unreal.

Absolutely

>Watch a 4K video
We had video streaming in the 90s. It looked like shit but we didn't know any better so it didn't matter

>while playing a AAA game,
The mind/imagination is amazing at filling in the blanks when it comes to pixel art. It was just as immersive.

>web browser with 200 tabs
Why would you ever do this? If you have more than 5 tabs open at one time, you are doing it wrong.

I'm not sure they would be as limited as many people think. A lot of programs are minimally optimized, if at all, because you can just brute-force the computation with the tech that was developed since you started designing the program. Think about console vs PC gaming: console games have to be heavily optimized where PC games don't, because console hardware only advances in leaps instead of constantly and gradually.

I think it would be incredibly interesting to see what solutions they would come up with if they couldn't simply throw more cycles at it.
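
For a taste of what solving things without extra cycles looked like, here's a minimal sketch of the classic table-lookup trick: precompute an expensive function once, then do a single array read in the inner loop instead of a libm call. The 256-entry table and 8.8 fixed-point format are just illustrative choices, not from any particular program.

#include <stdio.h>
#include <math.h>

/* Precompute sin() into 8.8 fixed point, indexed by a 0..255 "angle"
   instead of radians; each later use is one table read. */
static short sintab[256];

static void init_sintab(void)
{
    int i;
    for (i = 0; i < 256; i++)
        sintab[i] = (short)(sin(i * 2.0 * 3.14159265 / 256.0) * 256.0);
}

int main(void)
{
    init_sintab();
    /* index 64 is a quarter turn, so the entry is ~256, i.e. 1.0 */
    printf("sin at a quarter turn: %d/256\n", sintab[64]);
    return 0;
}

Game and demo code of the era leaned on tables like this for everything from rotation to perspective, trading a little precious RAM for a lot of precious cycles.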

The nice thing about the Web is that any von Neumann machine with sufficient CPU power and a bitmapped display can handle it. Imagine if the world was still as locked in to i386 Windows binaries as it was in 1998.

>talking about the late 80's and early 90's
>only mentioning Windows and DOS machines
I feel bad for everyone here who was stuck in the Wintel-only shithole when everything else was literally better.

It's a recognisable paradigm.

theregister.co.uk/2017/06/05/maxx_interactive_desktop_revives_sgi_irix_and_magic_desktop/

>I think it would be incredibly interesting to see what solutions they would come up with if they couldn't simply throw more cycles at it.

It would be great to live in a world where they add features while also shaving off bytes of memory.

Instead of the world where things get more basic but the resource draw increases.

And now present your opinion without nostalgia talking.

Absolutely. Keep in mind that 90% of the software I use today would probably still exist because it's almost all command line tools that would run just fine on a computer from the 90s. I know this because I use a lot of that same software on a 450MHz G3 PowerMac. Granted, it is from 1999, but that still counts.

First of all, as someone who used a 133 MHz Pentium laptop from 1997 up into the late 2000s without many complaints, beyond it not being able to handle some newer things and me not having the hardware to connect it to the internet: yes, I would.

I have to ask though, what kind of brick wall are we talking about? Are we talking about no longer seeing process node shrinks back in the late 80s/early 90s but still being able to reduce the cost of building computers, or about computers completely hitting a wall and staying at the same price as well? And would we still see advancements in networking, rather than most people being stuck on dialup, or would that be limited too?

A scenario with no more process node shrinks, but with manufacturing still getting more efficient and networking still advancing, would definitely be interesting and potentially extremely comfy. You'd still see home computers become more powerful to an extent as multi-CPU models decreased in price, at least early-2000s levels of computer adoption, PDAs/UMPCs/basic smartphones still becoming somewhat popular, and basic 802.11-1997 wifi (since WaveLAN, which it evolved from, came out in 1991 and offered similar speeds).

Honestly, simply being able to access the internet, even if you leave out the web and just have email/gopher/usenet/IRC/FTP, would be too great a resource to pass up, and I could definitely see it being adopted by anyone who interacted with it during school and had somewhat of an interest in learning/technology, especially if computers continued to come down in price.

Crap, now I wonder if there's any fiction that takes place in such a potentially comfy alternative timeline.

Still plenty you could do with them, if they remained as diverse and exciting I probably wouldn't really mind it. But if they just became cheap, same-y and appliance-ified as they are now at an earlier point? I would probably be bored out of my ass.
Man, you were either shitting in diapers or not alive yet if you actually believe that era of systems wasn't generally limited. Toyish operating systems with no notion of basic memory protection or security in general, video controllers that could barely handle a usable resolution out of the box without shitting themselves, tiny hard disks on the shitty ATA bus, processors with optional FPUs and tiny caches, so slow that a bump of a few megahertz was a noticeable improvement. So many everyday tasks, especially multimedia-centric ones, were still the exclusive domain of cost-no-object servers and workstations, and even those had their own architectural limitations for a number of tasks. Fuck, even something as basic today as real-time textured 3D was pretty much out of reach of anything under six figures; even SGI shit didn't accelerate texturing in hardware unless you paid out the ass.

It's not just the same shit with smaller numbers, or something you can just blame on the meaningless "bloat" buzzword so often used by morons who have never worked on a piece of real software in their life. Hardware was visibly limited from the start for a lot of the things people would have liked to have, right up until things started plateauing in the early 2000s.

There'd probably be a point of diminishing returns.

I mean, on the subject of multi-core units you'd quickly run into an issue of space if we're talking about 25MHz being hot shit, and needing extra co-processors to help it along, so they'd likely stay single-CPU for most home users with only a minority of enthusiasts having the expanded towers.

And kind of the same with the net. If you have a standard machine that is mostly reliant on text files, then people would struggle to find uses for anything faster than 50 KB/s of bandwidth.

And this assumes that storage doesn't also hit such a wall.

>would you still enjoy it?
>posts comfy Amiga
You know it.

Attached: amiga-1200.jpg (4000x6000, 3.69M)

mane you posted literally one of the most /comfy/ computers of that era, I know for a fact some fags STILL do their work daily on the Amiga 4000 and 3000.

>muh gaymen
Games were unironically much better in the 90s.

Only because you grew up with them and they remind you of a time when you weren't a bitter, whiny faggot.

No. Games have advanced in complexity but most of them are less fun. This is the entire reason /vr/ exists.

Games, like nearly everything it seems, have become "experiences".

Yes, because that would mean that there would be no smartphones as we know them today, thus not having to deal with this cancer.

I agree that /vr/ is full of mindlessly nostalgic types who believe their opinions to be facts, and believing doesn't make them right. That's not to say old games can't still entertain people, especially those with a sentimental attachment to them, but it doesn't mean they're really any better than any other era's. The '80s and '90s spawned plenty of shovelware garbage with simplistic and boring gameplay, terrible graphics that are difficult to immerse yourself in, and laughably pathetic/cringy storylines, characters and dialogue. Even the good stuff had its own share of brainless detractors who complained about it the same way /vr/ faggots do.

Whatever, I just hate it when people mindlessly worship an era when there's plenty of garbage no matter how far back you go.

>on the subject of multi-core units you'd quickly run into an issue of space if we're talking about 25MHz being hot shit, and needing extra co-processors to help it along,
The 486 had an integrated FPU and the higher end models could run at 50 MHz, and while space would be an issue, the gains from a multi-processor build would definitely be big enough that it would probably be mandatory on enthusiast builds.

2D games are comfier than modern 3D games, so at least they have that going for them.

>"comfy"
Please, just fucking stop while you're ahead.

If we hit a hard limit in computation, as we are slowly doing now, you would see an explosion in different computing paradigms.
Watch a 4K video on a million DOS machines networked together in a small space. Oh wait, that's a GPU.

Take a look at the demo scene.
youtu.be/ZXzXFbtOON8

if that were the case they would've died out in the consumer realm and been relegated almost exclusively to research/accounting/military use

Revision demos get run on a Titan X or something.

GPUs are not general-purpose processors

Just imagine how optimized the software would be for such limited resources.

This would be amazing!

this guy gets it

`tism in action.
GPUs are stream processors that only perform floating point computation and have vector instructions optimized to compute matrices.

Intel's "manycore" processor, the Xeon Phi is a general purpose processor, laid out in silicon the exact same way a GPU is, except it supports a wider variety of instructions. That's the only difference.

The echo on mount stupid must be pretty intense.

What's wrong? Games that don't require precise analog control over 4 degrees of freedom, like the vast majority of modern games do, are objectively better for relaxing or playing while tired. 2D games generally only gave you 2 or sometimes 3 degrees of freedom and were designed to be played with a D-pad.

paraphrase that from wikipedia all by yourself big boy?

We'd probably be at a level of optimization that would make even the slim software of the time look like bloated trash.

Attached: commodore_amiga_1200_photo_ad_remake_no_2_in_3d_by_zgodzinski-d7ax9bb.jpg (2560x1440, 1.07M)

>being this unable to handle the fact that your sophomore computer architecture class couldn't teach you sense

Attached: time for your (You)s.webm (1280x720, 2.86M)

thanks for sharing this, I had no idea

Attached: 1518561169949.jpg (900x900, 57K)

>projecting this hard because someone called you out as a retard for comparing a modern GPU to a fucking cluster of DOS PCs
if you knew as much about architecture as you're pretending to, it wouldn't occur to you to make such an asinine statement in the first place

>backpedaling this much and trying to nitpick at the generalizations of a single sentence answer meant to convey a point
you should just stop now, this is pathetic.

It must get tiring being so autistic that you can't see the forest for the trees.

Attached: 1524307243925.webm (1280x720, 1.5M)

>mfw somehow ended up with 6MB of MOD files

Anyone know a good floppy cataloguer?

>in before calling me a poorfag for not having a 40MB hard drive

I can agree with that. I just mistook you for one of those obnoxious hipsters that constantly uses the word "comfy" to justify liking anything old and crusty.

Though I don't really find a lot of 3D titles that difficult to handle in that kind of state, at least when I have a suitable controller.

I think a big problem with 3D games is that the imprecise nature of the controls means there needs to be a degree of room for error that makes movement no longer fun; it just becomes a mode of travel rather than part of the gameplay. In a way, moving the character has become busywork.

nigger that was what I was calling you out on from the beginning, and there was no nitpicking there, you made a blatantly fucking dumb statement to convey a weak point just so you could flex on someone despite being just as retarded

yes, we now see that you are very smart and good at dumping google search results on us, but now will you shut the fuck up and learn to think before you type?

I agree, I'd expand this to include movies. CG has advanced in complexity, but I have a much better time watching The Last Starfighter or Goonies over capeshit or Star Wars sequels. The technology raced ahead, the spirit of fun has dwindled.

I think I definitely feel what you're talking about in a lot of modern open-world games, but it's never really bothered me too much personally as long as I can immerse myself in and explore the environment the developers have set out for me. That's always what I enjoyed in games as much as or even more than pure gameplay, even when I was a kid, and a lot of why I'm not really into a lot of popular older genres like platformers.

But I still love very simple 2D games for probably a lot of the same reasons you're thinking of, guess I'm just pretty polarized about it. Either intricate 3D worlds or MDA Tetris, and some in-betweens like early networked shooters where the gameplay really outclasses my environmental nitpicks.

I guess it really depends on how good you are with a controller. I've never been able to get good with analog sticks on a controller for looking/aiming, at least not to the point where I no longer feel it's massively lacking vs a keyboard and mouse and am constantly wishing I had a better input method.

Yes those were the fucking Glory days. Back when you had to make shit work and there weren't wizards or the internet to fall back on. I remember working days on issues and I am pretty sure it made me the diagnostician that I am today. The new kids are pussies and if they can't Google an answer they give up. Nothing like trying hundreds of combinations of fixes to really start to understand the underlying software/os.

Fuck you have no clue.

No shit. Kids, I swear. They think everything today is the shit without realizing that the changes and improvements we lived through were fucking amazing. Real groundbreaking stuff. The changes today in 'AAA - retarded term' titles are minimal at best. Just throwing more money at rehashed shit, with most of the concepts coming from the late 90's and early 00's.

AMIGA

The biggest thing to me is that it's easier to use in a comfortable position without really thinking about it. A keyboard/mouse needs a little more space, and the uneven surface kind of fucks with my touch typing a little bit; I feel like I constantly have to feel around and adjust to make sure I'm on the right keys, so with more complicated games it kind of kills the comfort factor. I totally see 2D winning out in that case, using a mouse on a chair sucks shit.

I'm sure it's easy to solve, but I've just found a simple console controller easier for that kind of stuff. Once I get used to it, it's pretty alright, but it's probably shit for a lot of you guys who regularly go between that and a traditional PC setup, where you'd have to keep re-adjusting to the point that it just gets in the way.

I had a 66MHz 486 in college with a co-processor.

Thread is making me install MilkyTracker again.

In all likelihood, yes. Even as limited as they were, computers were capable of all the basic tasks a computer is used for. There were still plenty of fun games back then to keep one occupied, and people would continue writing them within the limitations of the hardware.

Besides, even if no advancement is made, over time the high end would become more and more available to the average person. People, or at least enthusiasts such as us, would not just have their one IBM-compatible PC; we'd all have a PC and an Amiga and a Mac and a NeXT machine and a SPARC and whatever else.

I suspect that we would have gotten different motherboard architectures than we have currently. Likely integrating a number of more specialized processors akin to modern sound cards, physics cards, network cards and GPUs, but on board or in dedicated sockets instead of on expansion cards.

I suppose some trends, such as more and more stuff being onboard rather than as expansion cards, would still happen. Processor cards for different architectures weren't too uncommon back then either, so I guess having a single machine with cards for the other systems wouldn't be out of the question.

So we'd play Lode Runner and Aztec and Lord British would reign.
Great times, I tell you.

I miss having a small enough amount of files that I could keep track of what I had.

Unlike now...

Speaking of tracker music, you know what I miss? Computers being unashamed of being computers. Everything from the UI to the sound seems to do its best to pretend that it's something else. What's wrong with bitmap fonts and synth music?

I would enjoy them more.

>What's wrong with bitmap fonts and synth music?
They were only ever a rough approximation, necessary because of the limits of the technology. I guess synth music was just enough of its own thing, being used in music outside of computing, but synth plus recordings and pure recordings are both more desirable than pure synth in both arenas.

well, you'd see software advance, and the early 90s was just barely when "multimedia" was beginning to be possible
really, if we're saying 1991 or 1992 as the cutoff, I'd be entirely okay
damn near everything I'd use a computer for would be doable, it'd all just be a bit uglier and slower

if we're talking early 90s (isn't ProTracker from '90?), you'd probably have a bigger disk than that
not massively bigger, but 40MB is a bit small as far as I remember (the machine from '94 I grew up with had a 2GB disk, although I bet my dad paid out the ass)
too lazy to google for magazines to find disk prices/sizes of the day

do it
I love MilkyTracker. Great piece of software.

That's one of the better things to come out of the flat UI era, and why things like synthwave and pixel art are making a comeback. People are learning to embrace the nature of the medium, just like they do with traditional musical instruments or art.

they're still productive. you can still do the important things in corporate environments: word processing, spreadsheets, databases. Windows 3.1 and System 7 were a bit clunky compared to what we have today, but they're very functional.
there'd just be a lot of shuffling around Bernoulli or SyQuest disks for large data transfers. 10 Mbps was really slow.

Attached: amipro_doc.gif (640x480, 18K)

Oh shit, I used Ami Pro on Win16 up through 1998 for schoolwork.

Computer architecture on Jow Forums? What did I miss? That’s like, my thing.

>(the machine from '94 I grew up with had a 2GB disk
Oh yeah, he would have paid out the ass. In 1994, that would almost certainly have been a SCSI drive. 1GB IDEs only started to proliferate in 1995, to my recollection. I remember forking out around $200 for a Fujitsu 250MB in 1994ish.

800KB/sec wasn't that deadly slow when you consider the average hard disk in 1992 was in the 80-170MB region. Further, those drives probably weren't that much faster than 10MbE - I barely got 2MB/sec out of the aforementioned Fujitsu, and that was with a flashy bus-mastering VESA IDE controller in my ridiculously expensive 486DX2-66.

Young and stupid was I. I should have gotten a 486SX, and bought a yuuuge 500MB drive.
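
A back-of-envelope sketch of the numbers above; the 65% efficiency factor is a rough assumption for framing and protocol overhead, not a measurement:

#include <stdio.h>

/* 10 Mbit/s Ethernet is 1.25 MB/s on the wire, and roughly 800 KB/s
   effective after overhead; the same league as an early-90s IDE disk. */
int main(void)
{
    double mbit = 10.0;
    double raw  = mbit / 8.0;     /* MB/s on the wire */
    double eff  = raw * 0.65;     /* assumed effective fraction */
    printf("raw: %.2f MB/s, effective: ~%.0f KB/s\n", raw, eff * 1000.0);
    return 0;
}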

let's say..........

i don't enjoy using computers newer than a 486 anyway

they would eventually find ways to lower the cost of the absolute peak technology anyway... so not much would be lost.

If you mean no process-size advancement, we'd probably have seen a lot more attention paid to crazy CPU architecture designs like the Mill. Something like 40 instructions per clock per core, single threaded, all without turning into a blast furnace. Accelerator chips would also remain in heavy use. Sound, graphics, networking with TCP offload, cryptography, RAID...

Not him and idc about the other stuff, but as a programmer I can safely say that when in the middle of development I'll normally have ~15-25 tabs open

>...
***BOOMER DETECTED***

Attached: file.png (640x799, 961K)

So decent boxes would have remained hideously expensive. It's all the miniaturisation and integration that means you can have a fully-functional computer for $200 - that uses hardly any power - these days. Throwing in hard disk controllers, video cards, sound cards, etc. is why we needed monster 250W power supplies in those days.

So...?

Yeah, probably.

> Young and stupid was I. I should have gotten a 486SX, and bought a yuuuge 500MB drive.
it depends on your use. I remember DOOM used to lag somewhat on my 486DX-33, but it was really smooth on my uncle's 486 DX2-66

Misconception. Doom did well on a 386DX-40 with fast graphics. The issue most people had with Doom performance was some "Windows accelerator" with shitty framebuffer access performance, or a slow-as-molasses ISA video card.

Yes Trident, I'm looking at you.
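
To make the framebuffer point concrete, here's a minimal Turbo C-style sketch; real-mode DOS only (far pointers and MK_FP from dos.h), so it won't build on a modern compiler. In mode 13h the whole 320x200x256 screen is one flat 64000-byte buffer at segment A000 that the game refills every frame, so raw write speed to the card dominated:

#include <dos.h>

/* Fill the entire mode 13h screen with one color: 64000 byte writes
   per frame, which is exactly where a slow ISA card falls over. */
void fill_screen(unsigned char color)
{
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    unsigned int i;
    for (i = 0; i < 64000U; i++)
        vga[i] = color;
}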

>Would you still enjoy computers if they were that limited?
considering you would never get to know the level of computing we have now, we would still enjoy it, just in different ways.

os2museum.com/wp/fast-unaccelerated-vga/
os2museum.com/wp/about-those-trident-vgas/

Sure. Multiprocessor motherboards would probably become the norm. Clusters and grid computing would be main areas of research. It would force us to do more in parallel.

What you're describing will eventually happen due to physics. Look into Amdahl's law.
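
For anyone who doesn't want to look it up, a minimal sketch with made-up numbers: Amdahl's law says that if a fraction p of the work parallelizes over n processors, overall speedup is 1 / ((1 - p) + p/n). The p = 0.9 below is purely illustrative.

#include <stdio.h>

/* Amdahl's law: speedup = 1 / ((1 - p) + p / n). */
int main(void)
{
    double p = 0.9;   /* assumed parallelizable fraction */
    int n;
    for (n = 1; n <= 64; n *= 2)
        printf("%2d CPUs: %.2fx speedup\n", n, 1.0 / ((1.0 - p) + p / n));
    return 0;
}

Even with infinite CPUs, the serial 10% caps you at 10x, which is exactly the wall the post above is pointing at.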