I hate computers

After ten years of programming I can't take it anymore.

>programs and operating systems still take tens of seconds to load
>no software truly works
>hardware has been compromised for decades
>computers don't operate in a sane way; instead of just using memory mapped io, everything gets dispatched to northbridge and southbridge sub-processors to glue the clusterfuck of io together
>hardware bugs from insane sleight-of-hand design decisions like speculative execution and branch prediction
>software stack reaches to the moon and there is probably some fundamental bug that we won't be able to recover from
>processors probably aren't even executing in hardware when you think they are but are instead running microcode
>no upgrade path for hardware because everything has to be backwards compatible, even when huge translations of software were done in the past
>usb
>uefi
>16:9 monitors
>we're still using qwerty
>programming still revolves around editing linear text files
>linux kernel is tens of millions of lines long
>linux permissions model is terrible
>there is likely a better alternative to the von neumann architecture but we will never use it
>90% of every single binary probably never executes, even simple binaries like echo
>all file formats are terrible
>png is encoded in 8x8 blocks and then stuffed through a huffman and lzma encoder and this is what people think is a good lossless image file format
>kilobytes of empty metadata in every single file
>filesystems are so badly designed it's a miracle we haven't lost everything
>still have to jury-rig your text editor and compiler to work together, probably through a terrible scripting language
>there has never been a decent printer made, ever
Why do we live like this?

Attached: dead.jpg (798x434, 79K)

Other urls found in this thread:

pharo.org/
youtube.com/watch?v=Nbv9L-WIu0s
ibm.com/blogs/research/2018/04/ibm-scientists-demonstrate-mixed-precision-in-memory-computing-for-the-first-time-hybrid-design-for-ai-hardware/
youtube.com/watch?v=_eSAF_qT_FY
youtube.com/watch?v=TH9VCN6UkyQ&list=PLmV5I2fxaiCKfxMBrNsU1kgKJXD3PkyxO&index=2&t=0s
ibm.com/blogs/research/2018/06/future-ai-better-compute/
ibm.com/blogs/research/2018/12/8-bit-breakthroughs-ai/
tonsky.me/blog/disenchantment/

>programming still revolves around editing linear text files
I'm interested in hearing your alternative to this.

>90% of every single binary probably never executes, even simple binaries like echo
>processors probably aren't even executing in hardware when you think they are but are instead running microcode
What do you mean, user? Genuine curiosity.

>>processors probably aren't even executing in hardware when you think they are but are instead running microcode
Where is microcode running then, bucko?

excellent post
I think the biggest problem with computers and programming is that we're trying to abstract over fundamentally different things, so we always end up losing out somewhere or other. Also, if you think about it, programming so far has just been about replacing humans so that things can be done in greater quantity. It hasn't actually substantially improved anything except maybe making virtual worlds.

The Smalltalk environment presented programs in a non-linear way; you could even navigate and change your program as it ran.

Attached: scrollbar1.jpg (576x268, 64K)

>rom gets poked at an address
>spits out a sequence of bytes
There's no computation.

You can run OpenGenera on Linux, which is a similar OS using Lisp.

If you think about it, the whole point of lexers is to translate text files into the tokens parsers want, and the whole point of parsers is to translate those tokens into the structures the compiler or interpreter wants. A visual language is one alternative, and you could always have lexed and parsed text boxes inside such a visual language. You could also make use of visual cues to give people a better idea of what's going on. Some visual languages can even prevent you from inputting invalidly typed programs.
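
To make the lexer/parser point concrete, here's a minimal sketch in Python (not from anyone in the thread; the token set is made up for illustration): the lexer's whole job is flattening linear text into the tokens a parser actually wants.

# A minimal, made-up token set; real languages just have a bigger table.
import re
from typing import NamedTuple, Iterator

class Token(NamedTuple):
    kind: str
    value: str

TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=()]"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{k}>{p})" for k, p in TOKEN_SPEC))

def lex(text: str) -> Iterator[Token]:
    """Yield (kind, value) tokens; whitespace is dropped, anything else is an error."""
    pos = 0
    while pos < len(text):
        m = MASTER.match(text, pos)
        if not m:
            raise SyntaxError(f"unexpected character {text[pos]!r} at {pos}")
        if m.lastgroup != "SKIP":
            yield Token(m.lastgroup, m.group())
        pos = m.end()

if __name__ == "__main__":
    print(list(lex("x = 3 + 4 * y")))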

There's a bunch of Smalltalk implementations out there, some are even used in production.
See pharo.org/

>What do you mean, user?
There's a good talk on this
youtube.com/watch?v=Nbv9L-WIu0s

>Some visual languages can even prevent you from inputting invalidly typed programs.
but it wouldn't matter because the compiler would just say "no".
dependently typed languages take this to the nth degree and extend types to include whole expressions, which excludes enormous classes of logic errors from your programs using just the compiler.

>there has never been a decent printer made, ever
brother hl 2140 just werks, user
Unfortunately their next in line is probably a botnet.

>png is encoded in 8x8 blocks and then stuffed through a huffman and lzma encoder
Huh? PNG is encoded in scanlines, which are run through per-line filters and compressed with DEFLATE, not LZMA, no 'ma' anywhere. Farbfeld might be based, but why big endian?
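
If you want to check the DEFLATE claim yourself, here's a rough Python sketch (the path is whatever PNG you have lying around): walk the chunks, grab the first IDAT, and look at the two-byte zlib header.

# Walk the chunk list, find the first IDAT, and check that its payload starts
# with a zlib header, i.e. DEFLATE, not LZMA.
import struct
import sys

def first_idat_header(path: str) -> bytes:
    with open(path, "rb") as f:
        assert f.read(8) == b"\x89PNG\r\n\x1a\n", "not a PNG"
        while True:
            length, ctype = struct.unpack(">I4s", f.read(8))
            data = f.read(length)
            f.read(4)  # skip CRC
            if ctype == b"IDAT":
                return data[:2]
            if ctype == b"IEND":
                raise ValueError("no IDAT chunk found")

if __name__ == "__main__":
    cmf, flg = first_idat_header(sys.argv[1])
    # CMF low nibble 8 means "deflate"; (CMF*256 + FLG) is a multiple of 31.
    print(f"CMF={cmf:#04x} FLG={flg:#04x}", "zlib/DEFLATE" if cmf & 0x0F == 8 else "???")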

>using just the compiler.
runtime*

Wake me up once compilers are fast enough to compile this stuff at Pascal-level speeds.

Type checkers validate input after the fact; I'm saying the environment should only allow valid input in the first place. The particular type system is unimportant.

>png is encoded in 8x8 blocks and then stuffed through a huffman and lzma encoder and this is what people think is a good lossless image file format
What would you suggest, user?
>inb4 inane video codec schemes that suck my mobile device's battery dry

The fundamental problem with visual languages is that they are too rigid: you have only one very tedious way to draw a program, while text is infinitely flexible to manipulate. Visual languages only serve as extensive customization for not-quite-programmers, and they are genuinely good at that.

The problem is that you're using a phone instead of a portable terminal.

Attached: consider_the_following.png (1000x1000, 264K)

>png is encoded in 8x8 blocks and then stuffed through a huffman and lzma encoder and this is what people think is a good lossless image file format
But that is wrong, baka.
PNG has no pixel blocks like JPEG. PNG image data is stored in chunks called IDAT, which may exist as one large chunk no longer than 2^31-1 bytes or as multiple consecutive IDAT chunks of arbitrary size.
These chunks, when concatenated, are decompressed with the DEFLATE method via zlib. The compressed stream is LZ77 plus Huffman coding (not LZMA), starting with a two-byte zlib header and using a sliding window of at most 32768 bytes.
The uncompressed stream is the filtered scanlines concatenated together; each scanline starts with a one-byte filter type (none, sub, up, average, Paeth). The size of each scanline is width * pixel size, rounded up to whole bytes, plus the filter byte. Filtering enhances compression.
The pixels in each scanline are either true pixels with 1 (greyscale), 2 (greyscale + alpha), 3 (RGB triple), or 4 (RGB + alpha) channels, or indexes into the PLTE (palette) chunk when the number of distinct colors is no more than 256.
The pixel size is the number of channels (consider it one for indexed pixels) multiplied by the bit depth of the channels. Only certain combos of color type/bit depth are allowed, e.g. indexed pixels can be 1, 2, 4, or 8 bits wide but never 16 bits, which is allowed for greyscale, greyscale + alpha, RGB and RGBA. When there are bits left over in a scanline, they should be ignored by decoders and set to zero by encoders, so you get a whole number of bytes per scanline.
PNG also supports interlacing.
IMO PNG is the simplest and most effective lossless format.
Let's all love PNG
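
As a companion to the description above, a minimal Python sketch (assuming a non-interlaced PNG with 8-bit-or-wider channels, and "example.png" as a placeholder path): concatenate the IDAT chunks, inflate them with zlib, and read off the per-scanline filter byte.

import struct
import zlib

CHANNELS = {0: 1, 2: 3, 3: 1, 4: 2, 6: 4}  # color type -> channel count

def read_chunks(path):
    with open(path, "rb") as f:
        assert f.read(8) == b"\x89PNG\r\n\x1a\n", "not a PNG"
        while True:
            length, ctype = struct.unpack(">I4s", f.read(8))
            data = f.read(length)
            f.read(4)  # CRC, ignored here
            yield ctype, data
            if ctype == b"IEND":
                return

def scanline_filters(path):
    idat = b""
    for ctype, data in read_chunks(path):
        if ctype == b"IHDR":
            width, height, depth, color, *_ = struct.unpack(">IIBBBBB", data)
        elif ctype == b"IDAT":
            idat += data          # IDAT chunks concatenate into one zlib stream
    raw = zlib.decompress(idat)   # DEFLATE with a <= 32 KiB window
    stride = 1 + width * CHANNELS[color] * depth // 8   # filter byte + pixel bytes
    return [raw[i * stride] for i in range(height)]     # one filter id per scanline

if __name__ == "__main__":
    print(scanline_filters("example.png"))  # 0..4 = none, sub, up, average, Paeth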

You are completely right except for the part about programming revolving around linear text files.
That is a good thing and it should remain that way.

good larp I rate 8/10

That's not true at all: there are a ton of different ways of doing it, and you can have text boxes to allow that sort of input in the first place. The point is that there are other ways to program, and linear text is merely one of them.

>>there is likely a better alternative to the von neumann architecture but we will never use it
This is an interesting speculation
Let's hear some theories

That's what will eventually happen when the cloud takes off.

>IMO PNG is the simplest and most effective lossless format.
That would be RLE.
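
For reference, run-length encoding really is about as simple as lossless gets. A toy Python sketch (not any particular format's RLE variant, just the idea): great on flat images, terrible on noisy ones.

def rle_encode(data: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes((run, data[i]))   # (count, value) pairs
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    out = bytearray()
    for count, value in zip(data[::2], data[1::2]):
        out += bytes([value]) * count
    return bytes(out)

if __name__ == "__main__":
    sample = b"\x00" * 100 + b"\xff" * 3
    packed = rle_encode(sample)
    assert rle_decode(packed) == sample
    print(len(sample), "->", len(packed))   # 103 -> 4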

Then the problem becomes that the cloud is made out of spaghetti.

Attached: tired yui.jpg (600x450, 23K)

>portable terminal
I really want one. Give name.

ibm.com/blogs/research/2018/04/ibm-scientists-demonstrate-mixed-precision-in-memory-computing-for-the-first-time-hybrid-design-for-ai-hardware/

I have no answers to your questions (yet), just know you are not alone. We will take computing back one day.

It could look exactly the same as your phone does now; it's just that the processing would be done on your home machine.
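
A minimal sketch of that idea in Python (hypothetical host name and port, and a real setup would obviously need authentication and TLS before exposing anything): the handheld only ships commands and displays results, the home machine does the work.

import socket
import subprocess

HOST, PORT = "0.0.0.0", 9999   # hypothetical home-machine endpoint

def serve():
    """Run on the home machine: execute each received command, send back the output."""
    with socket.create_server((HOST, PORT)) as srv:
        while True:
            conn, _ = srv.accept()
            with conn:
                cmd = conn.recv(4096).decode()
                result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
                conn.sendall((result.stdout + result.stderr).encode())

def ask(command: str, host: str = "my-home-box.example") -> str:
    """Run on the 'terminal': send a command, return whatever the server printed."""
    with socket.create_connection((host, PORT)) as conn:
        conn.sendall(command.encode())
        conn.shutdown(socket.SHUT_WR)
        return conn.recv(65536).decode()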

>>no upgrade path for hardware because everything has to be backwards compatible, even when huge translations of software were done in the past

Because past methods were not good enough, and past methods also didn't have fuckloads of legacy shit attached to them that needs to keep working for a business to consider it.

You get full system redos with phones, and you will in the future. Hell, I have argued for quite a long time for the use of a co-processor in computers.

Imagine an AMD 1700 or 2700 used as a co-processor while a new CPU is the dominant one: you get all the good shit about legacy with all the good shit about brand new.

>Imagine an AMD 1700 or 2700 used as a co-processor while a new CPU is the dominant one: you get all the good shit about legacy with all the good shit about brand new.
It wouldn't surprise me if this is already done internally within x86 CPUs really. Who knows how the ISA, with its decades of backwards compatibility, is actually implemented?

Attached: 1550313381205.jpg (1024x1024, 597K)

Already done for entirely different and far more frightening reasons:
youtube.com/watch?v=_eSAF_qT_FY

People often show the actual CPU die through a microscope; it's not a big mystery how the architecture works, at least at the big-picture level.

Only way to move things forward will be efforts by individuals to really fix things. On the ground, one step at a time. Stop dreaming and memeing about abstract bullshit and just make things better, easier, faster.

I recently bumped into Jonathan Blow's Jai project. It's ambitious, but in a modest and unusually practical way. I can't say I agree with everything he is doing, but what can be achieved even by one individual nowadays is striking.

youtube.com/watch?v=TH9VCN6UkyQ&list=PLmV5I2fxaiCKfxMBrNsU1kgKJXD3PkyxO&index=2&t=0s

This is how we'll advance. People, probably individuals, actually using the tools now available to fucking do shit. Committees and projects and initiatives and especially Silicon Valley will not save us. Only we will save us. So instead of talking about 40-year-old projects like Smalltalk and NeXT, we can talk about building the future with the languages being made on the toolchains of today (LLVM, git, the fucking internet, etc.)

Attached: maxresdefault.jpg (1280x720, 103K)

UEFI dropped virtual ISA devices, etc.
But sure, I get your drift. All instructions in x86 CPUs are already virtual; a lot of CPUs these days do that, not just x86.

>on the toolchains of today
That's like trying to build the world's most stable skyscraper on a fucking marsh, in the middle of a fault, using the absolute cheapest concrete you could find.

You can benchmark it if that's the case, though I don't think it is, at least in current CPUs. A dedicated legacy CPU would imply that side is hardware-locked now and the only upgrade it gets is through process shrinking, with the dominant instructions getting preferential treatment.

>All instructions in x86 CPUs are already virtual

Attached: 1526269855227.png (400x300, 138K)

>even ASM is interpreted

Attached: 1549103034877.gif (472x472, 2.62M)

Bullshit. Most modern software is garbage, but we have space age tools as well, and more of them now than they had in the 1980s. We have what they had, plus what we have now.

Git/Hg are space age. Fuck off if you only want to complain. They enable what would have been impossible years ago.
Compiler toolchains are fucking space age. We can make new programming languages more easily than you can build your project in an existing language using some deranged proprietary build system.
Editors are fucking space age now. Don't tell me you program in fucking ed or some bullshit.
The internet is fucking space age. Despite all the shit, you can download entire codebases in one shell command and compile them in one or two more. This is routine now.

20 years ago, you still had to buy some bullshit written by an airhead Californian looking to make a quick buck. Now you can google everything to do with nearly every language ever.

We are space age, but it's not translating into better code, or languages, or tools, because we're not taking advantage of all the real infrastructure we have. We're sitting around waiting for the committees, the "geniuses", the academics or, at worst, the big corps, the "Googles", to somehow "save us". But they won't, because they are as dysfunctional as the rest of modern society and not structured or motivated to make anything better unless it advances someone's career or bottom line.

They will never deliver. It will never happen. Code and software will only be advanced by one thing: individuals making things better. We have more tools to do that now than at any point in history. Bitching and moaning about foundations when you have the means to change it all or bypass it is just bitching and moaning. Learn 2 code; get shit done.

Attached: images.jpg (267x189, 7K)

you missed his point with your space-age metaphor

ibm.com/blogs/research/2018/06/future-ai-better-compute/
ibm.com/blogs/research/2018/12/8-bit-breakthroughs-ai/
The next generation of computation will feature "memristors" (memristive devices such as phase-change memory) at the forefront.

>we're still using qwerty
Would you prefer Colemak or Dvorak? I learned Dvorak myself, but the non-standard Ctrl-C/V locations trigger me.

His language looks like a nicer version of C.

Attached: thumbs-up-gif-anime-8.gif (500x281, 1.06M)

That's basically the GNU spirit. RMS already proved to us that this is not just a fantasy.

>IMO PNG is the simplest and most effective lossless format.
what about .webp?

None of this is actually an issue and half of it is retarded and wrong. If you've actually gotten to the point where any of this actually bugs you, fucking quit your job as a professional computer toucher.

ABGR raw bytes is better but expensive.

>there is likely a better alternative to the von neumann architecture but we will never use it
Enlighten me

Most of it is because we keep buying for the wrong reasons. We still give money to sloppy shit.

Yes, even ASM is interpreted.
All modern CPUs (not just x86) are internally RISC-like, with a small set of micro-operations and a decoder that translates hundreds of architecture-specific instructions into those micro-ops, mostly for compiler and legacy compatibility (and because it's easier to pull off).
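
A vastly simplified illustration of that decode step, in Python (the mnemonics and micro-ops are invented; real decoders are nothing this tidy): one memory-operand instruction fans out into load/modify/store micro-ops, while register-only forms map straight through.

def decode(instr: str) -> list[str]:
    op, dst, src = instr.replace(",", "").split()
    uops = []
    if dst.startswith("["):                  # memory destination -> load/modify/store
        addr = dst.strip("[]")
        uops += [f"uLOAD  t0, {addr}",
                 f"u{op.upper()}  t0, t0, {src}",
                 f"uSTORE {addr}, t0"]
    else:                                    # register-only forms map 1:1
        uops.append(f"u{op.upper()}  {dst}, {dst}, {src}")
    return uops

if __name__ == "__main__":
    for line in decode("add [rbx+8], rax"):
        print(line)
    # uLOAD  t0, rbx+8
    # uADD  t0, t0, rax
    # uSTORE rbx+8, t0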

tonsky.me/blog/disenchantment/

Another episode of a burnt-out NEET thinking he's smarter than everyone out there and making shitty complaints.

>we we we
Not your personal army

This thread is inspiring. After having burnt myself out on programming for a while I've begun to find some joy in it. Creating small solutions for small problems is such a nice feeling. I used to look forward to accomplishing huge projects but that has an unfortunate way of becoming an intimidating slogfest of refactoring and misguided planning. I want to begin fixing the parts of my userland experience that I feel could be better. Fuck web browsers and fuck the modern internet. I want to see more cool offline browsing software.

Attached: 2009-12-17-224938.jpg (1191x838, 703K)

We need something like UseNet pull software, or BBS mail readers, which allows you to pull new posts from sites and gives you a UI that shows the new articles and lets you post your own as you do so. Multimail is still in a lot of repositories, but it's useless unless you use BBSs (which no one does any more).

Attached: q-blue-reply-taglines.png (760x570, 23K)
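
A rough Python sketch of that pull model (the feed URL is a placeholder, and every real site or board would need its own fetcher): grab a feed, remember which item GUIDs you've seen, and surface only the new posts for offline reading.

import json
import os
import urllib.request
import xml.etree.ElementTree as ET

FEED = "https://example.org/board.rss"   # placeholder feed
SEEN_FILE = "seen.json"

def pull_new(feed_url: str = FEED) -> list[tuple[str, str]]:
    """Return (title, guid) pairs for items we haven't seen before."""
    seen = set(json.load(open(SEEN_FILE))) if os.path.exists(SEEN_FILE) else set()
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.fromstring(resp.read())
    new = []
    for item in root.iter("item"):
        guid = (item.findtext("guid") or item.findtext("link") or "").strip()
        title = (item.findtext("title") or "").strip()
        if guid and guid not in seen:
            new.append((title, guid))
            seen.add(guid)
    json.dump(sorted(seen), open(SEEN_FILE, "w"))
    return new

if __name__ == "__main__":
    for title, guid in pull_new():
        print(title, "->", guid)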

Beautiful post user

AI will save us all

No it won't. Programming is about replacing people; AI is about replacing programming with programmed people. It's basically just realising the original goal was dumb but refusing to acknowledge it.

Individuals won't cut it. Humans are too slow and fail-prone. Only an artificial intelligence doing everything from scratch will fix the mess we are in.

Programming is about replacing people, and eventually we'll replace ourselves too.

It doesn't matter if the first AI was programmed by humans, because the second will be programmed by an AI, and so on... Once you get it rolling, they'll just program themselves and everything else.

>>we're still using qwerty
nailed it

you're an idiot who doesn't fully understand the consequences

Do you want to know what it will be like? Everybody will be plugged in to virtual reality while everything else is automated. For those who miss the old times, you'll have a simulation of 1712 where you can live exactly as people lived in 1712.

No it won't you fucking moron

Stop insulting me and try to write more than one sentence, and maybe I'll hear what you're saying. Why do you think that automating things (i.e., making them cheaper) is bad?

Unironically wish we could scrap the monstrosity of modern computer architecture and telecommunications and create simple yet secure and forward-thinking open standards that everyone adheres to, but I know it will never happen and instead we'll continue bolting shit onto shit until the end of time.

Attached: 1501391010212.jpg (919x720, 59K)

Perhaps the biggest and most obvious flaw with AI is that it will take the ideas of utopia dreamed up by people like you and rigidly enforce them.

Or until it all collapses. Or some horrific bug is found somewhere in the pile that can't be removed without bringing everything down.

My Chromebook has a 3:2 display.

Drag and drop? WYSIWYG, with AI backing it.

And what is keeping you from fixing it you fucking faggot?

I wish you elaborated more.

Do you think that the world I described is a utopia? It's just a world where people work less and everybody is plugged in to whatever virtual world they like.

Why rigidly? It will be a free market competition of different AI's and different virtual worlds.

Not the one you're replying to, but I can't even begin to wade through this naïvety, so I'll just say
>implying

We are not that far off. VRChat is already a reality. UBI has already been prototyped.

Attached: maxresdefault.jpg (1280x720, 123K)

So is the human brain, user.
Complex systems generated by feedback loops tend to be incomprehensible.

I can't think of any better AI than one that is exactly like a human being. Just procreate instead.

Humans are too slow.

No they aren't.

Exactly. The user said you'll have a choice to live in a simulation or not, but the machine (both as a concept and as a literal machine) won't be able to let so many unpredictable and possibly hostile elements exist that could threaten it even by accident.

That looks like an IDE object explorer.

Who the fuck are you quoting, retard?
>kilobytes of empty metadata in every single file
Has it ever crossed your mind that the application/user might want to add or modify such data later on?
>there has never been a decent printer made, ever
Just because your printer sucks doesn't mean every single one is like yours.
>programming still revolves around editing linear text files
And what do you want exactly, you freaking idiot? Fucking VPLs? That shit stinks.

If humans weren't slow, programming wouldn't exist.

I'm tired of having to tell you people that you won't understand, so please stop using these bullshit words that have nothing to do with the topic and are based on pulling it down into some unnecessary domain.
Quantity is not a goal. Utilitarianism is bullshit.
When you see something and wish it were faster, you wish it were comparatively faster, such that there's some qualitative difference. As a parable: if everyone in the world were $10 richer, then effectively nobody would be richer.

>people unironically thinking that OOP is a good way to design software

Attached: nagatorothafuck.png (428x339, 134K)

I'll give you a very simple example. I was going through a bunch of genomes (3 billion positions) manually. I realized it would take me 1000+ years to do the calculations I wanted to make. I learned how to write a script and had it done in 1 hour of programming + 10 seconds of processing.
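
For flavour, the kind of one-off script being described might look like this in Python (the filename and the GC-content calculation are stand-ins; the post doesn't say what the real analysis was): stream the genome and count as you go.

# Stream a FASTA file and compute GC content. "genome.fa" is a placeholder.
def gc_content(path: str) -> float:
    gc = total = 0
    with open(path) as fasta:
        for line in fasta:
            if line.startswith(">"):     # skip sequence headers
                continue
            seq = line.strip().upper()
            gc += seq.count("G") + seq.count("C")
            total += len(seq)
    return gc / total if total else 0.0

if __name__ == "__main__":
    print(f"GC content: {gc_content('genome.fa'):.2%}")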

A sufficiently smart AI would tell you that you don't really need to go through those genomes because your sub-goal itself is retarded.

Moreover, these sorts of calculations are the opposite of an AI thing. People have done that sort of stuff for millennia.

.webp is disgusting, and lossy .webp has worse compression artifacts than .jpeg.
FLIF is the superior format.

Will RISC-V save us? I'd imagine any truly open-source hardware would have enough of a niche to stay afloat and get some flavour of Linux running on it.

From there you at least have one single, simple hardware device that's somewhat usable.