Someone explain the bloat meme to me. If you're using modern hardware, """bloat""" shouldn't be a problem, right? Or are you all using 20-year-old laptops or something?

Attached: Iwakura-Lain-lain-iwakura-37431115-509-786.jpg (509x786, 108K)

Other urls found in this thread:

blog.hubspot.com/marketing/page-load-time-conversion-rates
harmful.cat-v.org/software/

More code -> more bugs -> more security bugs

wasting resources is bad regardless of whether you have 64GB or 2GB. I'm planning to go from 4GB all the way to 32GB (and later upgrade to 64GB), and I'm still going to debloat like normal because I like treating my computer well
it's also a problem in programming today because it piles up and programmers are incompetent; even aside from the resource usage, it creates buggy messes. see:
>bethesda games
>modern websites like twitter and youtube running like shit
>windows 10
etc.
we've been piling code on top of code on top of code for far too long, and now nobody even knows how to do the low-level stuff anymore to fix the base of it. eventually it will become unsustainable if we continue like this

That shit adds up. If you run one application at a time, maybe it's fine, but when every program uses a shit ton of RAM and CPU cycles, just about any computer will slow down. Ten apps idling at 300 MB each is 3 GB gone before you've opened anything heavy.

Fellow 4GBlet here. literally an hour ago I ordered another 4 gigs of used DDR3 RAM for the cost of two pizzas. feels good man

No such thing as bloat, just poor people who can't ride the tech wave.

that's good, user. be sure to debloat it and keep it running well

Attached: Win7_debloated.png (412x459, 42K)

so optimizations don't matter?
input latency doesn't matter?
fps doesn't matter?
2GB chat apps don't matter?
laptop battery doesn't matter?
pushing hardware to 1% of its limit because everyone is a low-IQ worm that writes in Electron doesn't matter?

Sweet, my KDE config idles at around 400 MB

I use an Atom N450 with 2GB of RAM. I get by because I use light software.

software has been riding on the back of hardware for the most part, and people like him just think hardware will keep carrying us forever
even if it did, the possibilities if software were better and we could fully utilize our hardware are immense
it's pretty dumb, since decade-old hardware is powerful enough to do anything most people want to do, but websites and bloat mean you're forced to upgrade

More bloat means you need to upgrade your hardware. It also severely limits how many things you can run in parallel.

Actually, since SSDs are more modern than HDDs yet tend to have far less storage space, size bloat is as relevant as ever.

Is this unintentional planned obsolescence?

I'm not gonna explain shit to you, dumb attention whoring anime pedo scum.

>it shouldn't be a problem right?
There's a lot wrong here. First of all, it does matter. Websites take forever (seconds) to load, and even normies agree that sucks.
blog.hubspot.com/marketing/page-load-time-conversion-rates
Took the pic from here.
Applications are exactly the same. Install the Atom editor and see how you feel about it when it loads.

Computers are stupid fucking fast. You'd never know it from bloated software. You shouldn't perceive any unintentional wait or lag in normal software, yet it's everywhere. Crappy software is so incredibly slow that it can't manage basic shit like opening a mono-colored window and rendering some text in a reasonable amount of time.
So get some perspective; your standards are awful. Intel giveth, Bill taketh away.
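
For perspective, here's a rough C sketch: fill a full-HD frame with one color and time it. The malloc'd buffer is a stand-in for a real window surface (my own toy, not a benchmark of any actual toolkit; a real app adds compositor and toolkit overhead on top).

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    /* one full-HD frame, 32 bits per pixel */
    const size_t px = 1920u * 1080u;
    uint32_t *fb = malloc(px * sizeof *fb);
    if (!fb) return 1;

    clock_t t0 = clock();
    for (size_t i = 0; i < px; i++)
        fb[i] = 0xFF202020u;   /* opaque dark gray: the "mono-colored window" */
    clock_t t1 = clock();

    /* print a sample so the compiler can't discard the fill loop */
    printf("filled %zu pixels in %.3f ms (sample %08x)\n",
           px, 1000.0 * (double)(t1 - t0) / CLOCKS_PER_SEC,
           (unsigned)fb[px - 1]);
    free(fb);
    return 0;
}

On anything from the last decade that loop finishes in around a millisecond, which is the point: the raw work behind "show a gray window" is tiny, and everything slower than that is overhead somebody added.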

On the topic of old hardware: it should be perfectly fine to use. Expectations should be set by the complexity of the task, not by how much crappy competition there is.

Nobody who talks about bloat complains that their raytracer is slow. It's always about the basic shit.

Attached: page-load-time.jpg (669x4491, 648K)

Probably it's that universities are pooping out moronic Python programmers now, so the economic incentive is there to hire them.
But I'm starting to wonder if it's not intentional. Especially on the phone front.

Cheers for explaining. Seems reasonable and not just a meme.

computation directly translates into heat and energy use. badly written code uses more energy and creates more heat than properly written code. even on a 'low power' chipset like a phone's, you will still thermal throttle if you push it too hard, and you can push a phone hard enough to throttle just by installing the facebook app. an n64 emulator doesn't even do that
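
To make that concrete, here's a contrived C sketch (my own example, not from anyone's real app): the same job done two ways, where the only difference is wasted work. Every extra cycle the first version burns comes out of the battery and goes out as heat.

#include <stdio.h>
#include <string.h>
#include <time.h>

/* calls strlen() on every pass, so total work is O(n^2)
 * (build with -O0; optimizers can hoist the strlen and hide the bug) */
static long count_x_bloated(const char *s) {
    long n = 0;
    for (size_t i = 0; i < strlen(s); i++)
        if (s[i] == 'x') n++;
    return n;
}

/* scans the string once: O(n) */
static long count_x_tight(const char *s) {
    long n = 0;
    for (size_t len = strlen(s), i = 0; i < len; i++)
        if (s[i] == 'x') n++;
    return n;
}

int main(void) {
    enum { N = 100000 };
    static char buf[N + 1];   /* static, so buf[N] is already '\0' */
    memset(buf, 'x', N);

    clock_t t0 = clock();
    long a = count_x_bloated(buf);
    clock_t t1 = clock();
    long b = count_x_tight(buf);
    clock_t t2 = clock();

    printf("bloated: %ld hits, %.1f ms\n", a, 1000.0 * (t1 - t0) / CLOCKS_PER_SEC);
    printf("tight:   %ld hits, %.1f ms\n", b, 1000.0 * (t2 - t1) / CLOCKS_PER_SEC);
    return 0;
}

Same output, wildly different cycle counts. Multiply that pattern across an entire app and you get the thermal throttling described above.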

if you use a battery-powered device and don't think bloat is a bad thing, then you're probably too stupid to understand the factors involved.

>rendering text fast
that's because video card makers deleted all support for text modes.

No. It's because they're slow in general. It has nothing at all to do with the actual rendering; that's trivial to do in a shader. Just precompute a distance field and sample it. It's all sorts of architectural decisions that bog them down.
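
Here's roughly what that distance-field trick looks like, sketched on the CPU in C instead of a fragment shader. The 4x4 field and the 0.5-means-on-the-outline convention are made up for illustration; real glyph fields are baked offline from the font outlines.

#include <stdio.h>

/* GLSL-style smoothstep: a smooth 0..1 ramp between edge0 and edge1 */
static float smoothstepf(float e0, float e1, float x) {
    float t = (x - e0) / (e1 - e0);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return t * t * (3.0f - 2.0f * t);
}

/* the entire per-pixel cost: one sample from the precomputed field
 * plus one smoothstep; 'width' is the antialiasing band */
static float glyph_alpha(const float *sdf, int w, int x, int y, float width) {
    float d = sdf[y * w + x];   /* 0.5 = exactly on the glyph outline */
    return smoothstepf(0.5f - width, 0.5f + width, d);
}

int main(void) {
    /* toy 4x4 "distance field" standing in for a baked glyph texture */
    const float sdf[16] = {
        0.2f, 0.4f, 0.4f, 0.2f,
        0.4f, 0.7f, 0.7f, 0.4f,
        0.4f, 0.7f, 0.7f, 0.4f,
        0.2f, 0.4f, 0.4f, 0.2f,
    };
    for (int y = 0; y < 4; y++) {
        for (int x = 0; x < 4; x++)
            printf("%.2f ", glyph_alpha(sdf, 4, x, y, 0.1f));
        printf("\n");
    }
    return 0;
}

That's the whole rendering side, which is why slow editors can't blame text: the time goes into the architecture wrapped around it, not the pixels.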

Unless you're actually talking about terminals. I wouldn't know. They're usually OK enough imo.

i used to use an atom netbook as my primary computer back around 2010, and a 1.6GHz atom, a fucking pentium 3 equivalent, could browse the internet no problem. it could even watch flash videos. the entire computer ran on like 10W and it could play world of warcraft and quake 3 just fine. there was no lag when opening browser tabs or anything like that. now, with updated software, it's literally incapable of browsing the internet with one tab. it's entirely useless.

i understand that my shitty $100 cell phone blows it out of the water in every metric of computing performance, but the availability of processing power doesn't translate into better performance. brand new laptops with i5s run like shit in comparison to my old MSI Wind. you need like 8 gigs of ram to even think about running a web browser

it came from whatever autist wrote this page:
harmful.cat-v.org/software/

of course it's intentional. you know it's intentional because there is no incentive for app developers to make their applications run worse; they have a vested interest in keeping as many people on their platform as possible. the best way to do that is to make sure their app runs well on as many devices as possible. raising the hardware requirements for their platform is literally against their own best interests - unless they are not acting in their own best interest, but on behalf of some other group

considering twitter still doesn't have a revenue stream, and they are one of the worst offenders when it comes to how badly their app performs on slightly outdated hardware, it's pretty much as simple as putting 2 and 2 together.

Ahem, FUCK DOT NET

>there's no incentive
I would agree, because I don't buy into it. But people look at React and React Native and say it's a big productivity boost and that it enables cheaper developers (web developers) to do the work. If you remember the move to webapps, it was a similar push.
I think they're idiots, but maybe I'm naive.

Just last year Poweramp changed its UI, and it's noticeably slower and just worse looking. I don't think it's malicious; I just think they don't know what they're doing.

It's just memeing, OP. All code is literally bloat, and it's so fucking rare to run into code that isn't that people just call it witchcraft when they see it. Everything is bloat; when you get a job, you'll learn that all you do is add to it.

>If you're using modern hardware """bloat""" shouldn't be a problem, right?
This is not what happens. All this "modern hardware" just lets shitty nu-devs write even worse code that requires even more powerful hardware to run smoothly. "Unused RAM is wasted RAM" until bloat turns it into required RAM.

I'm using your 20 year old mom; you were the unintended bloat

user implies that having a faster machine justifies writing shitty code

They do.
Apparently that wasn't the intent:

>shaders faster than characters stored in ROM on a terminal

I didn't mean that. What I meant was that it wasn't a big problem even with high-quality text.

phones are super fucking powerful but run like shit because both android and iOS are bloated, and programmers are braindead

If people wrote programs as efficiently and as close to the metal as they did back when DOS and System 1 were around, we'd have overall better computers.
Probably less software, though.

actually, hell, the most efficient old software was video games. take notes from the guys who made SNES and C64 games. also, every programmer needs to be John Carmack

old games were so efficient; now everything is framework on top of framework, library on top of library, on top of drivers, on top of the OS...

Is it just me or have I been seeing a lot of Lain lately?