How accurate is the premise of this video?

youtube.com/watch?v=kZRE7HIO3vk

Do all those extra lines of code really have to work just to read a text file, when only a small fraction of them are relevant to that?

Attached: linux growth.jpg (854x480, 79K)

game developers hate the current state of hardware fragmentation. not all the drivers are in use at any given time, obviously, but the developer doesn't know which ones will be
then there was the argument about horrible giant software stacks in OSes. the linux/unix graphics stack is a prime example.
then something about GPU APIs. I know shit about that

wow, great content. Thanks OP, that guy has everything I was looking for about game programming.
OP delivered by mistake, not Faggot card granted.

funnily, most of these points were also made by terry

10 minutes in, that "lecture" is a load of barnacles with little to no factual basis
he is oblivious to how software has actually advanced over time. it's not far from watching a feminazi try to explain the gender wage gap

Terry was the smartest programmer who ever lived. Of course he'd bring this up.

Who cares as long as the app works?

But nothing works right anymore.

How is he oblivious to it when he has been working as a programmer for 30 years?

his ranting omits how vastly more connected and integrated the world has become thanks to the Internet, and what effect that has on programs and programming. he is promoting specific, unabstracted software over software with abstraction layers, ignoring any downsides. he uses total lines of code as a measure of complexity, with the repeated argument that "x software has n lines of code that all must work to open a text file". that is untrue because of modularization and abstraction. and so on; I couldn't bear watching much more.
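for anyone unclear what an abstraction layer buys you, here's a toy C sketch (all the names are made up): code written against a small interface never touches, and never depends on, the backends it doesn't use.

[code]
#include <stdio.h>
#include <string.h>

/* Toy "reader" interface: callers depend on this struct,
   not on any particular backend. */
struct reader {
    int (*read)(void *ctx, char *buf, int len); /* bytes read, 0 at EOF */
    void *ctx;
};

/* Backend 1: wraps a FILE*. */
int file_read(void *ctx, char *buf, int len) {
    return (int)fread(buf, 1, (size_t)len, (FILE *)ctx);
}

/* Backend 2: reads from an in-memory string. */
struct mem { const char *s; int pos; };
int mem_read(void *ctx, char *buf, int len) {
    struct mem *m = ctx;
    int n = (int)strlen(m->s + m->pos);
    if (n > len) n = len;
    memcpy(buf, m->s + m->pos, (size_t)n);
    m->pos += n;
    return n;
}

/* This function never changes when a new backend is added,
   and never executes the backends it doesn't use. */
void dump(struct reader r) {
    char buf[64];
    int n;
    while ((n = r.read(r.ctx, buf, sizeof buf)) > 0)
        fwrite(buf, 1, (size_t)n, stdout);
}

int main(void) {
    struct mem m = { "hello from the in-memory backend\n", 0 };
    struct reader r = { mem_read, &m };
    dump(r);
    return 0;
}
[/code]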

his overall premise of programs becoming resource hogs full of bugs isn't wrong but he uses all the wrong arguments for it

He has engineer syndrome
Where people become experts in one area and think their expertise applies to everything, so they make huge lapses in judgement even though they're intelligent
this guy is full of it

How does your modularization and abstraction change the fact that n lines of code have to work?
They don't. Those things reduce the complexity of managing those n lines of code, but nothing else.

>comments are disabled
Why?

>Those things reduce the complexity of managing those n lines of code, but nothing else.
Not at all. More lines of code could be (and actually are) used for failsafes, error checking and handling, managing side-effects, etc. This leads to a *more* stable OS, not a less stable one. Otherwise TempleOS would be the most stable of them all.

>How does your modularization and abstraction change the fact
A simple example is that each layer has a graceful way to handle exceptions, so the OS (or program) doesn't crash outright the way it would from a simple buffer overflow or memory leak in C.
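To put it concretely, here's the kind of "extra" line being defended (a made-up C sketch): one bounds check costs a couple of lines of code and turns a stack smash into a handled error.

[code]
#include <stdio.h>
#include <string.h>

/* Made-up example. The unchecked version is two lines shorter
   and one buffer overflow richer. */
int copy_name_checked(char *dst, size_t dstlen, const char *src) {
    size_t n = strlen(src);
    if (n + 1 > dstlen)
        return -1;              /* refuse instead of smashing the stack */
    memcpy(dst, src, n + 1);
    return 0;
}

int main(void) {
    char buf[8];
    if (copy_name_checked(buf, sizeof buf,
                          "definitely-longer-than-eight-bytes") != 0) {
        fprintf(stderr, "input too long, rejected\n"); /* handled, no crash */
        return 1;
    }
    printf("%s\n", buf);
    return 0;
}
[/code]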

Game devs don't give a fuck about PCs; they only care about consoles, and those are unified for the most part, especially now thanks to Vulkan. PC versions are just ports, and those are handled by people other than the original developers.
At least for AAA games.

With modularization and abstraction I don't need sound, task scheduling, print spooling and so on to work in order to open a text file. I can have a big codebase, but for a specific problem only the relevant pieces of code need to work.
When code is modular, it's also reusable, in which case I end up with fewer total lines of code due to less duplication.
Conversely, if some random piece of code doesn't work, it won't break the whole computer.
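For example, this is roughly everything a text-file reader actually exercises (a plain C sketch): stdio and the file-related syscalls beneath it. The sound stack and the print spooler never enter this code path.

[code]
#include <stdio.h>

/* Reads a text file using nothing but stdio; drivers for sound,
   printing, etc. are never on this code path. */
int main(int argc, char **argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s FILE\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "r");
    if (!f) {
        perror(argv[1]);
        return 1;
    }
    char line[4096];
    while (fgets(line, sizeof line, f))
        fputs(line, stdout);
    fclose(f);
    return 0;
}
[/code]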

Can someone explain to me how the shit he ended up with 300k+ git commands in his video "Introduction to Git"?

People kept bringing up the fact that the game he's spent 500-1,000 hours developing is complete garbage and not even in a playable state yet

>Game devs don't give a fuck about PCs
except the game devs that make games for PCs, of course

>More lines of code could be (and actually are)
>implying there's actually no bloat in modern OSs
are you for real?

That graph is misleading, since most of the kernel growth is hardware support (arch/, drivers/).
Support for building minimal kernels has also improved in recent years.

Presumably the fact that things are modularized, yet we still have enormous amounts of code, seems to be the problem... It's not that it's impossible to have nice things without throwing tons of shit in a pile and calling it software; it's that the design decisions that lead to complex shit programs built on complex shit code with complex shit data are assumed to be the only way we can operate, when that's clearly not the case.

It's like looking at Fallout 76 and assuming that all games will inevitably have an engine as shitty as it, when it's obvious that things can be better.

Why in God's name does my Android phone run like shit when its hardware is a magical powerhouse of performance? It couldn't have anything to do with software running on software running on software pretending to make my life simpler, could it...?

literal know nothing kike trying to con people out of open source software

Some are good. Some don't seem to think that 6 frames of input delay is a bad thing.

which is to say, about 15k of those "lines" are just ~15-character strings with driver links

developers who like making unresponsive cinematic masterpieces usually work on console

get with the times, they "work" on Kickstarter but never ship to any platform

They're not mutually exclusive.
Bloated as they are, modern OSes are far more stable than DOS, where a single misbehaving program would crash the whole system with no survivors.
Same for OS X or Windows, for instance, where drivers are isolated from the kernel well enough that a single fault doesn't necessarily take the whole system down, compared to Windows 95 and the like.
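You can watch that isolation work yourself (a rough POSIX C sketch, assuming Linux or similar): one process crashes hard and nothing else goes with it, which is exactly what DOS couldn't promise.

[code]
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid = fork();
    if (pid < 0) { perror("fork"); return 1; }
    if (pid == 0) {
        volatile int *p = 0;
        *p = 42;            /* child segfaults on purpose */
        _exit(0);           /* never reached */
    }
    int status;
    waitpid(pid, &status, 0);
    if (WIFSIGNALED(status))
        printf("child died with signal %d; everything else is fine\n",
               WTERMSIG(status));
    return 0;
}
[/code]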

Don't forget to make your apps in node.js.

Attached: android-stack_2x.png (1384x2038, 99K)

You know why it wasn't playable? Because it didn't have its own bootloader and OS built into the disk. If only he had full hardware control, he could have made it.

>linux graphics stack is a prime example
The Linux graphics stack is actually pretty decent. What concrete criticisms of it do you have?
Also note that Nvidia's dogshit driver is not part of the Linux graphics stack.

GPU programming is pretty shitty. You write a shader where you pass "1" as an argument to a function; on AMD it looks fine, but on Nvidia you get a pink texture. The reason is that Nvidia's compiler wants number literals written as "1.0", but you don't get any warning or anything about it.

All graphics stacks are big and bloated
Not that I agree with him that that's a problem, but they are

Sure, there is a lot of code, because there are a lot of drivers.
I'm vaguely familiar with how Mesa is structured internally, and obviously they try to share as much code as possible when it comes to implementing OpenGL/EGL/Vulkan or whatever graphics API, but it does lead to lots of layers of abstraction.

Lol regardless it's a pretty stupid derailment of a really interesting video

But it's an engineering topic

I agree modern programs are beginning to take "oh, just leave it, modern hardware is good enough that the user won't feel the difference" a bit too much to heart. But what he says in the video is outright ignorant. We have "bloated" OSes because they're general-purpose: they multitask and support most of the hardware out there. This is a convenience, not a hindrance. He forgets that as hardware got better, our demands got higher with it.

Also, his arguments are completely bogus. Services like Twitter usually go down because of too much traffic, not because of some software bug. A significant portion of the lines in a modern OS is there for hardware support, as pointed out above. And just because a program has many lines of code doesn't make it more prone to bugs; we have unit tests and specifications for this exact reason. Google doesn't ship a new version of Chrome until it has passed enough testing that they're confident it works.

Just dumb.
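To illustrate the unit test point, they're nothing fancy: a made-up C example, with asserts standing in for a real test framework. Each test pins down part of the contract, so a regression anywhere behind the function fails here before anything ships.

[code]
#include <assert.h>
#include <stdio.h>

/* Made-up function under test. */
int clamp(int x, int lo, int hi) {
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}

int main(void) {
    assert(clamp(5, 0, 10) == 5);    /* in range: unchanged */
    assert(clamp(-3, 0, 10) == 0);   /* below: clamped to lo */
    assert(clamp(99, 0, 10) == 10);  /* above: clamped to hi */
    puts("all tests passed");
    return 0;
}
[/code]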

Casey is applying the standards he uses in his own work to places where they don't belong. Even when he has opinions on gamedev he's usually wrong, because he worked in a small subset of gamedev creating middleware, and he believes everything should be programmed like that even though it doesn't actually make much sense

>Sure, there is a lot of code, because there are a lot of drivers.
and that's why I don't consider it a problem
Sure, things could theoretically be simple, but the world is a complicated place full of people doing all sorts of shit and making all sorts of hardware, so realistically you need a lot of "bloat" just to make things compatible