Can someone explain to a noob why this is still around even after decades...

Can someone explain to a noob why this is still around even after decades? How come in a fast-evolving field like programming somebody was able to invent something with that longevity?
I'm genuinely curious. Which features or technical details make it so C is still around and widely used?

Attached: c-programming-language[1].jpg (900x500, 41K)

the basics of how a turing machine works haven't changed since Turing described it in 1936, computers are just faster now

fast as fuck boi

The C API is extremely widespread. Every major OS exposes its functions via a C API, so any native program has to use that API.
Every new language that comes out has to have some kind of bindings to the C API to be able to use those OS interfaces as well as many libraries.

It's not about the language itself. C has had its time and it's not really suitable for modern programming anymore. But since the important C API was designed around the C language, C is the best way to use the C API natively.

Attached: Lauergans.jpg (793x595, 70K)

newer programming languages are built on top of it.

Just because technology is more evolved doesn't mean that tried and true methods are obsolete.

the language was in the right place at the right time to be used in operating system development. as a result, numerous low-level apis were made with the language as a requirement. anyone who wanted to get into computers at that time had to know it, and due to how versatile C is, programmers that used it began to employ it when making other programs at the time to the point where it became an industry standard.

Because it's so simple and straightforward. You put numbers in memory and fiddle with them

it doesn't do anything behind closed doors, it just works and is as fast as assembler. Development is a bit slower than in something like Python, but you won't have a cluster fuck of spaghetti code if something bad happens or you want to change something to implement anything else. It is the programming language, even now. It would be better if it were easier or more human-readable while staying as fast, but there aren't any languages like that; they're slower and not as controllable

This

the minimal virtual machine model it imposes turned out to fit nearly every platform
the cost is some behavior that is practically universal but not guaranteed by the standard (e.g. there is no two's-complement enforcement in the spec); ANSI C would even work on machines without linear memory (e.g. those old esoteric Lisp machines)
it completely abstracts away the CPU (except the register keyword), which is partially bad (most performance these days comes from avoiding cache misses and memory accesses, though compiler extensions at least give control over explicit cache alignment) but very, very flexible for porting C to all sorts of archs

Because it's good. What else is there to say?

Unix system calls you stupid baboon.

Attached: 1529564115264.png (414x459, 263K)

Less stupid people than other languages is alone a good reason to make your project in C.

Attached: this_kills_the_hipsters.png (1597x382, 1.12M)

>Less stupid people...
and more memory bugs!

Haha

Blood and souls for malloc!

>t. hipster

Can someone explain to a noob why this is still around even after decades? How come in a fast-evolving field like networking somebody was able to invent something with that longevity?
I'm genuinely curious. Which features or technical details make it so ipv4 is still around and widely used?

Attached: IPv4.jpg (1000x667, 103K)

Have fun with your memory leaks.

>everything that isn't C is a hipster language
Okay.

I don't need more than 4 billion IP addresses for a LAN. The WAN router can be IPv6, but I'd prefer to be able to SSH into my boxes over the Internet with addresses I can remember in my head.

If using a language that is not just easier to use but safer means I'm a "hipster" then I proudly wear that title.

>there is not twos-complement enforcement in the specs
I'm fairly sure the division rounding towards zero was made for one's complement, so in that sense C slightly disfavours two's complement.
Of course, rounding towards negative infinity is neater and often what you'd expect when working with negative integers and modulus.

If only there was a mechanism to map easily memorable strings to IP addresses.

>literally admitting to it

Not OP, but is K&R still good for learning even though it's very old? And is it good for people with no programming knowledge? I know it says it's not at the beginning, but I wanted to hear from people who have read it

Literally admitting that C's simplicity is a debilitating one. Oh, okay.

It's the step above assembly so it's not a pain in the ass and very efficient. It's better suited for very specific applications though

Can someone explain to a noob why this is still around even after decades? How come in a fast-evolving field like peripherals somebody was able to invent something with that longevity?
I'm genuinely curious. Which features or technical details make it so keyboards are still around and widely used?

Attached: keyboard.jpg (569x467, 47K)

fast, portable, cross platform, extensible, moderately easy to learn (compared to, say, asm)... what's not to luv?

Because you can't write a fucking kernel in python, ruby, etc.

> inb4 some asshole "did" it
the shit needs to be adopted, i don't care about PoC that some guy did in their basement

I think what you should remember is that old != bad and new != good, and vice versa. C has proven to be a valuable tool, and while few may still use it to quickly build tools for everyday tasks, as you might in Python (which is a scripting language meant for such things), it or a very near derivative (such as C++) is still the core of the codebase for every major operating system today. Unix and Linux and Windows and likely even OSX to some extent (not a macfag, can't say for certain) all use C extensively because of its merits.
That being said, C is important because it has no magic behind the scenes. The code that runs is code you wrote or added yourself. Because of this it is easy to understand what is going on, instead of treating everything as a black box. C is easy to learn with while still maintaining tight control over the hardware, unlike assembly, because for most purposes it doesn't care about the hardware; your compiler takes care of that for you. Meaning all you need is a C compiler on your architecture, and you can compile (mostly) any code that anyone wrote in C, and if it doesn't compile, you can fix it yourself if you aren't a brainlet.
t. too much effort but it makes me sad to see it get hate for no reason, or because someone can't into pointers

The underlying architecture is the same, that's why. It's like asking why we still drive with a steering wheel when we have gamepads and joysticks.

>Their shit needs to be adopted.
And around for 5-10 years before I trust it for anything that matters to me. If that shit craps out or gets abandoned and all my data is in some virtual file system that this guy made up and didn't document so that I can't recover it, I'mma be pissed.

*rises from the grave*
good languages never die

Attached: ritchie dab.jpg (437x500, 58K)

>fast evolving field
>programming
t. webdev kiddie
books from 50 years ago are still extremely relevant in the programming field

Attached: .jpg (275x183, 10K)

>2018
>memory leaks
How retarded are you to not know how to call free on something you malloc'd? In an age where modern IDEs can tell you that you fucked up, and things like Valgrind exist to detect and tell you that you fucked up.

Oh yes because Java and Go are flawlessly airtight

This. Take sqlite, for example. Since sqlite is written in C, you can use it on pretty much any device easily, and from pretty much any language with minimal boilerplate.
The fact that C doesn't have a built-in garbage collector also makes it easier to use with other languages: different languages handle garbage collection differently, but calling a function to clean up resources can be done the same way in all of them. Less important today, but still important for embedded/weaker systems, is that the language specification isn't too strict. C code runs fast on all devices because, for example, int doesn't have to be at least 32 bits. Modern languages have much stricter specifications, which is both a strength and a weakness: they are more likely to behave the same on all devices, but they run badly on slower devices unless the code is specifically written for them.

It's finished. Like Lisp.

Because it was written as a fairly thin abstraction/generalization above machine operations, and the machines haven't really changed all that much from an instruction set architecture viewpoint; all the changes have been on the microarchitectural side of things, which C never sees. x86 was introduced 40 years ago and we're still using it (albeit updated quite a bit).