Just Nothing

>Just Nothing

Simple, elegant. Why did the programming world have to go in the wrong direction with C-like languages?

Attached: JeffDean_480.jpg (1200x801, 203K)

You seem to have left some details out of your post OP.
what does Just Nothing mean?
What is simple? What is elegant?

what monitor is that?

>You know nothing about computers and their origins
Graduate from high school first.

Learn computer history first. Look up Marvin Minsky and then come back. He gave a big interview about programming languages and how he regrets that Lisp didn't take off while horrible languages did, by which he means C, C++, Java etc.

Haskell is like programming decades into the future, and it's pure joy. Why do people tolerate null-exception-ridden languages?

Haskell removes an entire class of errors just by thinking about possibly missing values in a different way. No need for NULL values. It just works. The single greatest mistake of C is that it has NULL. Can you even imagine the billions of dollars of damage that has done over the decades? And it all could've been avoided. Oh man.
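
A minimal sketch of what that looks like in practice (safeDiv is a made-up name, not from any real codebase):

```haskell
-- Maybe instead of NULL: absence is an ordinary value the type system tracks.
import Data.Maybe (fromMaybe)

safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing          -- no null, no exception: just a value
safeDiv x y = Just (x `div` y)

main :: IO ()
main = do
  -- The caller is forced by the type to handle Nothing;
  -- you can't "forget the null check" and crash at runtime.
  print (fromMaybe 0 (safeDiv 10 2))  -- 5
  print (fromMaybe 0 (safeDiv 10 0))  -- 0
```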

Go look at Backus' Turing Award lecture. He got it for the Fortran compiler, the first compiler that could generate very fast assembly code, faster than what most programmers would write on their own. Instead of talking about how great he was, he used the lecture to tell everyone that functional programming was the future and that we should get rid of the imperative style entirely.

>some nobody with a nothing lang that didn't contribute shit
>x didn't take off because of y!
Fuck off, retard.

Portability was more important. Fuck, is everyone on this board a retard?

promoting the minimalism meme because you are too retarded to handle complexity: the circle-jerk thread

Listen, you swine. Your lack of manners make me sick. I demand that you leave our friendly board right now!

>autists make unfounded claims with insults, I make appropriate retorts
>too much for autists
Fuck off, retard.

That post was a joke, user. It's quite possible that the autismo is you.

True. I'm also drunk.

>cnile accusing others of knowing nothing about computers
Wew.

>le meme non-argument counter-argument XD lelelel
kys

Haskell truly is the language of the future. It took ideas from Lisp and other functional languages, and added a very strong, expressive type system that stops you from doing dumb shit.
I often laugh when I hear people say languages with "dynamic" typing supposedly save time, when in my experience it's the opposite: a strong static type system actually saves you time by refusing to run code that is already wrong in the first place.
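
For instance, here's the kind of dumb shit a strong type system refuses to compile (Meters/Feet are illustrative names I made up):

```haskell
-- Newtypes cost nothing at runtime, but the compiler keeps them apart.
newtype Meters = Meters Double deriving (Show, Eq)
newtype Feet   = Feet   Double deriving (Show, Eq)

addMeters :: Meters -> Meters -> Meters
addMeters (Meters a) (Meters b) = Meters (a + b)

main :: IO ()
main = do
  print (addMeters (Meters 1.5) (Meters 2.5))   -- Meters 4.0
  -- addMeters (Meters 1.0) (Feet 3.0)
  --   ^ rejected before the program ever runs:
  --     Couldn't match expected type 'Meters' with actual type 'Feet'
```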

>Can you even imagine the billions of dollars of damage that has done over the decades?
It actually has. Hoare himself admitted it.
>I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.

I bet you believe your computer is a fast PDP-11.

Handling unnecessary complexity is not a virtue.

No, I believe my computer is faster because C paved the way to progress.

Sorry you’re wrong
Java.based.languages.suck

no, but that's not what these shits are doing here

imperative programming was a terrible terrible mistake. i honestly don't think there is a non-violent solution to the problem at this point. gas the pajeets, paradigm war now.

This is just sad, stop trying to show off. If you had true experience, you would recognize that different problems require different programming paradigms. Functional programming is a mathematical orgasm but when you need to build real life engineered solutions, you need a procedural language.

C has actually hindered the way to "progress", since its abstract machine is fundamentally different from today's architectures.

>when you need to build real life engineered solutions, you need a procedural language.
wrong

Attached: 1542332461878.png (310x315, 382K)

When you need to optimize performance and memory use on an embedded system, lazy evaluation just isn't the way to go, my guy. Sorry but it's true. You might have a nice haskell spam filter but you'll never see it in an automotive control system, for example.
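
To be fair, laziness in Haskell is opt-out. A rough sketch of how you keep memory flat when it matters (sumStrict/sumBang are made-up names):

```haskell
{-# LANGUAGE BangPatterns #-}
-- Lazy foldl over a big list builds a chain of unevaluated thunks;
-- foldl' forces the accumulator at every step, so memory stays constant.
import Data.List (foldl')

sumStrict :: [Int] -> Int
sumStrict = foldl' (+) 0

-- Or spell the strictness out yourself with a bang pattern:
sumBang :: [Int] -> Int
sumBang = go 0
  where
    go !acc []     = acc
    go !acc (x:xs) = go (acc + x) xs

main :: IO ()
main = do
  print (sumStrict [1..1000000])  -- 500000500000
  print (sumBang   [1..1000000])  -- 500000500000
```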

tesla.com/careers/job/software-engineer-functionalprogramming-haskell-37054

well, why the fuck would you even need that shit on a vehicle system, to start with? No car should ever have internet connection. It's unnecessary.
Also, you've never seen any of the shit pajeets just run on the automotive systems. Shit like mongodb running locally just because.
In three docker clusters.
On the same device.
Trust me, haskell spam filter would definitely not be the worst thing you could run in there.
> t. ex-roommate of a dude who worked for a car brand and told me this kind of horror stories every day.

this shit right here. C is not """"low level"""" in any way, and whoever believes that shit should hang himself right the fuck now.
The C approach is to sorta define a common subset of operations all the architectures should implement, which ended up being a simulated PDP-11. And this is a problem, because you can't directly control the contents of the L1/L2/L3 caches.
Same for the speculative execution everybody's raving about: it exists in the first place to stay compatible with shitlangs like C while remaining invisible to them. It's cheaper to implement some retarded fuckery at the architecture level and pretend that X86 achieved a """"massive"""" speedup this quarter than to force programmers to actually parallelize their algorithms.

What we need is a simple modular programming language that defines a common syntax for all the architectures and allows the programmer to load a module for every feature he's got available. For instance, one could load the SMT module, but then would be unable to run his shit on ARM.
Granted, this might reduce the code portability between architectures, but you're fools if you believe that shit written in C that runs on X86 is easy to port on ARM right now. Especially if you got some inline assembly in there.

The only thing embedded systems need is an end-to-end proof of implementation correctness, regardless of the technique or paradigm the programmers use.

1920x1920 1:1 monitor, about 2K$

>Especially if you got some inline assembly in there.
You were doing so well until you went full retard.

You might like reading about Urbit-like platforms if you haven't. Urbit's Hoon is the 'simple modular' language; it compiles down to Nock, a tiny combinator VM.

Self driving ya big dummy

Lmao that's literally haskell tooling to derive C code.

That's true. Good, level-headed response, didn't get triggered by the bait

Holy shit it's this retard again? You're going to shill that cartoon of an article again, aren't you?

Not talking about cleanliness, but I wonder pretty often what languages would be like if a different language became the big one.

Attached: IMG17555970.jpg (800x533, 142K)

I used to look up to this guy as a hero. Half believed the Jeff Dean facts "gcc -O4 sends your code to Jeff Dean", "errors treat Jeff Dean as a warning" and so on.
Until I lurked his twitter feed. I mean I wasn't expecting him to be conservative or anything, and it's not like he's a die hard SJW, but all this "we need more women", "I'm glad NIPS conferences are changing name because there were people making inappropriate jokes" is kind of disappointing.

politics shatter dreams of old

connecting to the internet anything that isn't used for browsing is utterly retarded and should be strictly regulated, because it's a major security risk. In the case of self-driving cars, unless you yourself keep training the central NN model, you don't need a constant connection to the servers; and if you are training it, you're literally giving free labor to the car manufacturer and deserve to get spied on.

C programmers do that too often for it to not be mentioned.

I started; it has some cool concepts, but it's basically an esolang. Does Yarvin even want that shit to be taken seriously?

>C programmers do that too often for it to not be mentioned.
that still doesn't make it the fault of C you dumb ape

>noobuntu

Who's this noob boomer?

-Sent on my Arch Linux

Well, it's the fault of C in the sense that this kind of shit is allowed by the standard. It looks even more retarded when you think about what C was conceived for.

no amount of mental gymnastics is going to change the facts: C is by far the most portable language out there, by virtue of GCC being able to target every architecture under the sun. You're basically arguing against facts here.

>a strong, static type system actually saves you time by preventing you from running code that is already wrong in the first place.

agreed, happens quite often to me using ruby to scrape some website.

>believes c's abstract machine is exactly like a modern computer
>calls people retarded

the fact that it's popular does not imply it's good. We can do better.

the argument was about portability, not popularity. Of all the defects C has, portability is not one of them, and he's retarded enough to try to dispute this.

I'm saying that inlining assembly in the C source, even if you sprinkle your shit with ifdefs and load separate files per architecture, is a retarded hack for something C obviously lacks: being an actual low-level language.

point to the post in this thread that says C is low level, oh wait, you can't because it's just a strawman you came up with to try to shit on C

I didn't say you called C a low-level language. What I'm saying is that a lot of people treat it as such, and as a common language every arch should speak, and it's not very good at either of those.

C exists and is portable across all architecture though, unlike your ideal language, C is about pragmatism and getting shit done, keep whining about it while doing nothing though

I agree about it being pragmatic, but me, you, my dog, the null value, the endless integer/stack-overflow/OOB/memory-related bugs and the recent speculative-execution bugs all know that it's not the best possible solution.

>inb4 make your own
sure, throw several hundred million at me and let's see what I can cook up for you

>recent speculative execution bugs
I knew I'm talking to a retard but do you actually believe meltdown/spectre is the fault of C?

I'm saying that if our software was built on a language that gives you more control over all of the given architecture's features - spectre/meltdown wouldn't happen.

meltdown/spectre is a class of hardware bugs you imbecile, holy shit dude, fucking educate yourself before opening your goddamn mouth

>haskell job
>required to know python
>needs to "appreciate" haskell
>having written haskell is "preferred"

Attached: 1521044373783.jpg (800x437, 72K)

meltdown/spectre exist because at a certain point our processors hit a physical wall optimizing for clock frequency and single-threaded performance, so hardware companies started researching how to run code faster *without* brutally pumping gigahertz into a single core.

Thus speculative, out-of-order execution was born: the processor predicts which way a branch will go and starts executing down that path early, discarding the work whenever the prediction turns out wrong. Now: don't you think it'd be a good thing to be able to control this behavior from inside the language?

I'm not interested in talking about an imaginary language that only exists in your head, you seem to think that meltdown/spectre is the fault of C though, explain how the two are related or shut the fuck up.

This is a bot post you fucking faggots. It's taking phrases it heard around Jow Forums and posting them with stock images.

Absolutely not. No way you should be thinking about this. The model for computation is simple enough: the Turing machine. Trying to think your way around exploits created by speculation and cache hierarchies is an unwinnable skullfuck, particularly when properly terminating strings already seems too hard for some.

Attached: cuck.png (245x320, 129K)

I've never understood how people can be comfortable with such large screens so close.

To leverage modern processors in an optimal way we need to start thinking in a more parallel and concurrent way. The PDP-11 died long ago and it's a poor model to build our algorithms on.
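
A tiny sketch of that, using nothing but what ships with GHC: fork a worker per chunk and collect the partial sums through MVars (the chunking here is just an illustration).

```haskell
-- Concurrent map-reduce over base only: forkIO + MVar.
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

main :: IO ()
main = do
  -- One MVar per worker; takeMVar blocks until that worker has finished.
  boxes <- mapM (\chunk -> do
                    box <- newEmptyMVar
                    _ <- forkIO (putMVar box (sum chunk))
                    pure box)
                [ [1..250000], [250001..500000]
                , [500001..750000], [750001..1000000 :: Int] ]
  partials <- mapM takeMVar boxes
  print (sum partials)  -- 500000500000
```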

this is what I'm talking about: queue.acm.org/detail.cfm?id=3212479
I might not be the best explainer in the world.

>queue.acm.org/detail.cfm?id=3212479
Ah so you were just parroting what you read on the internet without understanding any of it, hint: just because C and meltdown/spectre is mentioned in the same article doesn't automatically make them related, especially not if said article is written by a drooling retard.

Wrong dumb ass... That's an HP and not 1:1. Get some glasses.

>To leverage the modern processors in an optimal way we need to start
Blah, blah, blah, grandiose thinking, blah. SPMD is already a thing. ISPC is written. The per-thread model is a lot like C. Works great. No need to obfuscate needlessly.

Why is this monitor so thick?

Attached: 1467847078182.jpg (1200x1200, 115K)

it is a computer

Attached: 1539591021184.jpg (616x456, 43K)

Gabe Newell looks like shit these days.

What's up, fellow Hasklets? Why don't we stop building useless shit no one in their right mind will ever use for a moment and discuss the superiority of our meme language?

Attached: soyboy.png (42x85, 1K)

C is fine; if you complain about C you are probably a brainlet

How many monitors does Jow Forums use?

Attached: jeff_dean_multi_monitor_.jpg (1200x801, 273K)

>most people colloquially use the term low-level when referring to C
>this means C sucks!

I just woke my wife up by laughing way too much and loud at this. Well played.

"guys look how cool I am"

>the greatest programmer on the face of the earth uses Ubuntu and a single monitor
really makes you think

>Dean and his wife, Heidi Hopper, started the Hopper-Dean Foundation and began making philanthropic grants in 2011. In 2016, the foundation gave $1 million to MIT to support programs that promote diversity in STEM.

FFFFFFFFFFFFFAAAAAAAG

>explain misunderstand
>ZOMG U UR COOL !?!? XD
You have to go back.

>guys look at me, I am drinking right now! I'm not normally this stupid I swear!
log out.

What would the alternative to C be for OS and microcontroller development?

>doubles down on stupidity
>tranny thinks his opinions matter
kys

>alcoholic thinking he matters
kek

>alcoholic
I feel sorry for your mother.

That's Michael Moore dipshit.

whats a computer?

For microcontrollers: Forth, Lua, Rust. Forth and Lua are small enough to run on most microcontrollers. If you need a compiled language, Rust brings you many modern features, but it might be a pain in the ass to get working right now because microcontrollers aren't its primary target. I've seen some projects trying to add AVR support to Rust but they looked incomplete.

sorry guys just got out from under my rock but why do we hate C now?

What is Haskell actually used for though?

Most software on x86 -is- trivial to port to ARM if it's written cleanly. Damn near all of the open source software world on Linux compiles cleanly on ARM, POWER, MIPS, and fucking IBM System z mainframes.

>c is not functional
>c is impaired

>we
You have to go back

>function pointers
>automatically means that C is functional
No. Go read a book or something

fawkin home run, cock sucka!

do functional languages have efficient hashtables yet?
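
Sort of. The usual answer isn't a hashtable at all but the balanced-tree maps from the containers package bundled with GHC: O(log n) and persistent. Actual mutable hash maps live in separate packages (unordered-containers, hashtables). A quick sketch with Data.Map:

```haskell
-- Persistent key/value maps from the bundled containers package.
import qualified Data.Map.Strict as M

main :: IO ()
main = do
  let m  = M.fromList [("c", 1), ("haskell", 2)] :: M.Map String Int
      -- insertWith combines new and old values; m itself is unchanged.
      m' = M.insertWith (+) "haskell" 10 m
  print (M.lookup "haskell" m)   -- Just 2
  print (M.lookup "haskell" m')  -- Just 12
  print (M.lookup "lisp" m)      -- Nothing
```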

it is used in some high-assurance work, since its type system makes whole classes of bugs impossible to write (not literally all bugs, but a lot of them)

a monitor