Haskell BTFO

Attached: HaskellBTFO.png (1263x135, 49K)

is this true?

i want to learn haskell just because it's the most well known pure functional language but is it useless in the real world?

Yes, just compare complex code from Haskell experts against average Java code; Java still wins.

benchmarksgame-team.pages.debian.net/benchmarksgame/faster/haskell.html

Haskell is an eternal research project, not a practical language.

>is this true?
It is true for every language in existence; you simply can't write simple code for a complex problem.
Competent mathematicians wrote about this in the 50s and 60s; see Turing's article on AI or Kolmogorov complexity, for example.
>is it useless in the real world?
yes, FP is NOT superior to "traditional" imperative programming.

Just learn D and use pure, nothrow functions where you can and design by contract to avoid many bugs. Boom, you just got 100x more productive than you would be with Haskell.

The problem is that nobody knows at the beginning of a software project what it will end up being in the end, unless they spend enormous amounts of time and money up front, like they have to do with stuff that they shoot up into space. The consequence is that the awesome solution that you built to elegantly solve the problem you thought you had at the beginning now looks a lot less awesome and elegant because the problem changed on you.

D finally backtracked on GC (at least partially), and it will be a decade before they realize that exceptions were a mistake too.

>Individuals and interactions over processes and tools

Shit developers write shit code, good developers try hard not to write code at all, because once the shit developers touch it, it will turn to shit. Language is orthogonal.

Fuck you, exceptions are great; Go is cancer

Exceptions are cancer in every language that has them

Factually false
No boomers allowed on this board

Attached: 1514140833513.jpg (297x597, 36K)

Okay, now tell me why unpredictable stack unwinding and oodles of spaghetti and boilerplate code are a good thing, rather than returning an optional or a result/status tuple.

Haskell is useless, just learn F# to leverage the CLR and .NET ecosystem while still working within an FPL

Learn J

>if err != nil
>if err != nil
>if err != nil
Wow, so clean and intuitive.

Early return is far more intuitive than try ... catch a thousand nested times.
Also
>His programming language doesn't make null falsy
Pleb

That's why Haskell is superior, monads avoid this kind of boilerplate
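A minimal sketch of the claim above, with hypothetical parseAge/checkRange validators (not from any post in this thread): each step returns an Either, and the Either monad short-circuits on the first Left, so there is no per-call error check.

```haskell
-- Hypothetical validators: each step can fail with a String error.
parseAge :: String -> Either String Int
parseAge s = case reads s of
  [(n, "")] -> Right n
  _         -> Left ("not a number: " ++ s)

checkRange :: Int -> Either String Int
checkRange n
  | n >= 0 && n < 130 = Right n
  | otherwise         = Left ("age out of range: " ++ show n)

-- The Either monad threads the error check for us: the first Left
-- short-circuits the rest; no `if err != nil` after every call.
validAge :: String -> Either String Int
validAge s = do
  n <- parseAge s
  checkRange n

main :: IO ()
main = do
  print (validAge "42")   -- Right 42
  print (validAge "abc")  -- Left "not a number: abc"
  print (validAge "999")  -- Left "age out of range: 999"
```

Whether that beats explicit early returns is the whole flamewar, but the boilerplate reduction itself is real.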

Fuck you and your acronyms.

Attached: 1300213856149.jpg (251x240, 8K)

>D finally backtracked on GC

They didn't 'backtrack' so much as recognize that everyone who actually used the language ended up doing custom memory management, couldn't afford to rely on their garbage-garbage collector, and so they added the option to strip it out officially. It always supported manual memory management though, since it was always intended to work in the same domains as C++.

Monads just obscure the error so now you have no idea why something failed.

Don't learn J, it's a meme language

They don't. The Except monad interface is in fact similar to that of exceptions, but it's actually nothing but a wrapped Either.
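A small sketch of that point, using Control.Monad.Trans.Except from the transformers package (function names here are illustrative): throwE/catchE look like exceptions, but runExcept unwraps straight to a plain Either.

```haskell
import Control.Monad.Trans.Except (Except, runExcept, throwE, catchE)

-- Division that "throws" in the Except monad. Under the hood this is
-- just a wrapped Either: runExcept unwraps to Either String Int.
safeDiv :: Int -> Int -> Except String Int
safeDiv _ 0 = throwE "division by zero"
safeDiv x y = return (x `div` y)

-- An exception-style handler, which is really just handling the Left case.
withDefault :: Except String Int -> Except String Int
withDefault m = m `catchE` \_ -> return 0

main :: IO ()
main = do
  print (runExcept (safeDiv 10 2))                -- Right 5
  print (runExcept (safeDiv 10 0))                -- Left "division by zero"
  print (runExcept (withDefault (safeDiv 10 0)))  -- Right 0
```

So the error isn't "obscured"; it's an ordinary value you can pattern match on at any point.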

Uhhh no
The GC is why the C++/Rust crowd refuses to touch the language. The idea that "everybody" is using custom allocators is completely overblown.
There's a -betterC mode but it's just a subset of the language (no classes, for example). SOME people are using it.
People are happily using the normal language, with GC, no matter what all the manual memory management zealots say.

use D as a betterC with no GC.
what's the excuse now?
C is deprecated and people insist on using it for elitist purposes

That's fine but I was responding to the moronic notion that vanilla, GC'ed dlang is somehow unusable and unwanted by all.

>GC'ed dlang is somehow unusable and unwanted by all

It's not "unusable" and can work fine for tons of things, but a lot of the projects people use D for are latency sensitive and hence there was real demand for more non-gc features.

There are exactly 2 acronyms in my sentence and both are 100% recognizable if you spent more than a day on Jow Forums

Haskell is literally the Java of functional languages. Don't learn it, learn Erlang instead; it's much more fun.

Lack of exceptions is LITERALLY why C is safer than Ada
en.wikipedia.org/wiki/Cluster_(spacecraft)#Launch_failure
You don't use exceptions on critical infrastructure. It's a recipe for disaster.

Almost everything you read in here is ass pulls without any arguments or evidence.

Attached: 1547974834290.png (665x672, 607K)

You're an idiot. It crashed because they tried to fit a 64-bit value into a 16-bit slot, didn't check for overflow, and never tested the code on a simulator. If anything, Ada made the right decision: the conversion raised an overflow exception, and since nobody wrote a handler, the unit shut down rather than keep flying on garbage. If this were C, it would have just filled the 16 bits with junk data and tried to navigate based off of that. More importantly, all modern hardware is exception based. When you fuck up, you get an interrupt, not a return value. You seem to think programming languages shouldn't support the same error handling the physical hardware has.
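The overflow scenario above is easy to demonstrate (illustrative numbers, not the actual flight values): an unchecked narrowing wraps modulo 2^16, which is the "junk data" behavior, while a checked conversion signals the overflow instead of continuing.

```haskell
import Data.Int (Int16, Int64)

-- Unchecked narrowing: fromIntegral wraps modulo 2^16,
-- the "fill the 16 bits with junk" behavior.
narrowUnchecked :: Int64 -> Int16
narrowUnchecked = fromIntegral

-- Checked narrowing: report the overflow instead of flying on junk.
narrowChecked :: Int64 -> Either String Int16
narrowChecked n
  | n >= fromIntegral (minBound :: Int16)
      && n <= fromIntegral (maxBound :: Int16) = Right (fromIntegral n)
  | otherwise = Left ("overflow: " ++ show n ++ " does not fit in 16 bits")

main :: IO ()
main = do
  print (narrowUnchecked 66000)  -- 464, silently wrong
  print (narrowChecked 66000)    -- Left "overflow: 66000 does not fit in 16 bits"
  print (narrowChecked 123)      -- Right 123
```

Whether the overflow should surface as an exception or a returned value is exactly what the thread is arguing about; the one thing everyone agrees on is that silently wrapping is the worst option.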

Attached: 1550701506298.jpg (720x833, 27K)

Java is great though, no one can hate on it without FactoryFactory memes

it does the job but the language is really ugly, boring and tedious

>imagine listening to a fucking redditor. Sage and kys

>>D finally backtracked on GC
>They didn't 'backtrack'

Read the early stuff Bright wrote about GC being better than manual management in every regard, including theoretical performance. It absolutely has not aged well.
Unfortunately GC is already baked into core language libraries and types, so no amount of waving @nogc around can fix everything.

Are you some Windows/SEH babby who magically equates segfaults with catch() blocks?

Software exceptions are language control flow frameworks, not hardware watchdogs. Protip: a single term can have different meanings in different contexts.

Link if you don't mind, I'd like a peek into history