/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?

Last thread:

Attached: 1500175584571.png (1024x768, 384K)

Other urls found in this thread:

godbolt.org
twitter.com/NSFWRedditGif

Ada

nth for nim!

zth for zig

Why would you write anything in any language other than Haskell? You can mathematically prove that you'll get no run time errors in Haskell. No need for "Let it crash" when "It can't crash".

because crashing isn't actually a concern for people who aren't amateur programmers

What did he mean by this?

>no totality
haskell a shit

I tried to read that book but I can't do math at all.

Is it possible to learn one of C, C++, C#, or Java while only being able to do basic algebra at a high school level?

Haskell has given me a newfound aversion to ((parentheses))
It used to just be Lisp I found ugly; now any language which enforces parens around ((arguments)) irks me.
Are there any usable languages which don't suffer from ((parenthitis))?

Attached: 1447277169577.jpg (425x419, 43K)

yes

Elm, but it's webshit.

Runtime errors exist to inform the programmer, they aren't bad
The only people who think runtime errors are bad are people who don't know what they're doing, and program by tweaking things until it "works" and then get upset when they see a runtime error because they don't know what went wrong

nim.

F#

lisp
just use braces or square brackets instead

basic

You will suffer paren hell in elm, trust me.

Can't spell minimal without nim
Can't spell trust without rust
Can't spell cuck without c
Can't spell God without go
Can't spell black'd without the d

Error handling is an anti-design pattern though. You shouldn't have to care about errors.

Yeah, you can learn C# and Java. But you're giving us pajeets a bad name. Why not be a webdev hipster? You'll save us the shame and be more of a generational debate than a cultural one. Or the opposite, I don't know or care.

brackets and braces are just parentheses in disguise

Attached: 1446600213460.jpg (1280x720, 168K)

You're an idiot, and compile time errors are infinitely better than runtime errors.

That's too bad, I thought it was basically Haskell for webshitters.

>I've never written a program that has to handle user input
as expected of a Haskellfag

it is, but Elm is just a Haskell subset.
Besides that and the sort of fucked ecosystem right now, it's still a really nice language if you want pure frontend.

subset, so it takes on the same problems as haskell*
well besides custom operators, because Evan is a fucking autistic faggot.

>he thinks runtime errors are used in the place of compile time errors

How do you even have a "pure" frontend with web dev when the entire goal of something like Elm/JS is to modify the DOM? Otherwise you would just stick to HTML/CSS only.

do your error handling in an option lol, exceptions are twentieth-century crap

The more runtime errors you can turn into compile time errors, the better.
You sound like some kind of low IQ Java/C#/Python shitter.

modify in place, only re-render what's changed. Same concept Haskell and other langs use to emulate mutation with copies of variables with new values.

Attached: 1492857006680.jpg (1052x1342, 402K)

Java has far too many exceptions. It's an abused feature that is sometimes used to substitute for return codes.

CHRIST I finally fixed the C compiler's shit. Don't even ask me why or how it's working now. I just know that it's fixed.

>calling libc's write

Call linux directly faggots

/* example.c
 * gcc -ansi -ffreestanding -Wall -Wextra -Wpedantic -Os -static -nostdlib -s -o example example.c
 */

/* Raw x86-64 Linux syscall: number in rax, args in rdi/rsi/rdx;
 * the kernel clobbers rcx and r11. */
long system_call_linux_x86_64(long number, long _1, long _2, long _3) {
    long value;

    __asm__ volatile ( "syscall"
                       : "=a" (value)
                       : "a" (number), "D" (_1), "S" (_2), "d" (_3)
                       : "rcx", "r11", "cc", "memory");

    return value;
}

long write(unsigned int fd, const char *buf, long count) {
    return system_call_linux_x86_64(1, fd, (long) buf, count); /* SYS_write */
}

void exit(int code) {
    system_call_linux_x86_64(60, code, 0, 0); /* SYS_exit */
}

void _start() {
    static const char hello_world[] = "Hello, world!\n";
    write(1, hello_world, sizeof(hello_world) - 1);
    exit(0);
}

It's still 8.7k for some retarded reason but it has exactly 0 dependencies.

>has exactly 0 dependencies
>relies on a linux defined syscall
Cniles.

>It's still 8.7k for some retarded reason
Post output of objdump

Cniles are retarded but that program links with zero libraries, thus zero dependencies.

>can't depend on linux kernel

You might as well just kill yourself

$ objdump -d example

example: file format elf64-x86-64


Disassembly of section .text:

0000000000401000 <.text>:
401000: 48 89 f8 mov %rdi,%rax
401003: 48 89 f7 mov %rsi,%rdi
401006: 48 89 d6 mov %rdx,%rsi
401009: 48 89 ca mov %rcx,%rdx
40100c: 0f 05 syscall
40100e: c3 retq
40100f: 89 ff mov %edi,%edi
401011: b8 01 00 00 00 mov $0x1,%eax
401016: 0f 05 syscall
401018: c3 retq
401019: 31 f6 xor %esi,%esi
40101b: 48 63 ff movslq %edi,%rdi
40101e: b8 3c 00 00 00 mov $0x3c,%eax
401023: 48 89 f2 mov %rsi,%rdx
401026: 0f 05 syscall
401028: c3 retq
401029: b8 01 00 00 00 mov $0x1,%eax
40102e: 48 8d 35 cb 0f 00 00 lea 0xfcb(%rip),%rsi # 0x402000
401035: ba 0e 00 00 00 mov $0xe,%edx
40103a: 48 89 c7 mov %rax,%rdi
40103d: 0f 05 syscall
40103f: 31 f6 xor %esi,%esi
401041: b8 3c 00 00 00 mov $0x3c,%eax
401046: 48 89 f7 mov %rsi,%rdi
401049: 48 89 f2 mov %rsi,%rdx
40104c: 0f 05 syscall
40104e: c3 retq

It most certainly has a dependency if it can't function without the linux kernel running.

Runtime errors are for when the user inputs invalid data, you can't turn them into compile time errors because you don't have the user input at compile time

Wait a second. I count 5 syscalls. What the FUCK is going on

GCC is inlining your functions. 60% of that code isn't reachable because it inlined everything in _start()

No one said anything about user input and you're arguing something completely irrelevant.
Runtime errors should be reduced as much as possible and turned into compile errors whenever possible if not completely eliminated entirely by good abstractions.
Keyword: as much as possible

That has nothing to do with why there are 5 syscalls

>No one said anything about user input
I did, because that's the correct use case for runtime errors

>I did
I know you did so fucking stop it because it's off topic.

What the fuck? Why the fuck would it do that? I expected it to inline the system call assembly into the wrapper functions, and maybe the wrappers into _start, giving a total of 2 syscalls in _start.

Looks like it inexplicably kept the standalone syscall function, inlined the syscall function into the wrappers AND kept those as well, and then created the optimized code I just described and put it in _start too.

>start address 0x0000000000401029

Why wouldn't gcc delete that useless code, I even supplied -Os

ruby / elixir / javascript (if one argument)

> I expected it to inline the system call assembly into the wrappers functions, and maybe the wrappers into _start, giving a total of 2 syscalls in _start.
That's exactly what it did.

>Why wouldn't gcc delete that useless code, I even supplied -Os
idk get cucked?

btw you can use: godbolt.org to easily see what it will produce

It's not off topic, you're saying runtime errors should be avoided and I'm saying no they shouldn't, they exist for that reason

>output is the same even with -O3

lol

I have no idea what to say

-O is just a suggestion ;) remember that.

this is what haskelltards actually believe


virgin FP langs:
>if there's any error, you get an empty option! if it worked, you get the thing you wanted, except it's wrapped so you have to unwrap it first!

chad adult langs:
>if there's any error, you get what went wrong in whatever delineation you want. you can treat all errors the same or know exactly that the user tried to put letters in their integral value. if it worked, you get just the thing you wanted without any bullshit added

Runtime errors should be avoided
does not equal
Never EVER use runtime errors
retard

It is OBJECTIVE FACT that compile time errors >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> runtime errors and if it's possible to turn a runtime error into a compile error, it absolutely should be done.
You are arguing for something completely fucking stupid. You are LITERALLY saying it's better to waste CPU cycles at runtime doing something the compiler could've done.
Please never touch a computer ever again.

False

>if there's any error, you get what went wrong in whatever delineation you want. you can treat all errors the same or know exactly that the user tried to put letters in their integral value. if it worked, you get just the thing you wanted without any bullshit added
this is literally what an option is though? mad!

I like to think of designing software in the same way Network engineers design their networks.

Attached: file.png (508x488, 187K)

They shouldn't necessarily even be avoided, having lots of runtime errors for user input is good because you can tell the user exactly what they're doing wrong
Even disregarding user input, making your program deterministic enough to catch everything at compile time can mean writing a less efficient program, so it can be better to let it crash than reorganize it in a way so that it doesn't
This fear of runtime errors seems to come from people who haven't programmed much and don't know what their program is doing

>you get an empty option!
No you get result type in the error state, which contains the error information.
>except it's wrapped so you have to unwrap it first!
Not an issue if your language isn't 100% total and utter trash.

I'm going all SDA
t. >70k port network

just shovel it in a kafka topic lol

Attached: kafka-organization.png (307x371, 18K)

Well, I didn't mean literally make a server run to validate your input and send it to you. Just to separate code that is pure or can be proven from code that is IO and must validate and potentially report issues.

>except it's wrapped so you have to unwrap it first!
you mean check if the return was an error or not? are you doing something bad like not checking your returns because your shithouse lang lets you be retarded?

fedbook.net the ajax is killing me

You literally can't even understand my argument.
I'm saying runtime errors should be turned into compile errors where they can, and even better make them not even possible with well designed abstractions. Such as how RAII and iterators eliminate whole classes of memory errors.

WE
ARE
NOT
FUCKING
TALKING
ABOUT
USER
INPUT
SPECIFICALLY
HERE
How many fucking times do I have to say it?

If the thing is 100% runtime dependant then no shit you need to check it at runtime.
That doesn't mean turn everything into a runtime error like you seem to be fucking implying.
The user is not fucking interested in a "Null pointer dereference exception", the null pointer deref could've been prevented at compile time. It has fucking nothing to do with what the user input.

>you get the result type in the error state in an option

i know what an option is you lying little bitch dont you try to trick me

>hurr durr anyone who isnt completely fucking kool aided by the FP cult probably doesn't know anything about muh REAL FP LANGS haha xD ill just make shit up

the FP brainwash is real

no, the return is just the return. you handle errors at whatever call level you want. and, yes, you can even *not* check them if you want, a fact that however much you screech about is strictly an upside rather than a downside. but do go on on that thought. why is it BAD to empower the programmer to allow a crash in a situation where there is nothing meaningful left for the program to do?

How do we get rid of OOM runtime errors?

Attached: animu.png (149x148, 58K)

more memory

don't run the program

and like I already said, even disregarding user input it can be preferable to have a runtime error instead of a compile time one because making your program more deterministic can mean making it less efficient
I've explicitly written code that could crash because it ran better than code that couldn't

>i know what an option is you lying little bitch dont you try to trick me
Yeah and I'm saying if you wanted error information you use a result type. You seem to be implying only option types exist.
>the FP brainwash is real
I'm not even a functional programmer, you're just simply retarded.

this
dont allow any variants so the program is resolved at compile time, you don't even need an executable

>running the compiler

>why is it BAD to empower the programmer to allow a crash in a situation where there is nothing meaningful left for the program to do?
how does the programmer know there is nothing meaningful to do? maybe some other thread was in the middle of a crucial task and you just killed its parent and compromised consistency
and even if you KNOW that there is nothing happening, why not exit with an error code instead?

>to have a runtime error instead of a compile time one because making your program more deterministic can mean making it less efficient

hearsay

Stop using Haskell for number crunching

Try writing a real program that isn't some CRUD webapp

I'm not implying shit you disingenuous cunt, post I replied to said option not me

>if you wanted error information use a result type
better hope you want to handle the error at this exact level and with the exact level of granularity that whoever wrote the function anticipated!!! oh wait no, this particular function just returns an option. guess we can't distinguish between invalid user input and an IO error. oh well!

Building an ERC20 wallet (non-browser). Anyone know where I can check out existing security audits for eth wallets? Would prefer the auditors not think I'm completely retarded when I deliver

> it can be preferable to have a runtime error instead of a compile time one because making your program more deterministic can mean making it less efficient
Sorry, but this is just wrong. A compile time error is more efficient than a runtime error, because it is caught at compile time. A runtime error must be caught at runtime, which means doing checks every time the error causing code might be run.

If you're worried about the performance of doing validation at runtime rather than just letting things break, then you are a moron.

>how does the programmer know there is nothing meaningful to do
if not me, who the fuck else would?

>maybe some other thread was in the middle of a crucial task and you just killed its parent
maybe there are still meaningful fucking things to do then you idiot

you are basically saying "what if you just wrote shit code where you crashed the app when you shouldn't" well gee I guess I'd be a shitty programmer then, lucky thing I'm not actually retarded though

You have no idea what you're talking about.

>better hope you want to handle the error at this exact level and with the exact level of granularity that whoever wrote the function anticipated!!! oh wait no, this particular function just returns an option. guess we can't distinguish between invalid user input and an IO error. oh well!
Can say the exact same thing about a C/C#/Java/Python programmer returning only a bool on failed user input.
What's your point? Retarded people can write stupid code? We all already knew that.

>if not me, who the fuck else would?
the person using your library or your colleagues running your code elsewhere would appreciate it if it didn't just crash the application

>"what if you just wrote shit code where you crashed the app when you shouldn't" well gee I guess I'd be a shitty programmer then, lucky thing I'm not actually retarded though
lol, this is a funny delusion

Attached: kokoro_laf.gif (480x270, 414K)

>A compile time error is more efficient than a runtime error, because it is caught at compile time
Not if the code you have to write for it to be caught at compile time is less efficient than it otherwise would be. I don't think you understand that you have to program things in quite a specific way for them to be provable

Some people do work where performance matters, hard to believe I know

>Some people do work where performance matters, hard to believe I know
You missed the entire point of my post. Congrats on not reading. It's not the 80s anymore. Simple validation isn't going to kill your performance. There are far worse things that will.

no, it's not the same. If I want to handle the error X layers up, I just handle it X layers up. If FP dweebs want to do that, they have to refucking write everything in between those two points to pass around the possible error state as well.

...why would my library crash the whole application? is this just a complete non sequitur or is there actually a vague thread holding whatever point you are trying to make together

>I don't think you understand that you have to program things in quite a specific way for them to be provable
This is a limitation of the language and implementation. Languages like Haskell and Rust provide mechanisms that make it much easier to prove those properties.

I think you missed the point of mine
I have a significant algorithm that could be written in a way that would never cause a runtime crash, but because there's an unsafe way of writing it that is faster I prefer the latter

>...why would my library crash the whole application?
If your subroutine crashes from an unhandled error, then it's going to take down the calling thread with it and maybe crash their program.

LOL how delusional are you

>This is a limitation of the language and implementation
Partially, yes, but you're never going to have a provable language be as powerful as an unprovable one (unless being provable is your idea of power)

hey guys let's have some vague arguments with too little context

>I want to handle the error X layers up
Then do so
>they have to refucking write everything in between those two points to pass around the possible error state as well
Not in good languages they don't.

Yeah dude, I got that. What I'm trying to get you to answer is: why would I ever do that?

Your example doesn't hold up. If you have an algorithm that operates on input X and you can guarantee at compile time that it will only receive input X, then you can write an optimal algorithm that has no unsafe behavior and does exactly what you want, because it's guaranteed by the input at compile time.

Unless you're relying on undefined behavior, in which case LOL.

True, but it's much better than giving up and letting anything happen whenever.

Xd