/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?

Last thread:

Attached: DPT.png (934x1000, 389K)

Other urls found in this thread:

youtu.be/sQurwqK0JNE
complexitylabs.io/rational-arguments/
runestone.academy.

First for Rust

Stack languages like Forth and shit. Anons have been shilling them lately, but they're shit languages and memes. Just stick to C.

you are supposed to be shilling zig

Repost from last thread:
What are some good (preferably free and offline) resources for learning Python? I took a shot at learnpython.org but their solutions to their own problems still generate errors, on top of them not explaining everything they want you to do in the exercises. I'd rather have a .pdf with a bunch of exercises that clearly explains what it wants from me and the concepts behind what I'm doing, even if it's a tad dry.

Don't learn Python. Learn C and never use any other language.

And fuck forth

>you will never leglock a janitor
A-at least we have Dart, right guys?

>-at least we have Dart
It's better than forth

Javascript sucks!

It runs faster than Python though and doesn't have the Python2-3 mess.

test

>It runs faster than Python
Everything runs faster than Python.

onetwothreesixeight

not an argument

I think you just don't understand Forth very well...

Forth is shit user

not an argument

There is no reason to learn forth in the first place

maybe so, but you have done nothing to demonstrate that.

It's not as good as C

I was told that python could be a good first language but after looking at a couple /dpt/s I'm not so sure anymore. What are your suggestions?

C

tryin to get a programming job with a non programming resume

Isn't C supposed to be really verbose and hard to learn for beginners though? Or am I being memed on?

haskell

What's so great about forth? It's honestly a shit language.

youtu.be/sQurwqK0JNE

Every language is terrible after looking at a couple of /dpt/s

meme'd. C is great for beginners.

complexitylabs.io/rational-arguments/
Do yourselves and everyone else a favor by learning how to argue persuasively.

There is no reason to learn forth

You could start with C, but don't expect to be really productive. I started with C but I think Python is a vastly better choice for a first language.

jumping from nice, safe, happy linux programming straight into windows .NET cancer, all for a shitty snipping tool knockoff. I hope i never have to include windows.h in another program in my life.
Should've stuck to java xd

To persuasively argue there is no reason to learn Forth, you must first know about the goals of your audience.

Why would anyone need to learn forth?

For example, programming language researchers may have a reason to learn it.

lol

Forth is great for bootstrapping hardware from nothing (no C compiler, no anything) and writing code that does a lot but fits in a really tiny ROM (much slower than hand-optimized or compiler-generated machine code, but also much more compact)
It is extremely unlikely that you need to do either of these in 2019, unless you're building something that will literally go to outer space
Nobody posting here will ever accomplish either jack or shit so it's completely true that none of you should bother learning Forth

print hello test

Why is Python often considered the best first language anyway? Because retards can't indent without language forcing you to do it? Because curly brackets look too technical and scary?

you don't know who posts here

how do you guys handle the swap and undo files generated by vim? surely there's a better way than just shitting them out into the working directory and cluttering everything.

learn haskell

The basics can be learned relatively quickly and reading the code feels relatively natural.

I know everyone who posts here because I'm a literal God. I became that way by writing C. Suck my dick.

Everything I was told is more or less what said

>nobody posting here will ever accomplish anything
>I'm a literal god
I sense a hint of contradiction.

Going through thinkcspy on runestone.academy. Any pointers?

set dir=/tmp//
set undodir=/tmp//

Which will shit them into /tmp/ instead (the trailing // makes vim build the file name from the full path, so files from different directories don't collide). Don't do this on shared systems though.

This. It lets you know exactly how bad things can be.

god of faggots maybe

Maybe I made it sound too resentful about Python. Python is an ok choice. Better to get started than to keep wondering where to start. It doesn't matter much what you start with.

Lisp is the most powerful programming language.

VScode feels botnetty, is it botnet Jow Forums?

join us

isn't it open source? compile it yourself if you're worried. i just opted out of any reporting after installing

trades performance for easy syntax

books for assembly x86?

based

it provides only very limited abstraction, yet lacks wide adoption, and its most-often used runtime compiler is reliant on an underlying C compiler anyway.

I used to tutor a few people at my university in programming. There's this required class for all non-CS/CE/EE majors that covers Python. For most of them, it's their first time writing code. It was just depressing watching them. The problem with Python as a learner's language is that it is extremely permissive: as long as you have your parens matched and your colons in the right place, Python will try to run your code, even if it's fundamentally broken. The lack of compiler/interpreter-enforced structure in Python doesn't just encourage bad habits in experienced developers, it never teaches good habits to newcomers in the first place.

Attached: 1533057448262.jpg (1024x765, 146K)

Devil's advocate: my first professional work was done in C, and I became proficient in vim (not specific to C, but many other langs have IDEs), learned about toolchains, make, linking, loading, how to search through a large codebase (not specific to C), and many other things.

set noswapfile

It's noswapfile. Add set noundofile if you want to kill the undo files too.

Programming is fun!

Attached: f5ed8b61442b291ce583d3c44bb9a4c7.jpg (500x398, 42K)

There are good python habits. People write clean, production, interpreted code every day

Learn Lisp.

I'm messing with the Operator Framework for Kubernetes rn

Agreed

yes

I know lisp. Clojure is great

I have! Clojure is great

Fuck

Can someone explain in simple words what undefined behavior is?

I learned (very basic) python before moving on to C and I don't think it was much of a problem besides having to get used to a few things.

In C it means that you can't reliably know what will happen. The operation that leads to undefined behavior may succeed and spit back the correct return value, may succeed and spit back a garbage return value, the program may crash, or the very spacetime around the computer may cease to exist.

Lisp

Depends on the precise context, but most people are referring to C.
This post () sort of gives an explanation, but I'll expand on it with the other "behaviours" in C, but can apply to other things:

C is a standardised language, which means that there are many different implementations which can claim to be a C implementation. Naturally, all implementations can't behave exactly the same, and it's extremely difficult and limiting to define _exactly_ how every implementation should act in every situation.
So various behaviours are marked in special ways to give compilers flexibility and make the language easier to implement.
There are 3 types of special behaviour:
- Implementation-defined behaviour:
The implementation is given a choice for what they can do, and they must pick one consistently. For example, a compiler can choose the size of 'long' to be 32 or 64 bits (or even another value), but it must always be that.
- Unspecified behaviour:
The implementation is given a choice for what they can do, but it's entirely up to their discretion how they do it. Any real program is going to hit unspecified behaviour, but it should not rely on any particular outcome.
For example, it's unspecified which side of the + operator is evaluated first,
so with the code
#include <stdio.h>

int foo(void) {
printf("foo");
return 1;
}

int bar(void) {
printf("bar");
return 2;
}

int main(void) {
printf(": %d\n", foo() + bar());
}
a compiler can rightfully output code which is going to print "foobar: 3" or "barfoo: 3".
- Undefined behaviour:
Basically, this says that your code is invalid. The compiler is under no obligation to do _anything_.
You might wonder why they might have this kind of behaviour in a language, but it makes the language a lot easier to implement (some situations are basically impossible to detect at compile time), and compilers can optimise on the assumption that UB never happens.

programming is the best desu

I'm not criticizing interpreted languages. I'm criticizing Python specifically.
Python is terrifically bad for newcomers because it allows you to do stupid shit. Novices do stupid shit. Not only that, a newcomer also isn't going to be good at grokking a stack trace. Like I said, I've witnessed this firsthand with the people I've tutored. Its lack of structure and emphasis on runtime errors makes it incapable of actually teaching good habits. Python does not enforce its own best practices, not even implicitly through warnings.
If Python is bad for learners, it's worse for production use. The language, again, is highly centered on runtime errors and a "write fast, debug when it crashes" philosophy. The end result is hastily written, unmaintainable, non-performant, and fragile software. I should know- I spent a month refactoring the Python code of some shitty startup my employer bought out. There's a reason they were going bankrupt when we bought them.
The "go fast and break things" philosophy runs rampant in our industry at the moment, and Python is leading the charge in many ways. While Python's package management is second to none, and I can't criticize pip, Python's combination of structureless programming and stellar package management is a poisonous combination. It not only permits, but actively encourages gigantic and unnecessary dependency webs, and the avoidance of actually meeting design requirements because "X API doesn't have that feature."
Sure, there are good Python habits. There are also good COBOL habits, good Simula habits, and probably good Brainfuck habits, too. That doesn't make any of those languages good, and it doesn't make any of those languages a sane or productive choice for production software.

Attached: 1436466816362.png (760x720, 439K)

If I malloc memory to a variable defined in a function, but the function doesn't return that variable, can I somehow free() the memory in main()?

C is "verbose" because the language spec is very, very simple.
This means you have to write a lot compared to other languages that predefine those functions for you, but it also means *you are learning a shitload*, which is the damn point.

Unless you have a reference to that memory block somewhere else, no, that's a memory leak.

Queen of COBOL, IBM books, IBM website, >tutorialspoint.
Bear in mind most COBOL stuff is hacked together year to year, revision to revision, hence you'll rarely find a "let's make a back end in COBOL!" tutorial. There's not much reason to learn COBOL other than to shitpost with it, for its historically and academically valuable insight, and/or because you're already working in finance and covering your ass.

no, you should free that memory inside the function where it was allocated. If that is not possible, there are a lot of options.
You could introduce a secondary "cleanup" function, keeping records of your allocations which will then be used by the cleanup function to free the memory. e.g. somefunc_cleanup()
You could also take a pointer to a pointer as an argument of that function, and write the address of the heap allocation through it. Then, an external function can free that data. e.g.
char somefunc(int **freemelater, int someothershit)
{
    *freemelater = malloc(...);
}

I created a dynamically allocated char *string and tokenized it into a char **list of strings, all inside a function.
The function returns the char**.
I could scan the string input in another function and then pass the string into the tokenizer function as an argument, but let's suppose I don't want to do that for one reason or another.

This was a massively overcomplicated suggestion

I feel like the criticism you raise here could be raised against all other languages ever. No language prevents you from being stupid, nor does it inherently enforce its own best practices. Dependency hell is not unique to Python. Unmaintainable code is unmaintainable in any language.

I agree that since it is not compiled you open yourself to runtime errors with no compile-time checks, but I won't revive the statically-typed vs dynamically-typed debate. A solution to getting around unexpected runtime errors is to have a robust test suite.

I would point to the large amounts of widely adopted and well written python code that exists

t. Write compiled and interpreted code professionally

which one? Those are both pretty run-of-the-mill ways of doing it, assuming the allocation absolutely must live longer than the function's scope.

A pointer to pointer argument is usually good for functions that may want to use realloc on the argument. getline would be a good example.
Otherwise you may as well just do it as the return value.

** is fine, building your own GC system is insane.

If you have the char** you have a reference to any of the char* stored within it and can free at your discretion.

>building your own GC system is insane
Who said anything about garbage collection?
getline() will just reuse and resize the argument given as needed, as an efficiency thing.

Im referring to the suggestion of a secondary record-keeping-and-cleanup function.

I agree that you can write shitty code in any language, I won't deny that. That same startup I mentioned also had smaller codebases in C, C#, Java, and Objective-C, and each one was a complete mess. I myself have written stuff I'm not proud of. However, I believe that Python is uniquely prone to these problems, for reasons I already mentioned, mostly tracing back to its lack of structure, not only in its type system (or lack thereof), but the very structure of the language.
Part of it does come down to statically-typed vs. dynamically-typed languages. At the end of the day, types (properly-implemented ones, at least) allow for provable behavior(s). The realm of behaviors any given piece of code can have is constrained by the types, and thus the behaviors are more clearly defined, and more bugs can be caught at compile time, which is the very problem with Python you mention.
t. also write compiled and interpreted code professionally. Again, this isn't a debate of interpreted vs. not interpreted (I do like some MIT Scheme), it's a debate of Python vs. literally any other modern programming language. Unlike most other dynamic, interpreted languages, Python also lacks strong semantics or strong logical flow (see: assign anything to any name whenever you want), and thus will allow broken, buggy, and flawed code to run without complaint.

I was operating under the assumption that other user was probably using the return value for something else. The solution I presented was meant to be as general as possible, insofar as it can be used regardless of whether the function has a return value or not.

dumb question lads

what's wrong here? the console says "You have dealt" and nothing else

Attached: damage.png (541x155, 7K)

im guessing you tried a format string. C#'s Console.WriteLine uses {0}-style placeholders rather than printf's %d, so you want something like
Console.WriteLine("You have dealt {0} damage.", damageString);

naisu it worked, thanks.

Ok, so I realized that I am a huge fucking retard because I couldn't implement Heap's algorithm.
Any good book on algorithms that will help me with this?

Earlier I was asking for advice on creating a mini project in C++, an image compression program. I'm new to coding and haven't taken data structures yet. Would that kind of program be out of my ability range?

depends what kind of compression you're doing
basic compression is easy

like I'm trying to do something that acts like a zip file?

yes but it depends what algorithm you want to use
there are really simple ones and really complex ones

where could I find info on these algorithms?