/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?
Last thread:

Attached: 1518509913643.jpg (500x333, 36K)


J!
pascal =: +/@:(28&^@:|.@:i.@# * }:@(1&,)@(*/\)@:>: * -: * >:)@(7&#.^:_1)

What are some essential books for learning Prolog?

>rust

Attached: memoryleak.png (2058x747, 924K)

He's following in the footsteps of Romero

I'm looking to get into Coq and Prolog, please help me, what do I need???

Is there a benefit to marking parameters as const in C? I've made a habit of doing this if I'm not modifying a parameter within a function and I don't know if it's worth the time or not

>getting into prolog
yeah nah that won't happen

That's more of a C++ thing.

Variables in C and C++ should be const by default.

Why not, Prolog looks fun.
I've already dabbled in Haskell and Scheme.

this

C and C++ are inherently procedural. It would only make doing anything useful a pain in the ass.

Marking parameters as const in C is essentially useless in declarations, since a top-level const qualifier on a parameter isn't part of the function's type.
int f(const int x);

and
int f(int x);

declare the same function.

However, it can be useful in function *definitions*, particularly longish functions, to avoid accidentally changing those parameters inside the body.

if I get a function pointer to a virtual function with a hack, do I need to explicitly make sure that it's __thiscall?

you know, there's that prolog honeymoon phase that everybody goes through where they're amazed how unification essentially gives you a free execution environment for predicate logic.
then you start doing stuff beyond basic predicate logic and you start to realize that reverse evaluation basically never works where you need it (and it's always unintuitive why it doesn't work) and that prolog-people essentially subdivide all their predicates into two classes: one class to evaluate regularly, one (or more) to evaluate in reverse.
at this point, the whole benefit of prolog is lost and you start investing way more time than you would in a regular programming language.
and then somebody introduces cuts and the like to you and everything gets even worse.

fuck prolog, except for prototyping things that it's good, but super slow at (basic predicate logic, type systems, etc.).

>msvc
yikes

Should I finish the work project I'm working on or just write some haskell projects for fun?

So I should avoid it then?

work project, then haskell

Is there some website or other resource which catalogues the typical/most common short options in CLI tools?

e.g. like -v for verbose, -f for force and so on

>theres only 1 person that isnt a weeb
lmao

no, learning it is valuable because it's a very cool use case for unification, i just wanted to provide my rationale for why you likely won't 'get' prolog to the extent that you can use it for general problems.
prolog is deceptively simple in that it's easy to understand how it works internally, but debugging (and correctly writing) programs is still extremely hard, especially since all you get is either a non-halting program, a program that is *extremely* slow (in the sense that you can't distinguish it from a non-halting program) or a bunch of unresolved symbols where you expected a sensible output.
oh, also, it's common that after writing prolog for a while, you have trouble picking up a regular programming language again for a short while because you forget how to think about executions in normal languages.

if the halting problem is undecidable in regular programming languages, it's super-undecidable in prolog.

don't shit the thread pls

man [command]
[command] --help

capping your posts, thanks for the input

You can change the OP post style all you want, you can't change the fact that you're retarded and you always start off the thread in the same way by quoting posts in the previous one whatever man.

O B S E S S E D
B
S
E
S
S
E
D

I mean globally, not for a specific command. i.e. I'd like to know (search for) what "-N" or "-Z" tends to be used for in various programs. Kind of like there are sites like fileinfo.com where I can search for what different file extensions tend to stand for and what programs use them.

I guess I can just try to dig through global apropos via man -K though that's just limited to what I have installed.

how hard is C# to learn for someone who has no experience in programming?
I don't know shit about programming, but I'm interested in picking up Unity

You'll live.

Why do people on this board whoreship C?

Fuck off Klabnik.

Because it's fast. They think they can get the most power out of their hardware by using it.

>I mean globally
No such thing. You have to thank the Unix philosophy™ for that.

Obsessed

If I know Python, should I learn Haskell if I'm interested in back-end? I'm following a shitty github roadmap and it says to pick a functional and a scripting language.

Alternatively I can just learn C, I'm a freshman in uni so I have time to not create useful stuff.

What a punchable face

>pick a functional and a scripting language
Python is the scripting language. FP is nice, but you need to work with OO for around 3-5 years to appreciate what it's doing differently and how useful it can be.

Just learn C++. You can get a job with it and it integrates seamlessly with Python, so it's a natural complement. Once you've been working for a while, you'll have more of an idea about what to pick up for FP, be it Haskell or CL or something else.

Nim

stop posting

I don't think Haskell encourages readable code.

I agree, it encourages unreadable one liners. It's too powerful for its own good.

Every programmer should know C.

Those of you who are diagnosed with ADHD, how do you manage to get started and stay focused?
I've switched up my toolset so much, I know the very basics of so many things, but I still don't know how to make a small project from scratch and I really need to get to that point.

do challenges for fun, they give me the same dopamine rush as a fun videogame

justify that statement

Are flowcharts still used?

What sort?

DFDs are still used in their niche (e.g. data warehousing).

You should check out miniKanren.
Prolog has those "extra-logical" predicates that are impure and cause all those inconsistencies you talk about. miniKanren doesn't, it's basically the Haskell of declarative/logic programming, so e.g. order of declarations really do never matter (unlike Prolog), you can run anything in reverse, etc.

The generic diagram for representing algorithms.

Sure, though pseudocode is more useful/common as it always was.

Are Python dictionaries suitable for a cache of a hundred thousand entries? The key is a short string and the value is a timestamp and integer pair.

*Flowcharts being essentially pseudocode for non-programmers, i.e. something you may want to stick into a technical documentation addressed to non-developers (say as a project deliverable documenting your process) or possibly in a publication in a journal that doesn't directly deal with computer science (those would favour a mathematical treatment followed by pseudocode).

That sounds like an interesting project to implement, a database of single-letter flag meanings. The context of CLI tools can differ greatly, so there's no universal standard, but as long as you make it kind-of mnemonic and (more importantly) have a good help message/man page, you'll be fine.

Here's something for a large number of GNU programs:
gnu.org/prep/standards/html_node/Option-Table.html

will check it out, thanks

Should I still read SICP if I hate Jews?

I am using Pandas in Python to open a huge CSV file and doing a lot of stuff within it. I've managed to do everything up until this issue right now.

I have one column named 'Numbers', just for example's sake. This column has lots of repeating numbers (sorted earlier in the code). I need to do something like the following C code:

for i in df['Name']:
    if i == i + 1:
        print("Combo")
    else:
        print("Single")


I need the Combo / Single to be written into a new column.

I know this is the pseudocode to do so but it doesn't work in python. I just can't figure how the hell to actually do it.

I tried using df.apply() but I can't think how.

If anyone could give me a hand it would be great.

Attached: 1549727890047.jpg (1294x478, 115K)

That sounds fun, but where can I find fun challenges to do? Preferably something that feels very gamified.

Shit, that reply is full of errors.
>column named 'Numbers'
>using 'Name' in the code block
>following C code

Just ignore.

Based.

> if (i == i + 1)
This will always be false

Use Nim.

Your picture is so apt to your post.

l = df['names']
i = 0
while i < len(l):
    cur = l[i]
    issingle = True
    i += 1
    while l[i] == cur:
        i += 1
        issingle = False
        print('combo')
    if issingle:
        print('single')
something like this maybe? not very pythonic lol

sorry, forgot a length check in the inner while
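A vectorized pandas take on it, in case it helps: compare each row to its neighbours with shift() instead of looping. This is a sketch on made-up sample data, assuming the column is called 'Numbers' and already sorted like the post says.

```python
import pandas as pd

# made-up sample data; 'Numbers' is the sorted column from the post
df = pd.DataFrame({'Numbers': [1, 2, 2, 3, 3, 3, 4]})

# a row is part of a "combo" if it equals the row directly above or below it
prev_same = df['Numbers'].eq(df['Numbers'].shift())
next_same = df['Numbers'].eq(df['Numbers'].shift(-1))
df['Kind'] = (prev_same | next_same).map({True: 'Combo', False: 'Single'})

print(df['Kind'].tolist())
```

No index arithmetic, no length checks, and it writes straight into the new column like the original post wanted.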

try codewars or leetcode

codewars, it's very gamified, the site basically does everything, you just need to get to the correct output

She doesn't need to. Knowing C is essential for building a good mental model of what your computer is actually doing.

Only if your computer is a PDP-11. Otherwise, this is just bullshit.

Your computer is not a fast PDP-11.

Any python wizards here that could help me? I'm trying to make a class support "if [key] in [classattribute]" syntax to check if a key is in a dict of params inside the class object.

What i have so far is this:
class Params():

    def __init__(self, params={}):
        self.params = params

    def __len__(self):
        return len(self.params)

    def __iter__(self):
        yield self.params


My goal is to do something like this:
params = Params(params={"key": "my_key"})

if "key" in params:
    print("Key exists in params")


Is this shit even possible?

Not really, at best it's a mental model of what the OS is doing.

Yes. "in" is __contains__ if I'm not mistaken. Why don't you RTFM?
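Minimal sketch of what that looks like, reusing the Params class from the question (__contains__ is exactly the method the in operator calls; the None default instead of {} is just to dodge the shared-mutable-default gotcha):

```python
class Params:
    def __init__(self, params=None):
        # None default instead of {} so instances don't share one dict
        self.params = params if params is not None else {}

    def __contains__(self, key):
        # `key in some_params` ends up here
        return key in self.params


params = Params(params={"key": "my_key"})
print("key" in params)  # → True
```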

Working on your try. Not sure if it works.
Thanks a lot.

I should mention though that you're reinventing the UserDict wheel.

I've used Python sets for up to 17 million entries, so a hundred thousand should not be a problem. The best way to find out is to try it, of course.
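For scale, a throwaway sketch: building a 100k-entry dict of (timestamp, integer) values keyed by short strings is quick, and lookups stay O(1) average regardless of size. The key/value shapes here are made up to match the post.

```python
import time

start = time.perf_counter()
# 100k entries: short string key -> (timestamp, integer) pair, as in the post
cache = {f"key{i}": (1549727890.0 + i, i) for i in range(100_000)}
elapsed = time.perf_counter() - start

# lookup cost doesn't grow with the dict
print(len(cache), cache["key42"], f"built in {elapsed:.3f}s")
```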

Yeah, you're right. I've just started diving into operator overloading and I didn't know which dunder method the if statement used. I'll probably rethink the way the class object is used. And thanks for the quick reply.

Thanks, guys, I'll check out Codewars.
Is HackerRank like Codewars and LeetCode?
A few people I know recommended me HackerRank a few months ago, but I forgot to look more into it.

If you're just passing a pointer for evaluation, it's fine, and it can be used to indicate that the function is "read-only". You could go further and write a script that runs through all your code and prints out which functions do and don't take const-qualified pointers/arrays, as a sort of safety marking for your code. It won't change a damn thing in how the code runs, but it would guard against troublesome people who work on your project.

I take medication and supplements. I normally don't program in my spare time but have been lately. At work, dosh, respect, and learning were incentives. For my own project(s), I occasionally think "I feel there's a better way to do something" and ask on Jow Forums or search around the net. Lately I've started defining an additional structure for certain structs, holding const pointers to functions, so I can have a "dictionary" of shorthand functions for my structs which doesn't pollute the namespace (a "private" header declares all my functions and the definition of my structure of functions; in the "exposed" header I extern the structure; and alongside these is a header explicitly for the structures and the prototype of the function structure). It's a bit of manual work I want to try and tidy away with scripting. I program mainly in C89 because I do like the way it does things, even if it has some strictness and verbosity.

For parsing shit I use Perl. I learned it where I worked, and damn does it feel good when it spits out the right shit, even if it takes an almost excessive number of debug runs. C and Perl are great for a "write-debug-write more" style, which is probably why perfectionists will likely hate them compared to a higher-level language which has all the tools they (expect to) want built in or in a library.

In Perl for example I slapped together a binary dumper and partial file analyzer, because in my work I needed to read the binary and glean at least some interpretation of it (the output of our program wasn't correct).

Not sure, but you can't pass a non-const argument to a function if the function doesn't take const. That's supposed to avoid modifying const data through a function.

Right, that's the drawback. Having to cast all your pointers when passing them. What an annoying crock of horse-shit, even if it's safe.

how do I get around to doing this without a recursive function?

Attached: problem.png (606x156, 54K)

You niggas are wrong. In C, you can pass a pointer to non-const just fine to a function that takes a pointer to const. The function simply won't be able to modify the object that is pointed to by the parameter.

Hell, even functions from the standard library, such as puts, take a const char * as a parameter.

Use a loop. That's what your compiler would likely optimize your recursion to in any case.

#include <stdio.h>

int main(void){
    int m, n, c;
    printf("Please enter two integer numbers: ");
    scanf("%d%d", &m, &n);
    while(n != 0){
        c = m % n;
        m = n;
        n = c;
    }
    printf("Greatest common divisor: %d\n", m);
    return 0;
}

how's this?

Attached: unknown.png (707x714, 88K)

What ASCII symbol should I use to express set subsets (i.e. ⊆) in CLI expressions?

I'm using &, |, -, ~, ^ for intersections, unions, relative complements, absolute complements and symmetric differences respectively since they have relatively clear equivalents (with the exception of - I guess), but I'm not sure what to use for subsets.

Is there any serious project (created by someone else than Nim creators) using Nim?

why not [ for subset, { for proper subset, and then have it only parse ( and ) for disambiguation.

if you don't have any other partial orders, just use < and >.

Using logic symbols to denote set operations isn't just convention. For instance, & as intersection makes sense since x∈(A∩B) iff x∈A and x∈B. Similarly, | for union because x∈(A∪B) iff x∈A or x∈B. Now for ⊆. Think of it this way: A⊆B iff x∈A implies x∈B. So you could use something like A->B or A=>B to denote ⊆.
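For what it's worth, Python's built-in set type settles the same question with exactly those spellings, and it overloads the comparison operators for subset tests, which backs the < / > suggestion:

```python
a, b = {1, 2}, {1, 2, 3}

assert a & b == {1, 2}     # intersection
assert a | b == {1, 2, 3}  # union
assert b - a == {3}        # relative complement
assert a ^ b == {3}        # symmetric difference
assert a <= b              # subset
assert a < b               # proper subset
print("all set operators check out")
```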

I think I'll go with < and >, thanks guys.

>Using logic symbols to denote set operations isn't just convention. For instance, &

As far as I'm aware using & to denote ∧ is just a compsci convention though.

That's irrelevant.

Then I don't get your comment which was pointing out the obvious.

It doesn't matter whether you use & or ∧ for logical and, the connection to intersection is the same.

Wut? It makes the most sense regardless of convention

& - literally "and"
∧ - logical and

i dont get it

If you want to go the lattice route instead of the logic route then you should use