/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?

Last thread:

Attached: BPS.jpg (853x480, 198K)

double compile!

Attached: 1548990183441.jpg (583x580, 78K)

Reminder that pointers are not arrays, and that C is poorly designed because it erroneously conflates the two concepts.

Reminder that Trump knows more about technology than anyone, which includes all of Jow Forums.

Attached: proxy.duckduckgo.com.jpg (474x315, 20K)

assembly itself is actually easy
there are only so many opcodes
the challenge comes from doing anything worthwhile with it

programming is a form of magic and not technology

This is the kind of code that people who shit on OOP write.
typedef struct {
	unsigned long pix;
	XftColor rgb;
} Clr;

typedef struct {
	Cursor cursor;
} Cur;

typedef struct {
	Display *dpy;
	int ascent;
	int descent;
	unsigned int h;
	XftFont *xfont;
	FcPattern *pattern;
} Fnt;

typedef struct {
	Clr *fg;
	Clr *bg;
	Clr *border;
} ClrScheme;

typedef struct {
	unsigned int w, h;
	Display *dpy;
	int screen;
	Window root;
	Drawable drawable;
	GC gc;
	ClrScheme *scheme;
	size_t fontcount;
	Fnt *fonts[DRW_FONT_CACHE_SIZE];
} Drw;

typedef struct {
	unsigned int w;
	unsigned int h;
} Extnts;

/* Drawable abstraction */
Drw *drw_create(Display *, int, Window, unsigned int, unsigned int);
void drw_resize(Drw *, unsigned int, unsigned int);
void drw_free(Drw *);

/* Fnt abstraction */
Fnt *drw_font_create(Drw *, const char *);
void drw_load_fonts(Drw *, const char *[], size_t);
void drw_font_free(Fnt *);
void drw_font_getexts(Fnt *, const char *, unsigned int, Extnts *);
unsigned int drw_font_getexts_width(Fnt *, const char *, unsigned int);

/* Colour abstraction */
Clr *drw_clr_create(Drw *, const char *);
void drw_clr_free(Clr *);

/* Cursor abstraction */
Cur *drw_cur_create(Drw *, int);
void drw_cur_free(Drw *, Cur *);

/* Drawing context manipulation */
void drw_setfont(Drw *, Fnt *);
void drw_setscheme(Drw *, ClrScheme *);

/* Drawing functions */
void drw_rect(Drw *, int, int, unsigned int, unsigned int, int, int, int);
int drw_text(Drw *, int, int, unsigned int, unsigned int, const char *, int);

/* Map functions */
void drw_map(Drw *, Window, int, int, unsigned int, unsigned int);

Totally NOT seeing an OO pattern here, right?
Also
>abstraction
>abstraction
>abstraction
Are they trying to convince the reader that those "abstractions" are effective, or themselves?

Attached: 1523336729860.png (549x560, 258K)

The kernel is full of badly coded manually managed OOP. Cniles are the biggest hypocrites.

This, and anyone who denies this is nothing but a contrarian autist who tries too hard to look smart on a Mongolian forum.

Linux does use lots of OO methodologies, but that code is from dwm, a community which actively condemns OOP.

>null terminated strings
>null pointers
Otherwise you have to have a huge performance and linguistic overhead for all VALUES, and someone would beg for pointers to have the same protection fucking the whole thing up. It would have to occur for all values because any memory access could occur inside of an array. You either take a hit on all pointers, or have no pointers.

If you don't have them: There is _no_ means to indicate an invalid character. There is _no_ means to indicate an invalid pointer. You lose nearly 100% implicit safety and have to do 100% explicit safety in terms of runtime and compile-time complexity.

>undefined behavior
If you understand computation, you'd know why the behavior is necessarily undefined. The only complaint is maybe comparing ints to chars, but that's the only case you can make.

>horrible declarations
Sure, left to right would be more sensible, but there is simple ordering. Most languages require 8 layers of abstraction, shitty performance, or lack safety or predictable behavior if they don't have the explicitness of C's declarations. Ironically C's typing is one of the worst things to insult. You're the reason why there's an even/odd package (with updates) for Java or whatever.

>experts say it's bad therefore it's bad
Experts say it's good therefore it's good too, right?

I'm not seeing the OO part

i don't mean to continue the trend of bullying them, but why are c programmers scared of enums
i genuinely don't know
i have literally seen them used once, and even then magic value macros were still also used
it baffles and confuses me
is there some reason for it?

someone in another thread mentioned there was considerable amounts of FUD and superstition surrounding C development practices
is that all it is?

Friendly reminder AppImages are the best way to distribute your programs across all Linux distributions.

Attached: AppImages rock.jpg (868x6656, 724K)

>thinks that's bad
You've never had to look at the code OOPsies write, have you? That code can all be in separate header files, and although it's not as easily maintainable, you can stuff all those functions into structures relevant to each struct (I'm writing scripts to make it easier for myself).

Meanwhile in C++ you can get 150k line class files, with multiple inheritance and inferred shit that takes 20 years to compile. Pic related, good abstraction.

Attached: cards2.png (2560x1400, 156K)

we know, appimage has even been approved by Linus himself.

Attached: linusappimage.png (509x430, 37K)

extern structure const char * const
More to fix

I've found limited uses for enums. In performance code, it's better to use definitions for masks vs enums.

Attached: cards.png (2560x1400, 201K)

Reminder that ordering of bitfields in C should be defined in the standard

I have to think about this. The structure is not a pointer and its members are constants. Declarations are a bitch.

Attached: 5CTj5co.gif (640x360, 521K)

Your entire post shows you can't distinguish the concepts of a programming language and its implementation.

>Otherwise you have to have a huge performance and linguistic overhead for all VALUES, and someone would beg for pointers to have the same protection fucking the whole thing up. It would have to occur for all values because any memory access could occur inside of an array. You either take a hit on all pointers, or have no pointers.
Complete non-sequitur.

>If you don't have them: There is _no_ means to indicate an invalid character. There is _no_ means to indicate an invalid pointer.
Maybe/Either types can solve this at the type system level.
>You lose nearly 100% implicit safety
There was none to begin with.
>and have to do 100% explicit safety in terms of runtime and compile-time complexity.
It can require more compile-time complexity, and that's a fair price to pay for more safety.
As for run-time complexity, it depends on the domain, of course.

>If you understand computation, you'd know why the behavior is necessarily undefined.
There is absolutely nothing about "understanding computation" implying that some behavior should be undefined.

>Sure, left to right would be more sensible, but there is simple ordering. Most languages require 8 layers of abstraction, shitty performance, or lack safety or predictable behavior if they don't have the explicitness of C's declarations. Ironically C's typing is one of the worst things to insult. You're the reason why there's an even/odd package (with updates) for Java or whatever.
These are the ramblings of a madman. I can't possibly conceive defending shit declaration syntax this vehemently and throwing unrelated spergings.

>Experts say it's good therefore it's good too, right?
Not necessarily, but that's to point out that they are not memes randomly found on Twitter.

The sysv ABI does this and that's good enough for me.

enums aren't that uncommon in C code. If you're talking about the crap that looks like this, it's usually in system headers where it's required that it also be a macro, so that you can #ifdef.

enum {
	THING = 1,
#define THING THING
	/* ... */
};

By making it an enum too you can have meaningful values in a debugger.

trump has age induced dementia aka alzheimers so i doubt it

I share his opinion; when I used one of their applications it really did "just work".

Jesus Christ, Suckass faggots are beyond salvation.

trump can't even handle 140 characters, let's not even talk about coding.

Attached: trump_struggling.jpg (1200x1200, 64K)

that doesn't explain why it seems literally all C programmers prefer to pass magic value macros to ints for configuration
it makes function parameters extremely unclear
that's not a strawman, it happens in zlib, GLFW, lzma, opengl, vulkan (whose C++ header undefines them and uses type safe enum classes) and probably many others
there has to be a reason for it - especially considering vulkan, which is well made, undefines them and replaces them
my gut tells me it's something to do with portability

i'm talking about using them for config settings

Good. Lines should be 80 characters at most anyway.

C enums suck: they offer no type safety, and all enum member names share the same global namespace, so you end up with ENUM_PREFIX_ENUM_VALUE crap.
It's just retarded.

>vulkan, which is well made, undefines them and replaces them
*in its C++ header
does the C standard not deal with some element of enums that the C++ standard does?

>i'm talking about using them for config settings
That's because of autotools. You can pass -DDEFINE_THIS_THING to the compiler to change the way #ifdefs get handled. You can't do that with enums that I know of. With enums you would have to actually change the sources, which would be much more bug-prone than using -D switches.

>checking if every value is legal to read or write is a non-sequitur
You actually are that retarded aren't you?

>maybe/either types
Literally having to tack on a bunch of baggage to your language instead of just knowing whether you've reached the end of a string or set of pointers is "good".

>no implicit safety
I can stop reading a string if it's null. Literally every single time. I can stop reading a set of pointers if it's null or I exceed the set. This covers literally 100% of use cases of groups if you're not a brainlet.

>gibberish
>more gibberish
>not memes found on twitter
Except they are? The guy who invented NULL doesn't realize how fantastic and natural of an extension to null terminated strings it was. John Carmack burned millions on his own failed projects because he didn't understand the problems of scaling programming. Linus might be an asshole but he at least kept his head on shoulders and out of his asshole until last year. Without C we end up with unfinished projects, version-specific platform-specific toolchains, featureless programs, and "it'll be better the next version" treadmill that reinvents shit people have done for decades instead of moving forward.

Just get out of programming. It's not for you and you're dragging everyone else down.

you seem to be mistaken, i am well aware of using the preprocessor in combination with build tools for configuration, i don't mean using them for compile time configs
they're always used for runtime configuration, as function parameters, never involved with #ifdefs, and only used as magic numbers
like GLFW_KEY_ESCAPE, or Z_FINISH

>By making it an enum too you can have meaningful values in a debugger.
This is the real reason to use enums in C: if you inspect a variable of an enum type in gdb you'll get back the symbol; if it's an int you'll get a number.

Is there any reason to use include guards instead of #pragma once

Oh man. The company I worked with made a single codebase that ran through 10+ compilers, thousands of options, hundreds of configurations.

We had custom tools for it, because there was one file with more than 3000 define/undef lines, at least 2/3rds were relevant, and 300+ configurations. There was a table "stored" in the comments, and it had Y/N/C(ustomer)/V(business initial) to determine whether or not stuff was turned on/off for certain builds.

Additionally, V or C was determined at compile time by doing a build or build.companyname, to tweak things on/off, like if we're using a different compiler for them or enabling certain debug paths for ourselves.

It was ludicrous, because once options were turned on/off, we'd run scripts over the code so the headers only included functions and data which were needed. Saved a lot of compiler headache (not all optimize the same way), compiled faster (only needed code was passed), and we could see directly which things were/weren't being passed to the compiler. It got even messier because some data was moved into different storage for performance reasons, and some data may or not be accessible to multiple threads (code was strictly reviewed to prevent collisions and stuff).

Fun fun.

oy vey, i-it's not oop because it doesn't have language support!

#pragma is a non-standard directive and some compilers don't support it.

>You actually are that retarded aren't you?
You don't know what a non-sequitur is, you tell me.
>Literally having to tack on a bunch of baggage to your language instead of just knowing whether you've reached the end of a string or set of pointers is "good".
It is. It can prevent many subtle errors at compile time.
>I can stop reading a string if it's null. Literally every single time. I can stop reading a set of pointers if it's null or I exceed the set. This covers literally 100% of uses cases of groups if you're not a brainlet.
Except you can't in general. You are implicitly assuming that you will encounter a null value before exceeding your buffer. What you are describing is literally the source of most security issues caused by C.
>gibberish
Sorry you can't understand English.
>Except they are? The guy who invented NULL doesn't realize how fantastic and natural of an extension to null terminated strings it was.
>potential segfaults
>fantastic and natural
Holy shit.
>John Carmack burned millions on his own failed projects because he didn't understand the problems of scaling programming. Linus might be an asshole but he at least kept his head on shoulders and out of his asshole until last year.
I don't see how these statements are relevant.
>Without C we end up with unfinished projects, version-specific platform-specific toolchains, featureless programs, and "it'll be better the next version" treadmill that reinvents shit people have done for decades instead of moving forward.
Ah, yes, we've finally come to the moment where I can write
>source: my ass

it's a nonstandard compiler extension, ubiquitous though it may be
and knowing whether or not a header is present has plenty of use cases

>Without C we end up with unfinished projects, version-specific platform-specific toolchains, featureless programs, and "it'll be better the next version" treadmill that reinvents shit people have done for decades instead of moving forward.

Attached: 1519543507942.png (1280x835, 6K)

no, only ancient and non-standards compliant compilers don't support it.

it sounds completely idiotic, desu ne

Programming is a terrible profession. For one, you are not respected at any normal company. You're known as 'IT' or 'tech' and everyone gossips about how disgusting everyone is in the department. Every day, some Alpha who probably does nothing at his job except get his secretary to suck his dick in his corner office cucks you into doing random bitch work, and then yells at you when it's not done by the deadline despite it being impossible to complete the work requested. Not only that, the code rarely works, your co-workers are smelly Indians brought in by Tata Consultancy and connive to replace your job at all times for half the salary, and the work never, ever ends. You pollute your body with sugar and toxins, deprive it of sleep, and let it rot while you sit the majority of the day, neglecting any healthy exercise, social interaction or life goal attainment. It's like a Postal worker, but coupled with feelings of patheticness, loneliness, helplessness, rage and total hopelessness.

Women, when they hear you are a programmer, instantly remove you from the potential pool of mates as they know your earning potential is maxed early and your career over at 35. They are also instantly disgusted by you. It is far better to tell a woman you are on welfare than to out yourself as a computer programmer. It's also highly embarrassing for a woman to date or be married to a programmer, as virtually everyone knows they are the grown up version of the hopeless virgin in high school. One who never really grew up and became normal and fit into society, but rather found an environment where he could escape the reality of his situation and be invisible, able to hide the toxic shame and utter humiliation that is the programmer.

After Dentists, programmers have the highest rates of mental disorders, especially depression and suicide.

Programmers, why haven't you taken the cyanide pill?

Enums are only really useful for things which A) don't need order, B) don't need explicit values and C) are useful as a type.

If you don't need them, you probably shouldn't be using them. I really like them as error codes now, and for my deck of cards, I used enum index = value, so I could explicitly map card values to strings, as well as have an easy way of doing guards.

Oh, I got it. You said 'config' and my mind went straight to './configure' lol. The GNU coding standard recommends enums in place of magic numbers, something about the way the compiler handles them. I use them all over the place, it makes reading the code much easier for others.

>abstraction is OOP
You need at least a vtable before it qualifies as OOP.

...if the header is absent your compiler throws an error. Even if it reported a shitty error, you could just dump all the defines from the compiler table and know whether or not it's there because it would have failed right at the #include header step.

I think you're agreeing with me. The suckless.org site has a lot of "C99" suggestions that are basically "program like you're in C89".

It worked fine, great actually. Company had been around for 30 years, and the code was used in shit like end point terminals, satellites, VoIP infrastructure, sound-recognition for criminal investigation. I actually wrote new tools to make the table more accessible. Literally the old way they changed an option for a configuration was MANUALLY. They had to find the column and row, highlight using editor-specific commands, go down/over, and change it manually. It was asinine. I was pretty proud of my tooling, especially when I had it even include the .company releases, since they were implicitly defined (as extensions of normal releases). I even wrote basic cross-correlators at the end, so if one option was on/off, what other options were most likely on/off.

I'm about to 10x my cpp skill.

Attached: fpcpp.png (1920x1055, 116K)

so GNU recommends using enums instead of magic numbers
they do so because of the obvious code readability reasons and because compilers, one of which they wrote and maintain, handle them in a way that is preferable to magic numbers
and yet magic numbers are everywhere while enums are rare as fuck in many well regarded well supported C libraries

i am left with more questions than answers

how do you propose enabling certain features when a certain header is present

No you don't.

lmao, you actually sound proud of working with that level of shitcode.
It will (rightfully) get scrapped after the last boomer that knows how it works dies.

You couldn't be more wrong.

What the shit is going on? I have a function with a couple of if and else if in javascript like so:
if (condition) {
	console.log("1 fired");
	inject(file, element);
} else if (condition) {
	console.log("2 fired");
	inject(file, element);
} else if (condition) {
	console.log("3 fired");
	inject(file, element);
} else if (condition) {
	console.log("4 fired");
	inject(file, element);
}
it works just fine for 1, 2 and 3 and the inject function is successful, but for whatever reason the inject function fails on 4. It logs "4 fired" but inject() returns failed. Even when I use the exact same code from either 1, 2 or 3 number 4 just fails. Am I retarded?

Explain your reasoning.

god that's hideous

Kek, I thought the same thing. He sounds like those boomers who pride themselves on wageslaving for 50 years at some company.

Functions operating on a data type is basic data abstraction, not OOP. In OOP, the object itself determines which code to run when a method is invoked on it.

Use a debugger you dork.
>Even when I use the exact same code from either 1, 2 or 3 number 4 just fails
what is that supposed to mean exactly? "exact same code" = same inject call?
I mean, how else would you handle this?

>i am left with more questions than answers
I suspect many programmers are socially inept, and an important part of socializing with your peers is communication. The fact that enums communicate a much clearer message to other developers is lost on the average programmer, who is likely to name global variables things like 'eqr' or 'si' and say anti-social nonsense like 'my code is self explanatory.' I'm willing to bet if a study were performed we would find that socially adept programmers use enums and comments more often, and give global variables more appropriate names.

This is all speculation, but judging by the attitudes of the average c programmer on Jow Forums it doesn't seem far-fetched to me.

that's not oop and not even abstract data types, just plain data types.

Attached: pug.jpg (400x800, 246K)

Firefox gave me nothing, but I just tried it with chrome and it immediately gave me an actual useful error message. Before number 4 fires it changes the url (pushState) and inject() tried looking for the file in the wrong folder because of it.
Yes, I'm completely shit at this and yes it's for a school project.

hideous from an aesthetic perspective

>programming general
Can I lift your skirt and fuck you in the boipussy?

>when a certain header is present
A define which includes or doesn't include the header. At work we'd have options lines, and although they didn't use SDL, you could have a config with SDL N and SFML Y; USE_SDL would get undef'd, USE_SFML would get def'd, and in your relevant files they'd be in the ifdef'd blocks.

I was getting a $50k salary before I graduated my Bachelor's, and was doing exactly stuff I loved doing in and around a code base which required a lot of skill and knowledge. They normally give $10-15k raise after the first year.

Switch case. Perfect reason for it. Use an enum too; switches are optimized for starting from 0, and an enum would let you set flag values, since it seems relevant that all your conditions are related and pass through the same path.

Looks cool.
Can you share the pdf?

How do I get the data from the clipboard in Qt that tells me if files are copied or cut? I can only manage to get the urls of files.

scanlibs.com/functional-programming-c-improve-techniques/

the /dpt/ type system prevents homosexuality

Forgot to add, our makefile included ifdef'd sections, and a script would run over the config file, so files were linked/etc whether or not they were turned on/off

Perl > sh

trump > hitler

I mean, Firefox still has the step by step debugger. You should notice that change if you're using it.
eh, maybe. I could imagine something like using a (condition, function)-map and while that may be more 'aesthetic', it just makes the program harder to read imo.
I don't understand. Are you saying he should evaluate the conditions first to determine the correct enum value and then go into a switch? Because I really wouldn't consider that better than using an if else ladder.
And if you don't mean that: How else are you going to use a switch statement to evaluate multiple expressions? Is this some black magic JavaScript fuckery I don't know about?

Can anyone help me out with this problem?
leetcode.com/problems/k-closest-points-to-origin/
I've already finished the algorithm, stored the points in the array and can print out the correct return values within the function. But I just don't understand what I'm supposed to return besides the 2d array itself.

You probably have to print it the right way. It tells you how to print.

That's a clear case of emulating OOP with a language that doesn't support it out of the box.
Also, OOP doesn't imply "abstract data types".

what is the OOP part? just having types and functions that operate on them?

>In OOP, the object itself determines which code to run when a method is invoked on it.
No, that's a common restriction used to shit on it.
The moment you associate data with the functions that access it, you have OOP.

>The moment you associate data with the functions that access it, you have OOP.
Is this bait?

From en.wikipedia.org/wiki/Object-oriented_programming
>Object-oriented programming (OOP) is a programming paradigm based on the concept of "objects", which may contain data, in the form of fields, often known as attributes; and code, in the form of procedures, often known as methods. A feature of objects is that an object's procedures can access and often modify the data fields of the object with which they are associated (objects have a notion of "this" or "self").
OOP doesn't imply vtables, inheritance or any other fancy shit.

When the condition may occur, that's when you set the condition variable. You switch the variable elsewhere. An enum would provide a start-at-0, would "type" your conditions (since they should be related, right?) and would be more efficient since a switch can often be optimized to a jump.

If-else should be related to specific, and often functional conditions. If all your condition is, is an arbitrary value for code to take, set the proper condition and let somewhere else follow the numbers. If you need to read the conditions first, then do that to set the case value, then do the jump. It's much, much cleaner and more consistent than using a huge if-else block with actions intermingled with value.

if-else is okay for small cases, but if you're ramming through a loop or planning to extend the conditions (or even have more than 3-4), a switch block is better. Once you understand switch blocks, you start thinking "is this something I can set now and act on later in a more complicated way?" which is probably "yes".

Pic related, don't be this guy.

Attached: brainlets.png (828x801, 149K)

This isn't saying that everything that could be partially put that way is OOP. for example isn't OOP

OO does imply inheritance models. A minimal model is a function declared inside a structure that only operates within that namespace, but in actual paradigms, yes, an object-oriented language supports interfaces, inheritance models (at least parent-child), method overriding, etc. I don't speak French just because I can say yes and no in it.

>OO does imply inheritance models.
It doesn't.

Do you just shitpost and don't program?

>OOP doesn't imply vtabes, inheritance or any other fancy shit.
It implies that objects dispatch on their own methods. vtables are just a common implementation strategy.

>which may contain data, in the form of fields, often known as attributes; and code, in the form of procedures, often known as methods

please take a look again at , the operations are not part of the objects at all.

How did you come to this conclusion?

The operations are tightly related to (and modify) the object passed as the first parameter, and it becomes clear when you look at the actual implementation.
The "are not part of the objects at all" because, syntactically, C doesn't have member functions. From a design point of view, however, they very much are.
This is a clear case of emulating OOP in C. The Linux kernel itself is full of it. Hell, they are even pre_fixing the functions to indicate how closely related to the data structures/namespaces they are.

I'll just leave this here
p.Jow Forums.org/g/

Reputation and signature system when?

i think it's just the return type which should have those items
kClosest n = fmap fst . (take n) . (sortOn snd) . map (\x -> (x, (fst x) ^ 2 + (snd x) ^ 2))

Lisp is the most powerful programming language.

Lisp is the most lost post host worst frost

Should you do every single exercise in SICP?
[spoiler]I admit I skipped a couple.[/spoiler]

There's no medal for that, and I doubt your future employer will give a fuck about you wasting your time on an outdated textbook focused on a dead language. You might as well read Fortran manuals from 1954.

Not him but what is wrong with Fortran?

>caring about what your employer thinks about you
>outdated textbook
>dead language
Have we found the epitome of a wagecuck?

I mean, yeah, if your condition is simple, needs to be checked often relative to how often it's updated, or you don't need lazy condition evaluation, it definitely makes sense to use a switch, and the example in your pic is obviously fucked (especially the abuse of string values), but I don't feel like there's enough information to make that judgement. More complex conditions aren't that rare after all.

Absolutely nothing, and in fact, it's still the king when it comes to scientific computing.
Leave him to his fantasy world where the only relevant languages are whatever memes he uses.