/dpt/ - Daily Programming Thread

Terry Davis edition.

Previous thread: What are you working on, Jow Forums?

Attached: 1502049818561.jpg (480x640, 111K)

Other urls found in this thread:

notabug.org/koz.ross/awesome-c
github.com/vinta/awesome-python

TempleOS is shit.
HolyC is shit.

Degenerate.

Third for why are C function names so vague

terry literally fucks junkie hookers.
take your schizo obsession to

>fopen
>SDL_CreateWindow
>memcpy
gee I wonder what these do

You are literally glowing in the dark, and you know that.

Attached: they_glow_in_the_dark.jpg (474x400, 37K)

>itoa()
Gee I wonder what this does

>force open
why did they include a Star Wars reference in the language?

Is it considered bad form to obfuscate initialisation in libraries?
Something like
static int initialised = 0;

void draw_sprite(vec2 position) { /* vec2 stands in for whatever the real type is */
    if (!initialised) {
        compile_shaders();
        initialised = 1; /* otherwise the shaders recompile on every call */
    }
    /* draw stuff */
}

Attached: DVczTmYX0AEG3d1.jpg (480x417, 49K)

>fcntl
>atoi
>cacoshl
>atoll
>brk
>cbrtf
>csinl
>hypot
>munmap
>strfry
>strtouq
>btowc
...etc

A code monkey is usually envious of a person with an academic degree. A code monkey claims to be able to "write code" (whatever that means) better than a professional with a degree, yet their code is completely worthless, because it is not built using theoretical frameworks which are taught to someone with a degree. A person with a degree is able to write highly optimized, concise, and demonstrably efficient code, whereas a code monkey usually writes nonsense code which routinely crashes, leads to inefficient use of processor and memory resources, and is overall worthless in a global, competitive market.
This is why nobody will hire you if you don't have a degree. It isn't because you're some misunderstood genius who learned to code on their own, skipping years of academic study. It is because your code is worthless.
Still think you're some misunderstood genius, and not a mere buffoon? Start a firm, and see how well that goes for you. Chances are, you will make a fool of yourself within the first days of trying to create anything of remote value.

int input;

printf("Enter a number: ");
scanf("%i", input);
printf("%i\n", input);


I'm learning C. Why doesn't this work?

Looks fine. How isn't it working?

You need to pass a pointer to scanf (&input)

You need a main function, and use %d

based śóýbóí

Attached: boomer.jpg (1920x1920, 1.43M)

lol

Fucking horrible. Do it in an init function.
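Something like this, as a minimal sketch (sprite_lib_init and vec2 are made-up names):

void sprite_lib_init(void) { /* the caller calls this once, up front */
    compile_shaders();
}

void draw_sprite(vec2 position) {
    /* draw stuff: no hidden init check, no per-call branch */
}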

the absolute state of Jow Forums

says that input is being used without being initialized, even though I'm already using it. This thing works with a char array

#include <stdio.h>

void Reeee()
{
    int input;

    printf("Enter a number: ");
    scanf("%i", input);
    printf("%i\n", input);
}

void main()
{
    printf("Hello world!\n");
    Reeee();
}

...

Because they're made so regular people can write them too, just like the commands - grep, sed, cd, tar, awk. It's also easier to read, if your eyes don't have to skip over lots of garbage.
For instance, allocatemem or memoryallocate are read out in your head as 4 and 6 syllables respectively, while malloc is read out as two. This decreases the cognitive load.
Notice how even if they are heavily abbreviated, they're not filled with consonant clusters.
It does nothing, since it's not a C function, atoi is.

All the XtoY functions convert something, so you know it converts "a" to "i".
Of course, it must convert avocados to Indian Rupees.
>fcntl
Not C. cntl/ctrl/ctl is widely used to mean "control", "file control" isn't obtuse.
>cacoshl
>cbrtf
>csinl
>hypot
That's how any calculator would write it. What else would you call cbrtf? cuberoot_float?
>atoi
>atoll
>strtouq
>btowc
Also follow a completely reasonable pattern, see above.
>brk
Not C. Would it be any clearer if named break?
>munmap
Most people already know mmap. How else? unmmap?
No, then it segments wrongly - what does "unm" mean?
>str___
Also a perfectly reasonable pattern. You know it handles a string, you know it converts a string to "uq", u usually means unsigned, q must be quad.
Also, neither strfry nor strtouq are standard C.

Attached: fuck the haters.jpg (1280x720, 651K)

#include <stdio.h>

void Reeee()
{
    int input;

    printf("Enter a number: ");
    scanf("%d", &input);
    printf("%d\n", input);
}

void main()
{
    printf("Hello world!\n");
    Reeee();
}

%i is valid C.

Isn't %d explicitly decimal while %i can be octal or whatever?

What the fuck? previous is still ~230 you autist

%d and %i are the same

Why are you using a 46-year-old language? Do you hate productivity?
Is using a language without basic features the only way you can feel smarter? If you were so smart you would use a language that gets shit done without wasting your time implementing basic shit again and again

Attached: dpt_hime2.png (581x1133, 616K)

Sorry, my bad, you're right: %d is always decimal and %i can be decimal, hex, or octal
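A quick sketch of that difference on the scanf side, if anyone wants to check it themselves:

#include <stdio.h>

int main(void) {
    int a = 0, b = 0;
    sscanf("0x10", "%i", &a); /* %i honors the 0x prefix: a == 16 */
    sscanf("0x10", "%d", &b); /* %d is base 10 only: b == 0, stops at 'x' */
    printf("%d %d\n", a, b);  /* prints "16 0" */
    return 0;
}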

Although I do agree, cacoshl is bad. It should be, standards permitting, acoshfc, or else acoshcf.
But the other ones are flawless.
No difference.
d, i    The int argument is converted to signed decimal notation. The precision, if any, gives the minimum number of digits that must appear; if the converted value requires fewer digits, it is padded on the left with zeros. The default precision is 1. When 0 is printed with an explicit precision 0, the output is empty.

Although %d is more common.

holy shit, try harder

ok now with the & it works. But what does it mean? what does & do?

Fuck off, pedo.

Attached: 1500313615396.jpg (1469x1102, 320K)

Learn about pointers. & gets the address of the variable as a pointer.

That function expects an address so it can modify the contents, since C doesn't have pass-by-reference. An integer can look like a valid address, so it was likely segfaulting by writing to some random address. &var takes the address of a variable
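A minimal sketch of the pattern (set_to_42 is a made-up name):

#include <stdio.h>

/* C passes everything by value, so to let a function modify the
 * caller's variable you hand it the variable's address instead */
void set_to_42(int *out)
{
    *out = 42;             /* write through the pointer */
}

int main(void)
{
    int x = 0;
    set_to_42(&x);         /* &x is the address of x */
    printf("%d\n", x);     /* prints 42 */
    return 0;
}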

I like how in the 3rd one he jumps from something as controversial as guns to compilers and code

Nothing wrong with it. There are more C libraries than there are for other languages.
C99 says they are the same. Say you have the following code:
int input = 0;
printf("Enter a number: ");
scanf("%d", input);

Then it is equivalent to the following code:
int input = 0;
printf("Enter a number: ");
scanf("%d", 0);

How does scanf know where to put it? It tries to put it at 0, not at input's location.
If you put an & before input, you get the address of input, in other words a pointer to input.

ack thanks

>C99 says they are the same
They're the same for printf but not for scanf

I didn't know C couldn't pass by reference. I guess & is the equivalent of ref in C#

Attached: 1510159397463.jpg (657x387, 39K)

>There are more C libraries than there are for other languages.
false

the absolute state of /dpt/

pajeet brainlets please leave this thread

>jumping from guns to code
But cryptography code is literally munitions, user, why do you need to jump?
Ah, I see
It's existed for 46 years. What other language would have more?

Attached: 1520426102927.jpg (499x574, 57K)

i-i'm a Spaniard. The change from C# to C is hard

how bad is an executable stack?

In a sense, C automatically has fewer libraries than any other language, since C is the only language that can make libraries for other languages: every other language gets its own libraries plus C's.

>What other language would have more?
JavaScript or Python

It's not hard, just different.

Oh look, the C master, everyone! He has never made any mistake or assumed something wrong whatsoever

Attached: 15262997830891.png (657x539, 110K)

>Javascript
>libraries
>is-odd
That's cheating

Considering how many obscure C libraries there are floating around on various forums without any proper packaging, vs. npm or pip where everything is neatly documented, I think C will be underrepresented.
Also, there is nonsense like left-pad for JS. I've never seen any C libraries of that "caliber".
C has 19 years on Python; there's no chance Python can make up that head start.

The best way to gauge it is probably to name various features you want, and then see which language has the most appropriate libraries.

Compare notabug.org/koz.ross/awesome-c and github.com/vinta/awesome-python for instance. About as long, but the C one is mostly libraries while the Python one has all sorts of stuff thrown in.

Attached: dpt2.png (905x629, 1006K)

Makes me have a big thonk: has anyone made a plug-in for a C IDE that downloads a package to a standard place and, when you go to use it, shows a box containing the include path, standard link options, shortcuts to open functions/headers, etc.?

The thing is, C can be used for pretty much EVERYTHING, so although it may help improve cross-platform support, use cases can be very simple and small. I mean like SDL2: I download the packages from the web page or the package manager and pull up the doc pages. Nothing much more complicated than deciding whether to include them in my current project vs. system-wide, and setting the link/include paths.

>package manager for C
fucking hell man
you make a folder named include, you put the 'packages' in it, then you put the directories in the makefile
why make it so hard?
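A minimal sketch of that setup (the lib directory and the -lfoo library are hypothetical):

CC      = cc
CFLAGS  = -Wall -Iinclude   # headers dropped into ./include
LDFLAGS = -Llib             # prebuilt 'packages' in ./lib, if any
LDLIBS  = -lfoo             # hypothetical library

app: main.o
	$(CC) $(LDFLAGS) -o app main.o $(LDLIBS)

main.o: main.c
	$(CC) $(CFLAGS) -c main.c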

i need some shitty CSS help.
i wanna have a header div and below it a content div that will have a list within a list; the thing is that the list can grow very large, like:

x
  y
  y

I've made the content div overflow: auto, however the list inside does not expand; every list inside the list gets smaller and smaller instead of the entire content div growing to fit it.

me? i write in nano ;)

Also of course there are single-header libraries for the small stuff. Then you just dump it in the root folder and do #include "lib.h" (see the sketch below)
>>>/wdg/
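For the single-header thing, the usual convention looks like this (stb-style; LIB_IMPLEMENTATION and lib.h are placeholder names, the real macro is whatever the header documents):

/* in exactly one .c file, ask the header to emit the function bodies */
#define LIB_IMPLEMENTATION
#include "lib.h"

/* every other .c file just includes it for the declarations */
#include "lib.h"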

...

mg > nano
mg is like emacs but not bloated.

Different build targets, different make systems, different operating systems, etc.

It'd definitely be nice to have a pre-set-up package with compile flags for MS vs Linux, even in the most basic case. Flags are different on MSVC vs GCC, so why should I have to memorize makefiles or other systems if it's supposed to be a standard process?

>make a LinkedIn profile
>only get pajeets spamming my mail with "offers"

how many fucking pajeets are there? I thought this was just a meme

Makefiles are standardized, just avoid the GNU make extensions.

makefiles are standardized, but I've worked in an environment where we used a single makefile on multiple systems with different compilers, and it scanned a header definition file to enable/disable linking and compiler choices and stuff. Super interesting; I didn't code the tool but I may for my own uses.

Start googling for programming things and you will find thousands of "guides" with horrible advice. Check the name, fucking pajeet every fucking time.

Superpower by 2020 fampai. We will own the world in under a decade. And white bitches love us.

How you gonna be a superpower if you all keep dying from E. coli infections?

Attached: dpt1.png (901x631, 981K)

Somebody explain to me how RAII in sepples works, because I can never seem to determine when the fuck my pointers are getting auto-obliterated by stack clearing even though I called new, which I was told is just a replacement for "malloc"
Except if I try to delete new on the stack I get violation errors half the time and half the time it's fine
What the fuck is this shit?

They've infested youtube as well. Most educational stuff on there is in hindi.

I need to do an array of key-value pairs in Java.
What looks better? Are there better ways?

This is one of the few times I actually like JavaScript.

/* Example 1 (assumes: import java.util.HashMap; import java.util.Map;) */
public void doSomething() {
    doSomething(new HashMap<Integer, Integer>() {{
        put(1, 1);
        put(1, 3);
        put(2, 9);
    }});
}

public void doSomething(HashMap<Integer, Integer> map) {
    for (Map.Entry<Integer, Integer> kvp : map.entrySet()) {
        doSomething(kvp.getKey(), kvp.getValue());
    }
}

/* Example 2 */

public void doSomething() {
    doSomething(
        new KeyValuePair<>(1, 1),
        new KeyValuePair<>(1, 3),
        new KeyValuePair<>(2, 9)
    );
}

public void doSomething(KeyValuePair<Integer, Integer>... pairs) {
    for (KeyValuePair<Integer, Integer> kvp : pairs) {
        doSomething(kvp.key, kvp.value);
    }
}

public static class KeyValuePair<K, V> {
    public final K key;
    public final V value;

    public KeyValuePair(K key, V value) {
        this.key = key;
        this.value = value;
    }
}

RAII is literally only used by retards who don't care about how efficient their code is, and don't want to have to even consider, in the abstract, what their program is doing with memory. Zero cost abstractions are anything but, in practice.

malloc allocates on the heap; the pointer itself is on the stack. malloc doesn't call the constructor, new does. And you have to use delete, which calls the destructor, not free, if you allocated with new.
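A small sepples sketch of exactly that, in case it helps (Thing is a made-up type):

#include <cstdio>
#include <cstdlib>

struct Thing {
    Thing()  { std::puts("ctor"); }
    ~Thing() { std::puts("dtor"); }
};

int main()
{
    Thing *a = new Thing;   // allocates AND runs the constructor
    delete a;               // runs the destructor AND frees

    void *raw = std::malloc(sizeof(Thing)); // raw memory: no ctor runs
    std::free(raw);                         // and no dtor runs here

    {
        Thing b;            // automatic ("stack") object: this is RAII
    }                       // b's destructor runs HERE, at end of scope;
                            // you never call delete on it
    return 0;
}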

Attached: 1525891708311.png (229x220, 6K)

You've just repeated what I know without clarifying anything. Why does new leave things alive half the time and not the other? Is new not actually the same as malloc?

Don't do DBI (double-brace initialization). It may lead to memory leaks.
Every time you do DBI you create another anonymous subclass. Restrict DBI usage to stuff that is done one time, like configuration.

I don't disagree with that, but mixing C memory management with C++ management tends to be a fucking headache, especially if you want to make use of vectors.

>vectors

the absolute state of /dpt/

pajeet go home

It doesn't. Go step by step and determine when the compiler can and will destroy your object for you, REGARDLESS OF WHAT YOU TELL IT OTHERWISE

They're convenient stretchy buffers.
C++ is supposed to be about more convenient C programming.

If I put something on the heap then it shouldn't be deallocated until the program exits or I say so, period. What you're suggesting defeats the point of RAII.

Letting an object with a destructor go out of scope is saying so.

what's wrong with reallocing twice as much when you fill up your buffer, in the very few scenarios where you don't know how big of a buffer you need? (seriously, in 99% of code you should be able to determine a reasonable upper limit for your buffer by thinking about it for a few seconds)
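For reference, the doubling pattern as a minimal sketch (buf_t and buf_push are made-up names):

#include <stdio.h>
#include <stdlib.h>

typedef struct {
    int    *data;
    size_t  len, cap;
} buf_t;

void buf_push(buf_t *b, int v)
{
    if (b->len == b->cap) {        /* full: double the capacity */
        b->cap = b->cap ? b->cap * 2 : 8;
        b->data = realloc(b->data, b->cap * sizeof *b->data);
        if (!b->data) exit(1);     /* out of memory */
    }
    b->data[b->len++] = v;
}

int main(void)
{
    buf_t b = {0};
    for (int i = 0; i < 100; i++)
        buf_push(&b, i);
    printf("%zu items, capacity %zu\n", b.len, b.cap);
    free(b.data);
    return 0;
}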

show us your code so we can prove how retarded you are.

pastebin it.

change
scanf("%i", input);
to
scanf("%d", &input);

Attached: 15305220959860.jpg (560x330, 36K)

Except I put it on the heap with new, that's not on the stack, so that shouldn't happen.
Yet it keeps fucking happening. Why?

A) tedious and a hassle
B) figuring out exactly how much you need *every single time* you need a collection is a waste of time. Stretchy buffers are extremely useful. It's one of the few things sepples did right, and promptly fucked up with its retarded ownership system.

why d instead of i? Isn't i an integer?

None of these issues took more than a minute to solve, and I couldn't post the codebase in any context anyway; it's not some tiny hobby project.
When it happens I immediately know the problem. WHY it happens I don't get, because sometimes it's fine and sometimes it's not despite making the same keyword calls.

(%d)ecimal

>fart
go to the php thread, manya

Attached: WxLpmcNOBLQ.jpg (500x750, 60K)

okay, as long as you don't let your shitty program manually chase pointers for 10 seconds and deallocate all your """"zero cost"""" RAII vectors when I try to close it, I don't give a shit
but if you do, fuck you. if you ever deallocate memory when the user tries to close your app (which RAII inevitably leads to) you're worse than the pajeets
it shouldn't be faster for me to open a console, find the pid, and pkill a fucking program than to let it gracefully exit

You can use %i instead of %d, but I'm used to using %d

Attached: 15305220959861.png (369x500, 159K)

If I wrote a menu library with network awareness in mind, I'd have a setting that gives menus an apply button to commit all changes on the menu, and some menus would be set up to be actively updated by the server's data, like toggle buttons and plain text. (Only text prompts will not be capable of this feature, since I don't want them to update while someone is writing something, but they could be disabled whenever.)

Do you think there are people out there that would want to combine an apply-commit style of menu with a dynamically updating style?

>C#
>most logical language
>most performant language
>most aesthetic language

Is C# literally perfect?

Yes.

It closes instantly. I'm not heaping fuckhuge 800MB chunks or anything, I take very explicit care with those (with malloc/free no less).
It's all the tiny maybe-3KB collections of data points that I can't seem to stop popping off into the aether

>logical
Arguable for OOP yes
>performant
Not even fucking close
>aesthetic
More than Java, but no framework that Microsoft encourages is aesthetic

>most logical
AHAHAHAHAHAHAHA
HAHAHAHAHA

>logical
non sequitur without meaning
>performant
nowhere near the top, gets beaten by the JVM, C, C++, Rust
>aesthetic
non sequitur without meaning

Okay, I should clarify before some semantic dipshit corrects me
I'm heaping huge 800MB chunks but /track them very closely and always free them as soon as I'm done with them/
I'm not loosely allocating shit and hoping RAII fixes it. The opposite is true. RAII is freeing shit I didn't want freed.