/dpt/ — Daily Programming Thread

What are you working on, Jow Forums?

Previous thread:

Attached: dpt.png (787x498, 28K)

JavaScript rocks!

Attached: js-rocks.png (1000x494, 286K)

nth for nim!

I have a bunch of executable binaries which I want to disassemble, writing each one's assembly code to its own file. All the files are named "executableSOMETHING", so that's helpful.

What I tried was:
ls exe* | xargs objdump -d > output.txt

However, that doesn't solve my issue. I'm not great with CLI stuff, so I'd really appreciate it if someone could show me how to pipe the xargs for both objdump and file creation.

still contemplating suicide

Can anyone experienced with bison parsers offer advice?

Don't use bison, it's crap. Nobody uses that crap. Make parser combinators.

If you're very new, like "still getting used to for loops" new, then you shouldn't be worried about it yet. Get the fundamentals down before getting concerned with the details.

That idea is actually one of the most important things in programming: don't jump the gun on things. Make it work, then make it clean, then make it fast. If you do those things out of order, or try to do them all at once, oftentimes your code just won't work at all. At best, it will be a tangled mess that breaks if you so much as look at it the wrong way.

I can't tell you how many times I've learned that lesson the hard way. I'm a bit of a perfectionist, so I have a tendency to overthink things from the get-go and try to do all three at the same time.

no, Lua rocks

Attached: 1489627784278.jpg (573x892, 113K)

>unimeme dash
fuck off

Thanks user, I'll be sure to keep all this in mind.

Recursive descent parsing is the most powerful and simplest parsing method.
Learning anything else is useless.
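To sketch the idea: in recursive descent, each grammar rule becomes one function, and the functions call each other (and themselves) the way the rules do. Here's a minimal example for a hypothetical toy grammar of integer addition and subtraction (not tied to any parser discussed here; `parse`, `expr`, and `term` are made-up names):

```c
#include <assert.h>
#include <ctype.h>

/* Toy grammar:
 *   expr := term (('+' | '-') term)*
 *   term := [0-9]+
 * One function per nonterminal. */
static const char *p; /* cursor into the input string */

static long term(void)
{
    long v = 0;
    assert(isdigit((unsigned char)*p)); /* a real parser would report an error */
    while (isdigit((unsigned char)*p))
        v = v * 10 + (*p++ - '0');
    return v;
}

static long expr(void)
{
    long v = term();
    while (*p == '+' || *p == '-')
        v = (*p++ == '+') ? v + term() : v - term();
    return v;
}

long parse(const char *s)
{
    p = s;
    return expr();
}
```

A real parser would build a syntax tree instead of evaluating on the fly, but the structure is the same.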

for file in exe*; do
    objdump -d "${file}" > "${file}.disasm"
done

recursion is the most powerful tool we have period

too bad our computers are too garbage to implement it properly lol

Reminder to say
>in layman's terms....
>to put it simply...
when you don't know what the fuck you're talking about but want to appear knowledgeable

>To put it simply, quantum bits have 2 states simultaneously, kind of like Schroedinger's cat

Attached: 1539464759703.jpg (838x638, 138K)

>>To put it simply, quantum bits have 2 states simultaneously, kind of like Schroedinger's cat
What's wrong with this?

>unimeme dash
>fuck off
ASCII doesn't have a dash.

Ok?

How are you supposed to write a dash without Unicode?
Should OP use a different character, e.g.
/dpt/ @ Daily Programming Thread?

He should use the ASCII symbol that looks like a dash, is commonly accepted as a dash, and is the de facto standard dash when a dash is needed but memes aren't.
That is, ASCII character '-' (0x2d)

>'-'
That's a hyphen and only molesters of the arts accept it as a dash

Attached: 2018-12-20_14-57.png (945x622, 6K)

Most of your programming is implicitly copying ideas of what other people have already done. 99% of the time you are using known design patterns whether you realize it or not.

C:
#include <glib.h>

gchar *invert_case(const gchar *s)
{
    glong len, i;
    gchar **inverted;
    gchar *out, *current;
    gunichar c;

    if (!g_utf8_validate(s, -1, NULL)) return NULL;
    len = g_utf8_strlen(s, -1);
    inverted = g_malloc0_n(len + 1, sizeof (*inverted));

    for (i = 0; *s; i++) {
        c = g_utf8_get_char(s);
        current = g_ucs4_to_utf8(&c, 1, NULL, NULL, NULL);

        inverted[i] = g_unichar_isupper(c)
            ? g_utf8_strdown(current, -1)
            : g_utf8_strup(current, -1);

        g_free(current);
        s = g_utf8_next_char(s);
    }

    out = g_strjoinv("", inverted);

    for (i = 0; i < len; i++) g_free(inverted[i]);
    g_free(inverted);
    return out;
}

Python:
def invert_case(s): return s.swapcase()

Ah yes, the C tards

Attached: C grugs.png (1200x699, 345K)

Go meme somewhere else.

>using glib
>ever

>autism
LOOL let's see your implementation of a portable unicode library in C

>he says while interpreting his code with CPython

Why would I use unicode?

I don't see why a NEET should use it, so idk.

I'm not NEET. The world uses English as the universal language. There is no point to unicode.

can't write swastikas without unicode

sweet. for some reason I thought I'd have to write this into a file and not directly in the terminal. thanks a bunch!

>Why would I use unicode?
Your input might contain a dash — proper “quotation marks”, phonetic characters, e.g. /dʒJf/ (the correct pronunciation of “GIF”), or mathematical symbols, e.g. y ≤ ⅓ • x7 ∀ x ∈ C

You live in a world where you don't understand 4th grade maths, so maybe there's no point to unicode for you, but people with three-digit IQs need it.

Attached: degrees-radians-formula.png (1280x720, 10K)

>u don't need strings

Attached: 1544729016111.png (1098x1126, 492K)

That pic is golden. Saved.

>J
Why does Jow Forums have these filters on IPA characters anyway?

Remind me again why I would be manipulating strings with symbols like that?

You don't need to know about unicode to handle it in C when passing it along to a graphics library to display it. There is no point whatever to know about it. If you need to parse it use a parser that returns better symbols.

Which was compiled with a compiler written in C++.

A 2D isometric dungeon crawler RPG written from scratch in C++. I've figured the best way to learn programming is by jumping straight into the raging sea. Sink or swim.

A few weeks in, I seem to be able to swim so far: I've already made most of the menus, the combat and inventory functions, and items and their interactions. The combat system is basically fully functional, and I hope to get the rest of the basic systems done by the end of January. Then the real trial begins: getting the 2D shit done so it looks at least slightly like an actual game and not some text-based Commodore game. And if I manage to do that without sinking, I'll have to optimize it for performance. My goal for the next year is to reach Pajeet-tier programming skill. Should be doable in a year, I hope.

fuck strings and fuck white people

>Remind me again why
That assumes your NEET-self experienced the need for unicode in the first place.

struct {
    size_t length;
    char *str;
} string;

Wow! I have strings now equivalent to modern languages!!!

>no unicode adapter
AHAHAHAHAHAHAHAHAH

>keeps falling back to personal insults and projections
filtered since you have no real argument.

wrong

>char *str;
That's retarded. Use a flexible array member instead.

typedef struct {
    size_t len;
    char str[];
} string;
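For reference, a minimal sketch of how such a flexible-array-member string might be allocated (string_new is a hypothetical helper, not from any real library): the header and the bytes live in one allocation, which is exactly the part that's easy to get wrong.

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    size_t len;
    char str[];     /* flexible array member, C99 */
} string;

/* One malloc covers the header, the bytes, and the NUL terminator. */
string *string_new(const char *src)
{
    size_t len = strlen(src);
    string *s = malloc(sizeof *s + len + 1);
    if (!s) return NULL;
    s->len = len;
    memcpy(s->str, src, len + 1); /* +1 copies the terminator too */
    return s;
}
```

Freeing is a single free(s), since there's only one allocation, which is the main advantage over the pointer version.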

>2018: a c programmer recognizes the value of an abstraction (colorized)

lmao

Use mbs functions.
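For example, counting code points with the standard mbrtowc() machinery. This is only a sketch (count_codepoints is a made-up name), and for multibyte input it depends on the current locale being UTF-8:

```c
#include <stdlib.h>
#include <wchar.h>

/* Count the code points in a multibyte string by decoding it one
 * character at a time with mbrtowc().  Returns (size_t)-1 on an
 * invalid or truncated sequence. */
size_t count_codepoints(const char *s)
{
    mbstate_t st = {0};
    size_t n = 0, len;
    wchar_t wc;

    while ((len = mbrtowc(&wc, s, MB_CUR_MAX, &st)) != 0) {
        if (len == (size_t)-1 || len == (size_t)-2)
            return (size_t)-1; /* bad sequence */
        s += len;
        n++;
    }
    return n;
}
```

Call setlocale(LC_ALL, "") first if you want it to honor the user's UTF-8 locale; in the default "C" locale it only handles single-byte characters.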

Arguing with someone who hasn't had to parse strings in his life is not a good use of my time. Insulting NEETs like you strokes my ego, and it's a lot less work at the same time.

Use void.h at this point.

That's retarded. C should be more like Java and use UTF-16.
typedef struct {
    size_t len;
    size_t hash;
    uint16_t str[];
} string;

?

>utf16
Oh God, the horror. But unironically, use wchars for that.

utf8 > utf16

Python:
def invert_case(s): return s.swapcase()

C:
invert_case(s);

wchars are poorly supported and in most cases compiler dependent. They are not safe to use.

>wchar makes my code unicode aware

Attached: 1537975511081.png (211x239, 5K)

>wchars are poorly supported
Not really, they just suck.

>and in most cases compiler dependent
Isn't it mandatory in C90? Or do you mean the width is compiler dependent?

Anyway, it's a poor concept. Always use UTF8, no exceptions.
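One concrete reason UTF-8 travels well: it needs no BOM, because the lead byte alone announces the sequence length. A sketch (utf8_seq_len is a hypothetical helper):

```c
/* How many bytes does a UTF-8 sequence occupy, judging only by its
 * lead byte?  Returns -1 for a continuation byte or invalid lead. */
int utf8_seq_len(unsigned char lead)
{
    if (lead < 0x80)          return 1; /* 0xxxxxxx: plain ASCII   */
    if ((lead & 0xE0) == 0xC0) return 2; /* 110xxxxx                */
    if ((lead & 0xF0) == 0xE0) return 3; /* 1110xxxx                */
    if ((lead & 0xF8) == 0xF0) return 4; /* 11110xxx                */
    return -1;                           /* 10xxxxxx or invalid     */
}
```

Since continuation bytes are unmistakable, a decoder can also resynchronize after corruption, which byte-order-dependent encodings like UTF-16 can't do as easily.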

UTF-32 was the only good encoding.

That's retarded. You don't indicate the BOM anywhere. Enjoy your noncompatibility with different endiannesses.

enjoy your non-portability and over 9000 hours of debugging

neither does Java's java.lang.String yet it has no issues.

Because the JVM is specified to be big endian, you fucking dolt.

Sounds like every C program ever, so why is it different?

char swapcase(char c)
{
    if (c >= 'a' && c <= 'z') return c - 'a' + 'A';
    if (c >= 'A' && c <= 'Z') return c - 'A' + 'a';
    return c;
}

>I program GUIs and webpages all day so unicode is what I base all my languages off of

Attached: 8nRqoXW.jpg.png (800x729, 48K)

c ^= ' ';
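That XOR trick works because in ASCII, upper- and lowercase letters differ only in bit 5 (0x20, which happens to be the space character), so XOR with ' ' toggles case. It needs a guard, though, since applied to non-letters the XOR just mangles them. A sketch (swapcase_ascii is a made-up name):

```c
#include <ctype.h>

/* Toggle the case of an ASCII letter by flipping bit 5; leave
 * everything else untouched.  ASCII-only, of course. */
char swapcase_ascii(char c)
{
    return isalpha((unsigned char)c) ? c ^ ' ' : c;
}
```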

English dictionaries, both American and British, contain hundreds of words with non-ASCII characters.

How many of those words are actually spoken? Right, none. Next.

>Pretending to be a kernel dev to hide the lack of understanding the basics of unicode and its usefulness of string parsing
>call him a GUI dev using a GUI in a webpage, t-that'll show him

Attached: 1537771566378.png (1200x1400, 502K)

Why don't you go complain to the ASCII table that it doesn't support them?

A bit different because it's counterintuitive, requires C99, is an extra PITA when porting to C++ (which doesn't have flexible arrays), and memory management for that kind of structure is prone to errors.

>cliché C tard damage control

Unless your program needs to parse individual characters that may contain unicode there is literally no reason to care about unicode. C is not a string manipulation language. It's a system's language.

another good example is Arrays of Length Zero

>It's a system's language.
>can't even set the direction flag
LMAOOOOOOO

BTFO. How will Linus and the Linux kernel ever recover. unicodelets BTFO.

Sorry, that kind of language is not compatible with CoC

Can't spell CoC without C

>Unicode symbol in SSID causes kernel panic
lmao

You can intermix ASM in C so yes you can ;).

but instead you could inline asm into Rust

Lisp, the most powerful language in the world, does not have these issues.

That's ASM, not C, and it's not even standard C
haha C tards are the ultimate laughing stock

>it's counterintuitive
No.
>requires C99,
But it's frequently used in pre-C99 code as well; it's all over the Linux kernel, for example, in the form of arrays of length zero. Also, so what? C99 is almost 20 years old.
>extra PITA when porting to C++
Why would you write it in C if you plan on porting it in immediate future?

BTFO. How will Linus and the Linux kernel ever recover. unicodelets BTFO.

>C is not a string manipulation language
Yet flex, bison and yacc are implemented in C.

>implying C++ is any better in this regard

Attached: 1536063434957.jpg (1000x1000, 80K)

*sips* Ah yes, now Rust is truly a systems language. It even has utf-8 support by default.

Attached: file.png (245x233, 65K)

Sorry, that kind of language is not compatible with the CoC

Can't spell CoC without C

>implying flex, bison and yacc are C

Sounds pretty BASED AND REDPILLED to me.

>choosing classes for next semester
>successfully avoid java

Attached: celebrate.png (865x526, 36K)

Learning C.

God, what a disgusting language. It's like every step of it was designed to cause problems, and you have to go out of your way to write boilerplate.

And the tired old argument of "B-b-but it's how the machine REALLY works, it's a wrapper for assembly" doesn't even apply, since the compiler will basically completely restructure your code anyway.

Enums aren't even type safe, what the fuck. Wait, nothing is type safe. Fuck.

Attached: Im3D0GwHaCw.jpg (400x400, 29K)

OH NO NO NO NO NO OH NO NO NO NO NO OH NO NO NO NO NO

you just need to work in industry for a couple of years and you'll understand

Linux isn't C. It's a hodgepodge of GNU C, ASM and whatever you're supposed to call the macro-ridden hellscape every nontrivial C project becomes.

en.wikipedia.org/wiki/Berkeley_Yacc
>ANSI C
en.wikipedia.org/wiki/GNU_Bison
>C and macros
en.wikipedia.org/wiki/Flex_(lexical_analyser_generator)
>C

>being happy you successfully avoided a job

>Enums aren't even type safe, what the fuck. Wait, nothing is type safe. Fuck.
use sepples lmao

Anyone here with experience regarding emulating limited color / fixed palette graphics?

I just keep getting shit performance.

Is it better to just store the images as grayscale 32bit because graphics cards are retarded and most formats are deprecated?