/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?

Last thread:

Attached: 1555987863042.jpg (1280x720, 738K)

Other urls found in this thread:

en.wikipedia.org/wiki/ANSI_escape_code
twitter.com/AnonBabble

this post has nothing of value

Attached: 0b3.png (534x400, 188K)

nth for Nim!

sorry, i'm trying my best.

Attached: bocchi.jpg (596x682, 277K)

C++ is the most powerful programming language.

What is the possible use case of an object of class
IteratorFactoryParameterSetterGeneratorMutexWatcherThreadPool

?

Attached: .jpg (1280x720, 434K)

impressing your manager with useless shit code

this

if anybody lets you get away with obvious trash like that, then they deserve everything that happens to them.

Fuel for the barbecue fire.

I don't know the first thing about operating systems research beyond containers, microkernels and capability-based privilege models. What are some good resources to get up to date?

it's a factory that produces iterators for containers of generators that create parameter setters that watch the mutexes of a threadpool
so it's probably used everywhere in your typical enterprise java program

Anyone ever use sizeof(char) when allocating strings just to maintain memory safety?

>memory safety
sizeof(char) is guaranteed to be 1 by the C standard.
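If you want the compiler to spell that guarantee out for you, a one-line sketch (C11):

_Static_assert(sizeof(char) == 1, "sizeof(char) is 1 by definition");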

Eh, figured as much, but I just wanted to clarify.
Thanks, user!

If safety is your concern, you should avoid using sizeof(T) as much as possible and use sizeof *ptr, e.g.
struct foo *ptr = malloc(sizeof *ptr);

Usually, I'll only malloc if I need a mutable array of objects that will outlast the scope of a function (say, if I wanted to return it from that function).
So, something like this:

struct foo *my_foos = malloc(10 * sizeof(struct foo));
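And the same allocation written with the sizeof *ptr style from above would look roughly like this (struct foo and make_foos are just placeholder names for the sketch):

#include <stdlib.h>

struct foo { int x; };

struct foo *make_foos(size_t n)
{
    /* sizeof *my_foos stays correct even if the element type changes later */
    struct foo *my_foos = malloc(n * sizeof *my_foos);
    if (my_foos == NULL)
        return NULL;    /* malloc can fail, so check before using the result */
    return my_foos;
}

The caller still has to free() it once it's done with the array.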

What language should I go on to learning next now that I’ve mastered python and C++?

You haven't "mastered" C++. Literally nobody has.
Now, you've certainly wasted a lot of time, so just forget all that useless nonsense and get good at C.

>mastered C++

Attached: 1540810595735.png (677x373, 151K)

Holy shit

>I’ve mastered python and C++?
you have not mastered C++, no one has
Mastering C++ is not a goal, it is an unending process
it is the journey of enlightenment
you have a long way to go, you do not even yet know how much you do not know

If you become one of the 500 people to master C++, the C++ standards committee mails you a check for $100k and an invite to help draft C++25

Didn’t know what other term to use, but besides C what would be a good language to learn? Or are these two languages good enough combined with bash?

depends what you want to do.

Well I’m not very interested in gaming development, and I’ve been avoiding .NET since I’m not interested in Windows development. I mainly just want to have a third language under my belt so that I’ll be prepared for a decent range of dev jobs once I graduate next semester.

How to clear screen in C?

jobs is vague too, java is useful to know.

C just writes to stdout. Clearing the screen is done by the terminal itself, which interprets the escape codes you write to stdout.

Am I a code. Does God want to code me?

Attached: db946a737182e4d4ea9307ef8bfa3cfeb91ac5eec2a8f85df3223801a57e1772.jpg (400x400, 19K)

depends on the terminal and OS. windows has its own win32 console functions, on everything else it's an ANSI escape code.
en.wikipedia.org/wiki/ANSI_escape_code
printf("\033[2J");

I'm so new I don't even know what that means.
There's no "clrscr;" or something to clean the program after scanning a lot of variables?

>grades can't be negative
I got two -20s in a high school physics class. Don't underestimate your users.

Just keep writing newlines until history limit is reached

based

>printf("\033[2J");
do I need a specific library for that to work?
Also, which library do I need to make C print á é í ó ú â ê î ô û à è ì ò ù etc?

baseado e redpillado

Look up ncurses (or one of the curses ports for Windows). If you're trying to do a lot of terminal things, that's the standard way. For extended characters, look up wide characters and Unicode/UTF-8.
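If you go the curses route, the minimal shape looks something like this (compile with -lncurses; just a sketch):

#include <curses.h>

int main(void)
{
    initscr();               /* take over the terminal */
    clear();                 /* wipe curses' view of the screen */
    printw("hello, world");  /* draw something so the clear is visible */
    refresh();               /* push the changes to the real terminal */
    getch();                 /* wait for a keypress before tearing down */
    endwin();                /* restore the terminal to its previous state */
    return 0;
}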

i + a b - + - *^ `!~3~$a -> z && ./bin/bash

thoughts on my new language? It's more efficient than Jazzy

If you're using any terminal emulator on gnu/linux, bsd or osx it should work. Windows is the weird one.
>Also, which is the library to make C print á é í ó ú â ê î ô û à è ì ò ù etc?
Unicode is weird in C because C has no strings. Although if you're only on not-windows, it's all UTF-8 and should just work. If not, your terminal is misconfigured or your font doesn't have the glyphs.
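A minimal sketch of the not-windows case (source file saved as UTF-8, terminal in a UTF-8 locale):

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* to printf these are just bytes; the terminal decodes them as UTF-8 */
    const char *s = "á é í ó ú ã õ";
    printf("%s\n", s);
    printf("%zu bytes\n", strlen(s));   /* counts bytes, not characters */
    return 0;
}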

>C has no strings
le ebin maymay!

Those aren't strings, they're 0 terminated byte arrays. It can't even deal with characters outside of ASCII.

simply eric, bro! I love this one!
I'm going to post this on r/programmerhumor!

why did microsoft think utf-16 was a good idea

They started by assuming that a fixed width of 16 bits would be enough (UCS-2), and then for some reason decided to go from that to UTF-16 instead of from 8 bits to UTF-8.

but why did they think that was smart

Why do sea and sepples shills talk like they’re mentally ill?

I guess the paradigm was already "8 bits for ASCII, 16 bits for more than ASCII" and they wanted to keep that going.

>not programming in .NET IL

Because they implemented UCS-2 in NT, not UTF-16. The latter was an afterthought.

Also, UCS-2/UTF-16 has the best memory/performance ratio if you have to deal with common non-Latin scripts. UTF-8 is in an awkward position where those languages end up using a lot of 3-byte sequences, whereas UCS-2 (or UTF-16) only needs 2 bytes per character.

UCS-2/4 and UTF-16/32 having larger code units also allows aligned reads/writes, but in UTF-8 you have no choice but to parse single bytes - and you have to deal with all the ways UTF-8 streams can be broken, and there are a lot of them. Almost every single Unicode related CVE is because of someone somewhere mishandling UTF-8. On the flip side, you don't have to deal with endianness.
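A rough way to see the size argument, using C11 string literal prefixes (byte counts exclude the terminators):

#include <stdio.h>
#include <uchar.h>   /* char16_t */

int main(void)
{
    /* the same three CJK characters in both encodings */
    const char     u8s[]  = u8"日本語";  /* 3 bytes per character in UTF-8  */
    const char16_t u16s[] = u"日本語";   /* 2 bytes per character in UTF-16 */
    printf("UTF-8:  %zu bytes\n", sizeof u8s - 1);                 /* 9 */
    printf("UTF-16: %zu bytes\n", sizeof u16s - sizeof(char16_t)); /* 6 */
    return 0;
}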

Because UTF-8 wasn't the standard yet. Microsoft tried to be forward thinking, but was TOO forward thinking that time. So they were already too invested in UTF-16 by the time it was clear UTF-8 had won.

I'm new to Assembly and I'm working on a simple program right now where I essentially need to modify this code, which uses the stack to add 3 numbers, to take 2 entered strings and combine them. Looking for some tips to point me in the right direction, I thought it'd be as simple as replacing ReadInt with ReadString and whatnot but it hasn't been working so far.

Attached: Untitled.png (202x527, 9K)

My wife/daughter Bocchi is so fucking cute.

>This meme again

Attached: 1546899632844.jpg (750x1000, 355K)

Why does seeing anime women making faces piss me off and make me horny at the same time?

Dumb Ctard knowing absolutely nothing.

The proper way to clear the screen is to write the appropriate escape codes to stdout; the terminal interprets them.
Look them up.

Ah yes good old ReadInt and ReadString, those functions that every assembly language programmer definitely has heard of

my bad, forgot to mention it's the Irvine library

>What are you working on, Jow Forums?
I'm a beginner. What should I do as a first project?

Attached: 1490335052899.png (816x587, 116K)

A -> T skip A

T c A * T c' B -> T (c | c') (A * B)

T c (T c' A) -> T (c ; c') A

Write a multitasking kernel.
If you can't do that as your first project then you should just give up programming.

In languages with actual string types and memory safety, UTF-8 eats pussy like nobody's business
Endianness is a motherfucker if you're receiving shit over the network, and most of the time you don't even have the neat UTF-16 pseudo-header bullshit to help you out

Depends on how "beginner" you are.

If you can write a working fizzbuzz, then you should move on to Project Euler and solve the first 10 or so problems.
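For reference, a working fizzbuzz in C looks roughly like this (the usual 1-to-100 version, just as a baseline):

#include <stdio.h>

int main(void)
{
    for (int i = 1; i <= 100; i++) {
        if (i % 15 == 0)
            puts("FizzBuzz");
        else if (i % 3 == 0)
            puts("Fizz");
        else if (i % 5 == 0)
            puts("Buzz");
        else
            printf("%d\n", i);
    }
    return 0;
}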

>make me horny
I can understand this part.
>piss me off
Can't understand this part.

They're just so smug looking.

Yes, but I don't understand why that pisses you off.

This sounds really difficult. Are you making fun of me?
Alright, I'll do that. After that, should I just keep going with the list? The first ten look easy 2bh.

more for the patricians among us
!f: i j k, i j + > k. `k#j ~ k * i. i& < j k + i - @2

>This sounds really difficult. Are you making fun of me?
Yeah. It's the whole reason we talk about Linux as an OS instead of GNU

Actually I misread what that user wrote. Disregard my post.

What are lambdas?
Why should I care about lambdas?
What is the best syntax you have seen in programming to express a lambda?

Attached: Lamda.png (225x225, 2K)

i am programming something that will fetch a random post from Jow Forums and make an image out of it: the post text plus the posted image. actually a user already did exactly that, but i got jelly and wanted to do it on my own.
so far i have been able to fetch the text of a random post; the only thing remaining is fetching the posted image. i picked this project to learn python.
comparatively, that user's program was very nicely written and mine looks like shit and amateurish. oh well, as long as it werks

>After that, should I just keep going with the list?
Go as far as you feel you can manage, skip ahead if you want, doesn't really matter.
They should just give you a better feel for your toolset and the specific ways to solve computational problems therewith.

>What are lambdas?
Upside down y
>Why should I care about lambdas?
They only matter in Australia.
>What is the best syntax you have seen in programming to express a lambda?
[](){}

>
i dont think theres any valid sepples lambda with that

-std=c++2a

Every time I posted Pascal code and asked for help, people would come up and talk shit about it. Now that I'm learning C I can't understand why anyone would want to use it when something as simple as printing "ã" is so convoluted.
In Pascal you just print it. It just werks.
English is my second language. My mother tongue and the third language I'm learning both use a lot of accents, and I feel retarded writing without them.
Fuck C. I'm still gonna learn it, though, because I need to.

Attached: download.jpg (209x241, 10K)

printf("ã\n"); just werks for me.

>studied c++ when i was in my adolescence (always was an introvert)
>cant fucking stop declaring my variables as type
I almost lost a test because of this shit FUCK

>>cant fucking stop declaring my variables as type
what did he mean by this

Forgive me lord for I have sinned. And on the Sabbath of all days.

Attached: xmljava.png (1386x993, 93K)

>What are lambdas?
ask the greeks
>Why should I care about lambdas?
church wrote a funny algebra once
>What is the best syntax you have seen in programming to express a lambda?
\x -> f x

You already sinned when you used Java and Android.

he means you're a fucktard kys

>church wrote a funny algebra once
I did have quite a laugh at it too when I learnt it.

That's what I meant though.

Attached: 1462082471526.gif (380x238, 1.92M)

>\
He said best not worst.

I meant
variable = int
I know, you don't do that shit in C, you do
int variable = 0
But for some fucking reason I can't stop doing that in python, it's fucking my shit up.

What the fuck is that creature.

Thoughts on Objective-C?

you're not trying hard enough. you have to search for the code snippet you need, because nobody tells you what to do after you watch 50 hours of tutorials of people saying the same thing over and over, never addressing the problem you have...

then you have to search forum posts and github for answers... then you have to cobble some spaghetti code together from random acceptable syntax that MonoDevelop doesn't hate, and then say you're a dumb faggot this ain't code, and then run it and pray that you didn't just skip 2 nights' worth of sleep for nothing, because at some point on day 1 you forgot to go piss but you kept drinking your mountain dew code red and now your spleen may explode

Reminder that there's nothing wrong with a little bit of bloat.

Attached: 1531562114159.jpg (960x720, 451K)

If it's not detrimental, it's not bloat by definition.

fuck singletons

Lambdas, user: anonymous functions, functions without a name.

Lambdas are shorthand for sending functions to other functions.

Haskell has the syntax /var var1 -> var + var1

*\

Ok so representing AND, OR, NOT, and XOR is relatively simple:

AND -> '&'
OR -> '|'
NOT -> '!'
XOR -> '^'

but what about NAND, NOR, and XNOR? How do i represent these in a way that isn't painful to read?
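one idea i had (just a sketch, the names are made up): reuse '!' as a prefix and make the negated forms two-character tokens, so NAND is "!&", NOR is "!|" and XNOR is "!^":

#include <stdio.h>

enum op { OP_AND, OP_OR, OP_NOT, OP_XOR, OP_NAND, OP_NOR, OP_XNOR };

/* each negated form is the base operator prefixed with '!' */
static const char *op_str(enum op o)
{
    switch (o) {
    case OP_AND:  return "&";
    case OP_OR:   return "|";
    case OP_NOT:  return "!";
    case OP_XOR:  return "^";
    case OP_NAND: return "!&";
    case OP_NOR:  return "!|";
    case OP_XNOR: return "!^";
    }
    return "?";
}

int main(void)
{
    printf("NAND is %s, NOR is %s, XNOR is %s\n",
           op_str(OP_NAND), op_str(OP_NOR), op_str(OP_XNOR));
    return 0;
}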

so one expression for it could be lambda b -> function(x) + b ?

whoops i think you meant ∧, ∨, ¬ and ∃!∈{,}

that last one doesn't work but whatever

Attached: 10262051_10202271916658945_1058480794194592846_n.jpg (600x611, 53K)