/dpt/ - Daily Coding Thread

What are you working on Jow Forums?

Attached: 1448851456787.jpg (1625x1117, 552K)

Other urls found in this thread:

facebook.github.io/react-native/docs/getting-started
coursera.org/learn/algorithms-part1/
coursera.org/learn/algorithms-part2/
github.com/carp-lang/Carp

Waiting for

Fields in traits

Higher Kinded Types

Procedural macros

Const generics/pi types.

Attached: rustlogobig.png (660x660, 82K)

What is C used for other than embedded programming?

C++ has shared_mutex
C++ has recursive_mutex
C++ does not have shared_recursive_mutex
So I'm jacking code from stackoverflow to make one as fast as possible.

Or am I just retarded, and the fact that I need this means my code is poorly designed?
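The semantics I'm after are at least easy to state; here's a rough sketch of them in Python, just because it's short (illustrative names, not production code; the classic caveat applies that two readers trying to upgrade at the same time will deadlock):

import threading

class SharedRecursiveMutex:
    # Sketch: shared (reader) side and exclusive (writer) side,
    # both reentrant for a thread that already holds the lock.
    def __init__(self):
        self._cond = threading.Condition()
        self._readers = {}    # thread ident -> shared (read) depth
        self._writer = None   # ident of the exclusive holder, if any
        self._write_depth = 0

    def lock_shared(self):
        me = threading.get_ident()
        with self._cond:
            # Block only if someone ELSE holds it exclusively and we
            # are not already a reader (recursive reads are fine).
            while self._writer not in (None, me) and me not in self._readers:
                self._cond.wait()
            self._readers[me] = self._readers.get(me, 0) + 1

    def unlock_shared(self):
        me = threading.get_ident()
        with self._cond:
            self._readers[me] -= 1
            if self._readers[me] == 0:
                del self._readers[me]
                self._cond.notify_all()

    def lock(self):
        me = threading.get_ident()
        with self._cond:
            if self._writer == me:        # recursive exclusive lock
                self._write_depth += 1
                return
            while self._writer is not None or any(t != me for t in self._readers):
                self._cond.wait()
            self._writer = me
            self._write_depth = 1

    def unlock(self):
        with self._cond:
            self._write_depth -= 1
            if self._write_depth == 0:
                self._writer = None
                self._cond.notify_all()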

This is not a canonical /dpt/ thread because OP doesn't use an anime picture. Here is the official anime picture for /dpt/.

It is used wherever you want to use it.

Attached: idolmaster_sicp.png (1165x1740, 2.49M)

can anyone familiar with SDL tell me why pressing and holding a key triggers an SDL_KEYUP event?

I've tried checking for event.key.repeat, but this does nothing
void KeyManager::keyReader(SDL_Event event) {
    if (event.type != SDL_KEYDOWN && event.type != SDL_KEYUP) return;
    if (event.key.repeat) return; // ignore OS key-repeat events
    int i = indexOf(keys, event.key.keysym.sym);
    if (event.type == SDL_KEYDOWN) {
        if (i < 0) { // key not tracked yet: register it
            keys.push_back(event.key.keysym.sym);
            printv(keys);
        }
    } else if (i >= 0) { // SDL_KEYUP: untrack it (assumed; the original paste cut off here)
        keys.erase(keys.begin() + i);
        printv(keys);
    }
}

More tinyscheme stuff: wrote a script that compiles the scheme files and the interpreter into a single binary that can be executed directly. Also implemented some things for it, like fold-right and while loops. Next up is optimizing some procedures I use in a weird way, like its append.

$ ls
bin caesar-cipher.scm embed.scm file-io.scm init.scm tinyscheme-template tinyscheme-template.tar utils.scm
$ ldd bin
linux-vdso.so.1 (0x00007ffe6a3bf000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f3e55da7000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f3e55aa3000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f3e55704000)
/lib64/ld-linux-x86-64.so.2 (0x00007f3e561c3000)
$ ./bin
0 and -26: ebiil
1 and -25: dahhk
2 and -24: czggj
3 and -23: byffi
4 and -22: axeeh
5 and -21: zwddg
6 and -20: yvccf
7 and -19: xubbe
8 and -18: wtaad
9 and -17: vszzc
10 and -16: uryyb
11 and -15: tqxxa
12 and -14: spwwz
13 and -13: rovvy
14 and -12: qnuux
15 and -11: pmttw
16 and -10: olssv
17 and -9: nkrru
18 and -8: mjqqt
19 and -7: lipps
20 and -6: khoor
21 and -5: jgnnq
22 and -4: ifmmp
23 and -3: hello
24 and -2: gdkkn
25 and -1: fcjjm
$
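
(The demo brute-forces the ciphertext "ebiil" through every backwards shift; the plaintext drops out at 23. For comparison, the same loop in Python, shifting lowercase letters back by n with wraparound:)

def shift_back(text, n):
    # Rotate each lowercase letter n places back, wrapping at 'a'.
    return "".join(
        chr((ord(c) - ord("a") - n) % 26 + ord("a")) if c.islower() else c
        for c in text
    )

for n in range(26):
    print(f"{n} and {n - 26}: {shift_back('ebiil', n)}")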

Where can I find a bunch of pre-categorized sample images for training my ML program?

I'm getting started with writing apps.
I'm going through React Native:
facebook.github.io/react-native/docs/getting-started
Seems nice for simple stuff (which is what I'm after): it's open source, and I only have to refresh my "old" JavaScript to write apps for both Android and iOS.
Going through the guide, they ask you to install "Expo" on your phone in order to test your project.
The thing is, although React Native is open source, Expo asks for A LOT of permissions on my phone.
Is this normal because it genuinely needs those permissions for testing my projects, or is this the botnet smiling in front of you while its friends get ready to wreck you from behind?

Doing some Python practice in Codewars because I can't sleep

I really like list comprehensions. They're fun.
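e.g. half the katas boil down to a one-liner like this:

squares = [n * n for n in range(20) if n % 2 == 0]   # squares of the evens below 20
print(squares)  # [0, 4, 16, 36, 64, 100, 144, 196, 256, 324]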

I'm trying to figure out how to make my graph less computationally demanding. As it is now, every data point I add takes as long as the previous one plus a bit more, so the total cost grows quadratically; after ~1000 points the phone I test on takes nearly a minute to add the next point. I'm using GraphView for Android. Looking for feedback.

So far my possible solutions that I know how to do are:

1. Add a button to wipe the data and adjust the X minimum to the last data point. (This lets you continue without huge lag, but you obviously lose the information.)

2. Change the data collection method so that it samples at a longer interval: instead of capturing every point, keep one in ten (rough sketch below). (This ultimately hits the original problem; after long enough it slows down again. It also reduces granularity by the decimation ratio.)
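
Rough sketch of what I mean by 2, in Python rather than the actual Java/GraphView code (names made up), plus a fixed-size window so the per-point cost stays bounded:

from collections import deque

DECIMATE = 10                  # keep one sample in ten
window = deque(maxlen=1000)    # old points fall off the far end automatically
counter = 0

def on_sample(value):
    # Called for every raw sample; only every tenth one is kept, and the
    # deque caps how many points a redraw ever has to touch.
    global counter
    counter += 1
    if counter % DECIMATE == 0:
        window.append(value)
        redraw(window)

def redraw(points):
    pass  # stand-in for pushing `points` to the real chart series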

I'm a student and just had an assignment where I needed to use C to take console input and print out a worm on a branch depending on the input. The first input was the maximum worm length and also the branch length. The second input was the string to be eaten by the worm.

Every worm starts out looking like this (with a max of 8 in this example):
~OG
========

Certain characters that it ate would change the worm. "o" would make it grow an extra "o", "s" would make it shrink, "-" would make it move forward, and "x" would kill it.

The worm had to wrap around the branch if it grew or moved forward. The tricky part was that we were not allowed to use arrays (other than the array from the console input) or any C functions other than atoi and strlen.

I had fun doing it. It wasn't too hard a project, although it is the beginning of the semester. I feel like my code isn't very well optimized, though: I have a bunch of if statements and a print function full of more if statements. I gave the head "G" and the tail "~" each an int value that changed based on the input.

Everything works fine, but I feel like my code is bloated and awful. My assignment is already turned in and past the deadline, so: does anyone have a better idea of how to do this?
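
Since the worm is fully described by its head cell and its length, you can get rid of most of the if-forest by keeping just two ints and deriving every cell at print time. Rough Python sketch (names mine, rules as you described them: "o" grows up to the max, "s" shrinks, "-" moves forward, "x" kills, wrapping at the branch end; which end grows is my assumption):

def run_worm(branch_len, food):
    head, length = 2, 3        # starts as ~OG, head on cell 2
    for c in food:
        if c == "o":           # grow, capped at the branch/max length
            length = min(length + 1, branch_len)
        elif c == "s" and length > 1:
            length -= 1        # shrink
        elif c == "-":
            head = (head + 1) % branch_len   # move forward, wrapping
        elif c == "x":
            break              # dead; stop eating
    draw(branch_len, head, length)

def draw(branch_len, head, length):
    row = ""
    for cell in range(branch_len):
        back = (head - cell) % branch_len    # how far this cell is behind the head
        if back == 0:
            row += "G"
        elif back == length - 1:
            row += "~"
        elif back < length - 1:
            row += "O"
        else:
            row += " "
    print(row)
    print("=" * branch_len)

run_worm(8, "oo--")   # grow twice, move twice: prints ~OOOG on the branch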

Is there a decent place to get the quick and dirty of Python? I've mainly done shit in C# so far.

lol is that what CS programs are actually like?

Attached: .jpg (1280x720, 588K)

The instructions for this assignment were about 20 pages long. I just gave a really condensed version. There are more restrictions and requirements. My professor is crazy, but also really cool. I don't know why we are doing this though. He says this is just an intro and things will get harder.

Going on a long flight soon. What's a good Python book to read for beginners if there are any?

doing C for the first time in my degree, what a fucking boring language. do people really shill it?

It's extremely practical for low-level stuff or applications where you need to have full control over memory allocations and memory accesses.

>do people really shill it?
only the ones who are PLT-illiterate.

>not posting rustle girls
>not posting anime girls

post animated gif demo of your assignment

>what a fucking boring language
You're gonna have a problem with literally every single other language if that is the case.

I'm waiting for async/await and impl specialization.

why? not every programming language is terrible at abstraction.

So not enough abstractions = boring? That was not clear from your first post.

yes, the lack of abstraction techniques makes writing code tedious and redundant.

Is there a library that handles downloading files from the web and can resume downloads? Something like aria2, but embeddable; libcurl is pretty bare-bones.
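
Depends how much you need, but the resume part is just an HTTP Range request; you can see the whole mechanism in a few lines of Python stdlib. Rough sketch, not a robust client (a real one must check for a 206 Partial Content response, since a 200 means the server ignored the Range and sent the whole file):

import os
import urllib.request

def resume_download(url, path, chunk_size=65536):
    # Ask the server to start from wherever the local file left off.
    start = os.path.getsize(path) if os.path.exists(path) else 0
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-"})
    with urllib.request.urlopen(req) as resp, open(path, "ab") as out:
        while chunk := resp.read(chunk_size):
            out.write(chunk)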

C is the most prevalent language in open-source software worth giving a shit about.

Unironically is Javascript though

Will I actually get a job if I learn machine learning?

If you are self-taught, maybe at some startup or small company that doesn't really know what they're doing. To the best of my knowledge, serious places require relevant degrees.

For school I'm writing an inventory management system with JavaFX.

In my spare time I wrote a zero-external-dependency neural network with a genetic algorithm. My next project, which I'm just now starting, is using Q-learning to play Pokemon.

good post.

Attached: 1529283638901.png (1277x1190, 2.2M)

Which coursera courses have you taken?
Which do you recommend?

Where could I learn about design patterns in C++? Visitors, observers, etc. Books, resources?

Hi guys.
This is my best accomplishment so far.

for i in range(100):
    print(i)

How am I doing?

>Prefers anime
>Not based Gondola
Y tho

This.
I want to get into Python seriously, and C or C++.
Where to start?

followed

I'm not sure. Java provides a higher level of abstraction than C, and yet it is far more verbose and boilerplate-y.

holy SHIT buds, it took a lot of toiling and bugfixing and googling, but we did it: we have a functioning A* algorithm. it's 5 am, time for bed

Attached: astaractuallyfuckingworks.webm (1684x899, 358K)
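
nice. for anyone else toiling at 5 am, the core of A* fits on one screen; minimal Python sketch (grid nodes as (x, y) tuples so the heap can tie-break on them):

import heapq

def astar(start, goal, neighbors, h):
    # neighbors(n) yields (next_node, step_cost); h is an admissible heuristic.
    open_heap = [(h(start), 0, start)]
    best = {start: 0}                 # cheapest known cost to each node
    parent = {start: None}
    while open_heap:
        _, g, node = heapq.heappop(open_heap)
        if node == goal:              # reconstruct the path
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        if g > best[node]:
            continue                  # stale heap entry
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best.get(nxt, float("inf")):
                best[nxt] = ng
                parent[nxt] = node
                heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None                       # no path

def grid_neighbors(p):
    # 4-connected 10x10 grid, unit step cost
    return [((p[0] + dx, p[1] + dy), 1)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= p[0] + dx < 10 and 0 <= p[1] + dy < 10]

print(astar((0, 0), (9, 9), grid_neighbors, lambda p: (9 - p[0]) + (9 - p[1])))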

definitely not. take generics or object inheritance, for example.

I'm making my first Android app. Users won't have usernames and passwords, I want them to authenticate once using their phone number and that's basically it.
When sending requests to the server, they need to identify themselves. Should I just give them a token (some hash of their phone number with my salt or whatever) and have them send it with each request? If not, what's the proper model?
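
A token works, but don't make it a bare unsalted hash of the number: anyone who can guess the scheme can forge it. Either issue a random token server-side and store it, or sign the phone number with a server-side secret (HMAC), which keeps the server stateless. Minimal sketch of the HMAC variant in Python (stdlib only; the secret never ships in the app, and this is just the signing part, not the initial phone verification):

import hmac
import hashlib

SECRET = b"server-side secret, never shipped in the app"

def issue_token(phone_number: str) -> str:
    # Deterministic, and unforgeable without SECRET.
    return hmac.new(SECRET, phone_number.encode(), hashlib.sha256).hexdigest()

def verify(phone_number: str, token: str) -> bool:
    # Constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(issue_token(phone_number), token)

token = issue_token("+15551234567")
print(verify("+15551234567", token))   # True
print(verify("+15550000000", token))   # False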

>tfw actually starting to enjoy programming and it's really starting to click with me
>only regret is I wish I'd done this when I was younger

better late than never I guess

Very nice user

forgot pic related

Attached: 1513821087297.jpg (798x809, 48K)

>my code is poorly designed?
Most probably, but so is everyone else's.

I'm working on my news feed to mail program. Hopefully I can get it to work by the end of the day.

type Gender = String;

type Gender = Boolean

ftfy

enum Gender {
Male,
Female,
None,
Other(String),
}

>async/await
nice JS shit for brainlets who can't grok monads

final Gender gender = GenderSuperFactory.newInstance().createFactory().fromByte(0xFF);

>async/await came from JS

retarded detected

>I'm working on my news feed to mail program
there are already programs/websites that mail you the current news


why don't people research before working on a project?

I like to create and use my own programs.

you made your own OS too? nice

If I had the time and knowledge, yes.

>final Gender
>super gender
triggered

t. buttblasted rustlet
like it or not rust copied it from js. it's the wrong decision and you know it

typedef void gender_t;

agreed.

a simple evolution simulator in python using the pygame module. it's more interesting to me than making loads of pointless games :]

Is there any difference between assigning from a constant value and assigning from a dereferenced pointer?

What I'm asking is: are constants typically implemented as automatic package-scoped variables?

Attached: 040eb70eb5379dbfdc5896c6e6ac8a690426154d6c0853262b0defc724ad0d23.jpg (841x1007, 151K)

void type in c derivatives is retarded desu

Not him, but any feature which requires modification to the compiler is a red flag. It's a hack for a poorly designed language.
They already had futures, which are superior to async/await.

yeah compilers can fold constants
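you can even watch it happen from Python (different language, same idea): CPython folds constant arithmetic when it compiles

import dis

# The bytecode just loads the constant 10: 2 * 3 + 4 was folded
# at compile time, so nothing is computed at run time.
dis.dis(lambda: 2 * 3 + 4)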

>They already had futures, which are superior to async/await.
Async/await is just sugar for Futures, m8.
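You can see the same relationship in Python, for what it's worth: the await form and the explicit chain-a-callback-on-a-future form do the same work (sketch using asyncio):

import asyncio

async def fetch():
    await asyncio.sleep(0.1)
    return 41

async def sugared():
    # async/await form
    return await fetch() + 1

def desugared(loop):
    # Roughly what the sugar expands to: chain a callback on the future.
    result = loop.create_future()
    task = loop.create_task(fetch())
    task.add_done_callback(lambda t: result.set_result(t.result() + 1))
    return result

async def main():
    print(await sugared())                               # 42
    print(await desugared(asyncio.get_running_loop()))   # 42

asyncio.run(main())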

if they really wanted to add nicer syntax for sequencing then they should have gone for do-notation, which is far more general

With additional keywords m8.

shitty sugar for brainlets

Yeah, but it's not a separate entity copied from JS just to have it; it's a logical development of Futures.

>a logical development of Futures
try again brainlet

I don't see why it's a logical development of Futures. Is it to accommodate for the lack of HKTs?

It will simplify writing Future-heavy code.

I know, but WHY does it simplify things over the Future monad? Rust devs are already familiar with monads (Option, Result).

>Is it to accommodate for the lack of HKTs?
Nope, you can make do-notation work by desugaring to and_then and point. It's purely to appease shitlangers by making Rust into another shitlang

Thanks for the term. That's an interesting technique.

Rust doesn't have monads. You can't express them in Rust because it doesn't support HKTs. Option and Result are just datatypes. In better languages you can define a Monad typeclass and instances of it for them.

I'd say: work on HKTs first instead of adding even more sugar to the language.

Enrolled in both coursera.org/learn/algorithms-part1/ and coursera.org/learn/algorithms-part2/ starting today; hope I won't lose interest halfway.

Attached: 1 ErQpF8e8pDOZSlxZBDdt_Q.png (1694x340, 23K)

I don't disagree. I doubt they'll do it, though. See how the HKTs feature request has languished for years with ridiculous excuses for not implementing it ("we haven't come up with the right syntax yet!" NOT THAT THAT STOPPED YOU BEFORE, YOU JUST WENT WITH THE UGLIEST SHIT THAT CAME TO MIND). Remember when the language was initially touted as "OCaml without a GC"? How the mighty have fallen. Now it looks like they're desperately trying to attract webdevs by adding syntactic toys.

Today one of the guys I work with actually wrote a comment BELOW the line it referenced. I mean, seriously, does anybody actually do that?

my god, mother of dependencies
this is your brain on qtboostism

Attached: wt.png (482x664, 28K)

>Now it looks like they're desperately trying to attract webdevs by adding syntactic toys.
That's a shame though. The project really had the potential to become a 'functional C++'.
I liked Rust and put quite a bit of effort into it, but lost hope. The project feels all over the place.
My current bet is on Haskell and this: github.com/carp-lang/Carp

I've been keeping an eye on Carp too. S-expressions are a joy to work with, so a statically typed GC-less language using them could be fantastic. As for Haskell, I think it's been viable for real-world dev for a long time now but a lot of FUD persists about it.

Never seen it. A comment either references the line below it or its own line.

> a lot of FUD persists about it.
1 Gb/s OF GARBAGE

Self-contained code that doesn't require entire communities to maintain a local software environment.

High performance.

Predictable run-time behavior.

Oh look, this canard again. If you manage to make Haskell produce that much garbage then you're writing shitty code, either intentionally or not.

Agreed on the Haskell part. The language itself may never become mainstream, but its features will.

>its features will
Eventually, after much kicking and screaming, and after PL research has moved onto newer things again and again. We may get HKTs in mainstream languages by 2030. Maybe. We'll still have shitlangers telling us that they're simply not practical.

>hey guys
>im gonna make a statically typed functional language
>except actually it's not gonna have much in the way of type system features
>but I'll add an ad-hoc crappy version of some features for one type
>and uh you can use all these OOP libraries from a big OOP language with it
>ofc that means you'll have to essentially write OOP code to deal with them
>thus negating the point of this new language
>and being uglier than just using them from their original language
>and I'll have to add all sorts of holes to the type system
>ffs why are FPers never happy???
>stop bullying me by pointing out shortcomings of my new language
>you just don't understand, the worst of all worlds is a valid tradeoff to make
>I'm simply being p-practical here
>g-go back to your ivory tower
>y-you're all b-banned
>medium post: why is the FP community so toxic??

It's Scala, isn't it?

Kinda a mixture of various languages-which-are-often-claimed-to-be-functional

>hey guys
>im gonna make a statically typed functional language
>except actually it's not gonna have much in the way of type system features
>but I'll add an ad-hoc crappy version of some features for one type
>and uh you can use all these OOP libraries from a big OOP language with it
>ofc that means you'll have to essentially write OOP code to deal with them
>thus negating the point of this new language
>and being uglier than just using them from their original language
>and I'll have to add all sorts of holes to the type system
>ffs why are FPers never happy???
>stop bullying me by pointing out shortcomings of my new language
>you just don't understand, the worst of all worlds is a valid tradeoff to make
>I'm simply being p-practical here
>g-go back to your ivory tower
>y-you're all b-banned
>medium post: why is the FP community so fascistic??
Fixed it for Rust.

Weirdly that picture looks really comfy to me

yellow fever detected

Not the time or place to be a dick, user. Simple projects with lots of reference tools are easy and fun; not everyone is trying to publish.

Not true tho.