/dpt/ - Daily Programming Thread

What are you working on Jow Forums?

Previous

Attached: 1538823994802.jpg (1000x1500, 215K)

for me, it's zig

Attached: zig zig.png (2490x587, 32K)

I wrote my own Lisp and got experience in language building. Now I want to make a javascript-like language that isn't garbage. I think JS has a good heart that's hidden away by all the crappy parts. Basically I want to fix all the shortcomings of JS.

Would you use something like that?

It already exists and it's known as the Dart programming language. Dart is what Go would have been if it had been designed by PLT literates.

>using cargo clippy for the first time

Attached: 1538905976183.png (518x478, 282K)

I only write enough JS to augment a page, and when I do anything heavier I lean on some libraries, so honestly a transpiler is a bit too cumbersome for my liking, automatically run or not. Still shouldn't stop you though

JavaScript rocks!

Attached: unknown.png (1000x494, 369K)

this tbqh

this but ironically

Dart is OK and a nice evolution towards stricter typing, but I want my language to go the opposite way. No classes of any kind, just prototypes and Object.create.

I don't really want it to be a transpiler or a web-based thing. I was thinking about Linux scripting.

My main rage point with Node is how there are no threads. Every async callback gets run sequentially on the one event loop, and if it's even remotely complex it will bottleneck the entire fucking thing. So what people end up doing is cutting their code up into little bits and pieces, almost as if it were continuation-passing style, and then letting the event loop "schedule" things.

I want to have threads communicating with each other through the event loop.

Daily reminder that __auto_type, statement expressions and nested (local) functions make GNU C an acceptable lisp.
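Roughly what that buys you, if anyone hasn't seen it (these are GCC extensions, compiles with gcc but not standard C):

#include <stdio.h>

int main(void)
{
    /* __auto_type: the compiler infers the type, like a let binding */
    __auto_type x = 40;

    /* statement expression: a brace block that yields a value, like progn */
    __auto_type y = ({
        int a = 1, b = 1;
        a + b;
    });

    /* nested (local) function: defined inside main, closes over x */
    int add(int n) { return x + n; }

    printf("%d\n", add(y)); /* prints 42 */
    return 0;
}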

>No classes of any kind, just prototypes and Object.create.
that's a shortcoming of JS

I thought Linux was supposed to be better for programming, but all the great reverse engineering programs are on Windows.

>My main rage point with Node is how there are no threads
devdocs.io/node/worker_threads#worker_threads_class_worker
dumb user

all the X programs are for Windows
programming is text editing, you can do that on any operating system

ew

Zig seems cool. I looked into it a while ago but haven't actually tried it out. What are you using it for?

>3pdp op
trash

I'm not sure you could call that multithreading, since each thread gets its own isolated environment and you communicate between them using channels.

>JS but not shit
It's called Elm.
elm-lang.org/

>Dahyun
>trash
the hell are you smoking

>koreans
>any higher than trash

Currently learning about the switch-case structure in (embedded) C and working on a toy lighthouse control program.
LightChar.type is an enum, and I'd like to call the LightSequence_Flashing() routine for certain light types without an if-else chain full of ||'s.

switch (LightChar.type) {
/* Flashing light type */
case FLASHING:
case OCCULTING:
case ISOPHASE:
case LONG_FLASHING:
case QUICK_FLASHING:
case VERY_QUICK_FLASHING:
case ULTRA_QUICK_FLASHING:
    /* lamp control sequence */
    LightSequence_Flashing();
    break;

/* Constant light */
case FIXED:
    LampControl(ON75);
    break;

/* Error */
default:
    LampControl(OFF);
    break;
}

Attached: all-your-base-take-off-are-belong-to-us-every-13043632.png (500x454, 119K)

I got multitasking working on Cortex-M4. It's cooperative atm but I'm pretty sure the way I'm doing it should be easily portable to preemptive.

Attached: Screenshot_20181012_160907.png (1920x1200, 437K)

How would you insert items into a linked list alphabetically?

Attached: 1539112785124.jpg (500x367, 22K)

By using my brain to solve this homework on my own :)

Learning C. Should I use gcc or clang?

doubly linked or not?

>trying to debug ARM program with GDB
>current instruction is an SWI
>do stepi, expecting to land at 0x08
>program suddenly resumes normal control flow and gets stuck in an infinite loop somewhere else
what am I doing wrong?

It makes very little difference, especially while you're still learning. I've heard people say clang has better error messages, but I find it's all the same.

I use gcc because it's what I've used for the past 10 years.

no importance, just be sure to compile with -std=c11 -pedantic-errors

IME Clang works better on Windows than GCC, but if you're on Linux it doesn't matter

I would go with GNU tools in general to learn, just because of the manuals. Say what you want about GNU but their manuals are usually pretty good.

The glibc one especially. After you learn the basics, skim through this and you'll get a good idea of what's available to you in libc and how to go about doing certain things.

gnu.org/software/libc/manual/html_node/index.html

It does have GNU-isms and some people really hate GNU but this is a very good manual.

I've had the opposite experience: gcc works pretty much flawlessly (I've used both the msys2 and TDM distributions), but clang has had a tendency to fail to properly detect its own standard headers and libraries.
But then again, using anything but the official MS tools on Windows is a fucking shit show.

I thought clang was just a compiler and used an external libc (e.g. glibc on Linux).

>caring about programming enough to argue about language or framework
>not just chasing the money

You're probably right (just did a quick Google), but it still has issues on Windows, at least in my experience. Might be some PATH issue, but when it started fucking up for me I just went back to gcc, which has never given me any issues.

Holy shit, how will C ever recover?
reddit.com/r/rust/comments/9nnzeo/wow_rust_version_10x_faster_than_c/

A big nasty fuckoff wad of Template Haskell to derive instances of a "higher-rank functor" typeclass for tons of very boring "business domain" types. This lets us parameterize the types over a functor type in which they store their data (e.g. Maybe, Identity, some domain-specific ones which capture additional metadata) and then use natural transformations to go between concrete representations.

t. employed Haskell programmer

That could be great until you become a 40-year-old programmer who never took an interest in his job and doesn't enjoy his work day.

The money is drying up with all of these CS grads though. Our company had TWICE as many entry-level applicants this year compared to last year. It's quickly becoming minimum wage work.

fuck off reddit

>C
>0.05 real 0.02 user 0.00 sys
>839680 maximum resident set size
>Rust
>0.31 real 0.19 user 0.08 sys
>154705920 maximum resident set size
>I get that Rust is slower than C, but here, it is 6 times faster. Also, usually, in my tests Rust is faster than Swift.

>Holy shit, how will C ever recover?
what did you mean by this?

>what did you mean by this?
I was shitposting and edited the link title to look like a sensationalist Rust success story when it is actually a complete failure.

how subversive of you

>Why could this Rust code be slow?
Hey man, just like trust the compiler, you don't need to worry about anything.

The OP of the thread even says he already knows that Rust is just slower than C

Wait, it's landed? I thought it was in perpetual limbo. Does it launch separate event loops?

No it's not. Being able to string up objects that easily is the main reason the language is easy to use.

building an order book, these edge cases are killing me

GCs can do memory compaction. Native languages with manual memory management can't really do that, or at least implementing it would suck.

Attached: TWICE TV 'Dance The Night Away' EP.03-EDCAWjXHhZM-[02.15.035-02.21.174].webm (1280x720, 2.89M)

By memory compaction you mean defragmenting the allocation space? Obviously with manual memory management you really can't do that without moving possibly large amounts of memory around and somehow updating every pointer into them.

How does a GC go about doing this efficiently?
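My naive guess is it's one of two things: a tracing collector already knows where every reference lives, so it can rewrite them as it moves objects, or the runtime adds a level of indirection so only one table entry has to be patched per object. A toy sketch of the indirection idea in C (made-up names, not how any real GC is implemented):

#include <stddef.h>
#include <string.h>

/* Code holds handles (indices) instead of raw pointers, so the
   compactor is free to slide the underlying storage around. */
#define NSLOTS 256

struct slot { char *ptr; size_t size; };

static struct slot table[NSLOTS]; /* handle -> current location */
static char heap[4096];           /* bump-allocated backing store */
static size_t heap_top;

int handle_alloc(size_t size)
{
    if (heap_top + size > sizeof heap)
        return -1; /* full: a real system would compact or grow here */
    for (int h = 0; h < NSLOTS; h++) {
        if (table[h].ptr == NULL) {
            table[h].ptr = &heap[heap_top];
            table[h].size = size;
            heap_top += size;
            return h;
        }
    }
    return -1; /* out of handles */
}

void *handle_deref(int h) { return table[h].ptr; } /* every access goes through the table */
void handle_free(int h)   { table[h].ptr = NULL; } /* leaves a hole until compaction */

/* Slide live objects down in address order; only the table gets patched,
   which is the whole point of the indirection. */
void compact(void)
{
    size_t top = 0;
    for (;;) {
        int next = -1;
        for (int h = 0; h < NSLOTS; h++) /* lowest-addressed object not yet moved */
            if (table[h].ptr != NULL && table[h].ptr >= heap + top &&
                (next < 0 || table[h].ptr < table[next].ptr))
                next = h;
        if (next < 0)
            break;
        memmove(&heap[top], table[next].ptr, table[next].size);
        table[next].ptr = &heap[top];
        top += table[next].size;
    }
    heap_top = top;
}

The obvious cost is an extra pointer chase on every access, which is presumably why serious collectors prefer to trace and rewrite references directly.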

what do you do when a language dies? e.g. luajit

Luajit is an implementation, not a language.
Lua is not dead.

>mfw interviewed for a codemonkey position today and they literally asked me to do fizzbuzz

Attached: 354deaa3770912621bb816da070346ab.jpg (258x245, 12K)

LuaJIT is a fork of Lua. They aren't compatible in the latest versions and never will be again.

singly linked

could you just swap() the data in one node with the other or is it more complicated than that?

the last LuaJIT update was last year and there have been no new Lua versions since then, as far as I know
what are you talking about?

and they say /dpt/ isn't practical.

It involves searching the list for the insertion position while keeping a reference to the previous node, so you can update its ->next pointer.

LuaJIT was created based on Lua 5.1 and there are no plans to support Lua 5.3.
meanwhile the Lua 5.4 development version can already be downloaded.

Something like this in C, not tested.

struct node *node; // the node to insert, already allocated and filled in
struct node *curr = NULL, *prev = NULL;

// linear search: stop at the first node that sorts at or after the new one
for (curr = list; curr != NULL; curr = curr->next) {
    if (compare(node, curr) <= 0) // e.g. strcmp on the keys
        break;

    prev = curr;
}

node->next = curr;

if (prev)
    prev->next = node;
else
    list = node; // new node becomes the head

lua.org/work/doc/manual.html#8
it's nothing
you can start worrying if they move to lua 6

I don't know what I should be doing. I wrote fizzbuzz as optimized as I could in C++ and python and I don't feel like I could improve it much. I need to learn about classes and stuff but I don't know where to start.

Me on the left.

you are comparing 5.4 to 5.3, not 5.1.

Have you considered reading a book on python or C++?

reminds me of this
youtube.com/watch?v=1PhArSujR_A&t=24m30s

it looks like the timestamp doesn't work with the embedded version, so skip to 24m30s

I don't really do game dev but isn't scanning through 10 MiBs of memory to update references for GC on every frame bad?

I don't doubt Carmack, he's a very programmer. He seems to pass it off as though it were nothing though, I don't get it. Am I missing some context?

I'm currently trying to solve the problem shown in pic related. I just discovered this inaccuracy in floating-point arithmetic (specifically addition and subtraction) not too long ago, and I think I found a solution using long addition/subtraction, but the class is still in testing.

Attached: reeeeeeeeeeeeeeeeeeeeeeeeeeee.png (1152x884, 37K)

the solution is using integers
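e.g. if the values only need a fixed number of decimal places, keep them scaled up as integers and only format them as decimals for output. Quick C sketch assuming two decimal places:

#include <stdio.h>

int main(void)
{
    /* doubles: 0.1 + 0.2 is not exactly 0.3 */
    printf("%.17f\n", 0.1 + 0.2);                /* 0.30000000000000004 */

    /* scaled integers: work in hundredths, so 0.10 + 0.20 is exact */
    long a = 10, b = 20;
    long sum = a + b;
    printf("%ld.%02ld\n", sum / 100, sum % 100); /* 0.30 */
    return 0;
}

(Negative values need a little extra care when formatting, but the arithmetic itself stays exact.)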

>I just discovered this inaccuracy in floating-point arithmetic
what made you think floating point arithmetic was accurate?

Please be honest. Is SICP a meme?

Use floor or ceiling

>isn't scanning through 10 MiBs of memory to update references for GC on every frame bad?
no
games process hundreds or thousands of megabytes of memory per frame

Actually, after watching the video, I didn't quite understand the context here.
He's basically wrong. He says games have only a few megabytes of state outside of constant resources, but even in 2013 when the talk was given that wasn't exactly true. id games have always had fairly simple gameplay, but try examining the state of a fairly complex open-world game or strategy game and they're going to have much bigger heaps than he's suggesting.

yes but still a good book

>he's a very programmer
lel, anyways, I imagine you could do some form of automatic custom allocation at a low level to help with some of this, like reducing the number of syscalls and keeping things on the same cache lines. You'd have to be careful not to take up too much time, although taking time is only bad if a frame goes over 16 ms or whatever the budget is. He talks about using discipline to make sure the large static data never needs to be copied, so the GC only deals with the large number of small allocations. If you actually want to know whether it's tenable you should look for benchmarks or try to make a small game yourself though.
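For a concrete picture of what low-level custom allocation usually means here: the standard building block is an arena (bump) allocator, where you grab one big block up front, hand out pieces by bumping an offset, and throw everything away at once at a known point like the end of a frame. A minimal sketch in C, names made up:

#include <stddef.h>
#include <stdlib.h>

/* One big block carved up by bumping an offset. No per-object free:
   everything allocated from the arena dies together at reset time. */
struct arena {
    unsigned char *base;
    size_t cap, used;
};

int arena_init(struct arena *a, size_t cap)
{
    a->base = malloc(cap); /* one big allocation up front */
    a->cap = cap;
    a->used = 0;
    return a->base != NULL;
}

void *arena_alloc(struct arena *a, size_t size)
{
    size_t off = (a->used + 15) & ~(size_t)15; /* keep 16-byte alignment */
    if (off + size > a->cap)
        return NULL; /* out of space, caller decides what to do */
    a->used = off + size;
    return a->base + off;
}

void arena_reset(struct arena *a)   { a->used = 0; }                   /* "free" everything at once */
void arena_destroy(struct arena *a) { free(a->base); a->base = NULL; }

Per-frame scratch data goes into something like this and gets dropped with a single reset, so there's no heap churn to scan and no fragmentation, which I think is roughly the discipline he's describing.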

>yes but still a good book
You recommend it for learning how to program?

1. Do you want to learn Scheme & some math
Yes -> read SICP
No -> Pick any other book

I didn't know that before, because I never really played around with math in programming before getting assigned this one project in uni. Now I do

Yes, sussman is a brilliant language designer, and you might as well learn from the best.

Ganbatte

Attached: 1432150075245.jpg (897x647, 484K)

I want to learn Python for scientific computing. Some guys recommended it to me because it would teach some important programming concepts. I guess I got memed.

>should i read a book on Scheme to learn python
the absolute state of you brainlets.
And no, don't listen to cultists.
SICP does most everything in Scheme, it is not a "general" book.

Well of course I wouldn't learn python with scheme, come on now. I got some books from Jow Forums wiki.
Thanks anyway

Python will teach you to wrangle APIs, which is a different skillset from the learn to build up from the bottom type of skillset that SICP teaches.

wisdomandwonder.com/link/2110/why-mit-switched-from-scheme-to-python

Costanza asked Sussman why MIT had switched away from Scheme for their introductory programming course, 6.001. This was a gem. He said that the reason that happened was because engineering in 1980 was not what it was in the mid-90s or in 2000. In 1980, good programmers spent a lot of time thinking, and then produced spare code that they thought should work. Code ran close to the metal, even Scheme — it was understandable all the way down. Like a resistor, where you could read the bands and know the power rating and the tolerance and the resistance and V=IR and that’s all there was to know. 6.001 had been conceived to teach engineers how to take small parts that they understood entirely and use simple techniques to compose them into larger things that do what you want.

But programming now isn’t so much like that, said Sussman. Nowadays you muck around with incomprehensible or nonexistent man pages for software you don’t know who wrote. You have to do basic science on your libraries to see how they work, trying out different inputs and seeing how the code reacts. This is a fundamentally different job, and it needed a different course.

So the good thing about the new 6.001 was that it was robot-centered — you had to program a little robot to move around. And robots are not like resistors, behaving according to ideal functions. Wheels slip, the environment changes, etc — you have to build in robustness to the system, in a different way than the one SICP discusses.

And why Python, then? Well, said Sussman, it probably just had a library already implemented for the robotics interface, that was all.

text editor poll continues
strawpoll.me/16604780

Attached: to prawda.jpg (363x720, 72K)

I kinda get this image, when I'm sitting in classes I find myself thinking that I just want to program. And when I'm relaxing, my mind drifts towards programming, ie generating and comparing forms, watching them dance.

W-Wow... What a blast from the past.

Doesn't work. All it does is round the double to the nearest whole number. I want to accurately subtract and add floating points, not turn them back into integers

you can't do that with floats, use rational numbers if you want it to be more precise
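i.e. keep an exact numerator/denominator pair and only convert to decimal at the very end. Bare-bones C sketch (no overflow handling, just the idea):

#include <stdio.h>

struct rat { long num, den; };

static long gcd(long a, long b) { return b ? gcd(b, a % b) : a; }

static struct rat rat_add(struct rat x, struct rat y)
{
    struct rat r = { x.num * y.den + y.num * x.den, x.den * y.den };
    long g = gcd(r.num < 0 ? -r.num : r.num, r.den);
    if (g) { r.num /= g; r.den /= g; }
    return r;
}

int main(void)
{
    struct rat a = { 1, 10 }, b = { 2, 10 }; /* exactly 0.1 and 0.2 */
    struct rat s = rat_add(a, b);
    printf("%ld/%ld\n", s.num, s.den);       /* prints 3/10 */
    return 0;
}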

>In 1980, good programmers spent a lot of time thinking, and then produced spare code that they thought should work
they still do that now you pretentious fuck

I did work out a way, but it involves simulating long addition and subtraction by performing the desired operation on each represented digit in the double. Looks like that bullshit from elementary school was put to good use after all.

should I worry if my three.js game can't be played by another user?

that's cool, good luck with that

Programmers still do everything you do in SICP though. Even if you're a slave working on some Salesforce DBMS, you'll still be asked to come up with a system that can automate something, and then you reach for SICP so you can write the most generic, reusable function your language of choice allows, or something crazier like playing around with SML signatures: define one signature and all your later updates match it perfectly.

In the much-lamentable Old Days, when fewer programmers could get at the source code, it had higher quality and was better to learn from. As more people with less commitment to quality and much less attention to detail got involved in writing it, its educational value diminished, too. It is like going to a library full of books that took 50 man-years to produce each, inventing a way to cut down the costs to a few man-months per book by copying and randomly improving on other books, and then wondering why nobody thinks your library full of these cheaper books is an inspiration to future authors.
-Erik Naggum

Trying to stop being a retard

Attached: codility.png (400x400, 13K)