/dpt/ - Daily Programming Thread

What are you working on, Jow Forums?

It's anime girl reading SICP time!
NO THOTS ALLOWED!

Previous thread:

Attached: 1446700057072.png (724x674, 491K)

Other urls found in this thread:

sarabander.github.io/sicp/
bitbucket.org/snippets/Tetsumi/ke6jL4/sicp
sicp.moe/
tonsky.me/blog/disenchantment/
paste.pound-python.org/show/Vox3RKLD08CXMVbXRV50/
jetbrains.com/clion/
twitter.com/NSFWRedditGif

>NO THOTS ALLOWED!
I had my toenail removed, so I basically can't walk and am bedridden, so I've just been taking Adderall and solving the Google Foobar thing I got invited to

Neck yourself you fucking idiot
Stop programming

func exchange(n int) int {
    return n/10 + (n-(n/10)*10)*10
}

>too hard

The fucking Northwind database image has 78 bytes of magic (read: a garbage OLE header) at the front.

var ms = new MemoryStream();
ms.Write(buffer, 0, buffer.Length); // not an image
ms.Write(buffer, 78, buffer.Length - 78); // it's an image
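The same trick sketched in Python, assuming the 78-byte OLE offset quoted above and a hypothetical dump of the photo column to a file (file names are made up):

# drop the 78-byte OLE wrapper; what's left should be the plain image stream
with open("employee_photo.bin", "rb") as src:    # hypothetical raw column dump
    raw = src.read()

with open("employee_photo.bmp", "wb") as dst:    # .bmp is an assumption about the payload
    dst.write(raw[78:])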

wouldn't that just be n/10?
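For what it's worth, a quick Python rendering of the same arithmetic suggests it isn't just n/10: the function swaps the last two digits of a two-digit number (assuming non-negative input).

def exchange(n):
    # n // 10 is the tens digit, n % 10 is the ones digit promoted to tens
    return n // 10 + (n % 10) * 10

print(exchange(42))   # 24
print(42 // 10)       # 4, which would just drop the last digit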

Writing a (((real))) OS on x86_64 or designing my own CPU/Computer with VHDL (Most of the CPU is already done) and my own language/interpreter/compiler/whatever?

Sure, that's one way.
But unless you need to run this for millions of people billions of times a second, a simple string allocation for a toy program is meaningless.
It's just another way to solve the problem.

Structure and Interpretation of Computer Programs is the greatest programming book ever written. Uncle Bob's trilogy comes second.

Attached: guidosicp.png (601x623, 100K)

PS: pong more or less just werks. Still a few bugs, but who cares. I'll fix them someday.

>2009
i bet python wasn't even diverse back then

webm related

Attached: wedidit2.webm (1280x720, 2.93M)

>VHDL
isn't VHDL considered obsolete technology now? I think everyone has moved to SystemVerilog.

Attached: carmack_sicp.jpg (659x505, 87K)

It's the same fucking thing.
You can even you both in the same project. It doesn't matter which language you are writing in since it will get synthesized to the same hardware

>even you
%s/even you/even use/g
Is it worth buying it? Or should I just read it online?
47 yuropbucks is quite a lot

That says nothing about VHDL not being obsolete though.
Verilog is a better language to write in.

Which one is better:
>while(1+1)
>while(true==true)
>while(!(!true))

online or download
sarabander.github.io/sicp/
bitbucket.org/snippets/Tetsumi/ke6jL4/sicp

Attached: reading_sicp.webm (1280x720, 1.13M)

>Verilog is a better language to write in.
No. That's just taste. Also VHDL is a lot more popular in yurop (even though it's a muricuck language kek)

while(1)

while(const auto ever = true)

for {}

>That's just taste
And my taste is objective

forever :: Monad m => m a -> m b

is europe relevant at all in cpu design?

# PYTHON
very new. many questions.

>1
assuming you're running the software in a terminal, how to block typed characters from showing up on the screen?
I've used "getpass" for this to block input as someone types, but once you're out of it then characters written will yet again be echoed on the screen.

>2
how to count the length of a string as someone types? I would like to DO THING if len(input) >= VALUE, but I would like it to evaluate the length of the string without the user having to press Enter.

>3
how to load garbled random byte-input (for example, from /dev/urandom) as a string? I've tried to throw some decode() on it and such, but it isn't compatible.

>4
if you'd like the user to make different choices, and repeatedly get back to the same "menu" or input-function-place after having the previous input evaluated, should you simply use a while loop?

>5
how to block the echoing for "subprocess.run()"? I just want to define a variable as the output of the subprocess commands, and that works, but the output is printed out in full every time. should I maybe define a function for it, without a "return" to block output?

>6
if I want to repeatedly get a new value of a variable, for example z = 2x, should I then just make a function like the following and then just call it when I have a new value for x - or is there another preferred way? when I define variables inside functions it seems to work sometimes, and sometimes not. should I type "return z" in the end?
def z(x):
    z = 2x
    return


>7
how to make a function that prints 1 dot on the same line, with 1 second in-between, and 3 or 4 dots in total?
I tried to do something like

def dots():
    for n in range(0,2):
        time.sleep(1)
        print(".", end="")
        n=n+1

or similar. I get all the dots printed after several seconds. how to make them come at an interval? should I use an internal for loop in the function?

Attached: 1534260915512.jpg (427x507, 44K)

>hdl is only used for CPU design
????

You can design any hardware that needs logic gates with that shit, nigger.

Am I doomed to be a monkey if I struggle to understand DFS? The curriculum now includes minimum k-cut and other things, but I feel I will never be able to catch up.

What should I do? I'll be able to pass by cheating, but at this point I'd only be fooling myself.

Attached: 15372557314340.png (480x800, 202K)
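If it helps, DFS itself is small; a minimal iterative sketch in Python, assuming the graph is a dict of adjacency lists:

def dfs(graph, start):
    # explicit stack instead of recursion; always visits the most recently
    # discovered node next, which is exactly depth-first order
    visited, order, stack = set(), [], [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        stack.extend(graph.get(node, []))
    return order

print(dfs({"a": ["b", "c"], "b": ["d"], "c": [], "d": []}, "a"))   # ['a', 'c', 'b', 'd']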

>8
how to clear the screen while running a python program, similar to "clear" in bash?
I'd like to clear the screen and fill it up with new input restricted to the size of the window, for every "choice session".

>9
what's the most efficient way for python to deal with long lists of numbers, say +10 million of them? arrays or lists?

thanks in advance

>tfw the chances of me getting my dream job are greatly reduced if I don't move to America

Why it be like this bros. I just want to do cool programming shit. Fuck this gay earth

Just do yourself a favour and use ncurses already.
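For the clear-screen question (>8) specifically, a minimal sketch that works without curses, assuming an ANSI-capable terminal; curses is the sturdier option if the whole UI grows:

import os
import sys

def clear_screen():
    # ANSI escape: erase the screen and move the cursor to the top-left corner
    sys.stdout.write("\033[2J\033[H")
    sys.stdout.flush()

# or just shell out to the platform's own command
os.system("cls" if os.name == "nt" else "clear")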

we are talking about cpu design, tard,
>designing my own CPU/Computer with VHDL ()

What about logic doors or logic windows?

>It's anime girl reading SICP time!
sicp.moe/

>CPU design
>cool programming shit

> (You)
>we are talking about cpu design, tard,
>>designing my own CPU/Computer with VHDL ( (You))
;-)

I'm not that guy and I wasn't referring to CPU design.
It's just everything in general. Anything tech is so much further ahead in America

>1
check keyboard buffer.
if a key was pressed, check if it's valid.
if it's valid, proceed.
if it's not valid, ignore that keypress.
Not a Python programmer, but I guess the idea is similar.
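A minimal POSIX-only Python sketch of that idea (it covers >1 and >2): termios turns off echo and line buffering so each keypress arrives immediately and nothing is printed; max_len is just an illustrative cutoff, and the function name is made up.

import sys
import termios

def read_hidden(max_len):
    # read keypresses without echoing them, stopping at Enter or max_len
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    new = termios.tcgetattr(fd)
    new[3] = new[3] & ~(termios.ECHO | termios.ICANON)   # lflags: no echo, no line buffering
    new[6][termios.VMIN] = 1                             # block until at least one char
    new[6][termios.VTIME] = 0
    chars = []
    try:
        termios.tcsetattr(fd, termios.TCSADRAIN, new)
        while len(chars) < max_len:
            ch = sys.stdin.read(1)
            if ch in ("\n", "\r"):
                break
            chars.append(ch)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)    # always restore the terminal
    return "".join(chars)

# e.g. secret = read_hidden(8)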

>Anything tech is so much further ahead in America
no. Do not overgeneralize everything like a brainlet retard

>It's just everything in general. Anything tech is so much further ahead in America
You should do like I did then, and make something interesting and put it on Github.

I have heard of it, but hadn't considered it... I'll look it up, thanks.

cheers, I'll try that.

Yamamoto Kumoi

tonsky.me/blog/disenchantment/

interesting

Attached: software_development_2x[1].gif (698x880, 41K)

There are definitely good opportunities where I am, I'm not saying no to that. But there's no way they're as good. If I get lucky I will be able to find some smaller business that does some very specific (and interesting) thing here. But there just aren't the massive 1000+ employee companies that are exclusively dedicated to doing cutting-edge tech here.

>1
put the checking keyboard buffer in a loop

I agree with you, which is why I posted my way of getting discovered by an american startup. I literally just made an interesting project and put it on github and was found that way.

What does cutting edge tech mean for you?

Yurop is in fact doing a lot of cutting-edge tech and murifats depend on it. It just might not be appealing to your interests, but it is there.

>Anything tech is so much further ahead in America
ARM, STMicroelectronics, Ingenic, MediaTek, Lantiq, ... not everything is based in the USoA

Yeah, I will definitely be doing this. Happy for you!

For me personally it's robotics. I always found automation and robotics to be the coolest thing, ever since I was very young.

yeah I realize I'm definitely making large generalizations here.

>For me personally it's robotics. I always found automation and robotics to be the coolest thing, ever since I was very young.
But that's not everything. Are you retarded?
That's just a single topic.

Alright I'm sorry, chill my friend

Attached: commander.jpg (4011x2256, 1.01M)

Attached: rob pike on go.png (876x291, 27K)

...

Is Nim worth investing in?

no

any arguments?

lmao

Attached: Screenshot from 2018-09-25 06-37-27.png (136x40, 2K)

what are you struggling with?

Sounds bullshit to me

All of them will evaluate the condition on every iteration; best to use for with no condition at all.

It isn't, though. I checked the hardware with memtest86, which proved it. And I wasn't using ECC memory.
After I replaced the buggy module with a new one, the crashes stopped.
It's not technically a bug in the usual sense, but these things can happen, and when they do - and I had never come across such a thing - it was a pain in the ass to find out what was causing it.

so what's the best high-performance, memory-safe language with comfy syntax?

inb4 go

Attached: aboriginal syllabics.jpg (730x398, 50K)

Explain to me how with several gigabytes of memory and only partly bad memory, malloc() would reliably give you memory from the same physical location without there being any other signs of bad memory, such as programs randomly failing or even the kernel crashing or random errors occurring when compiling or flushing the disk cache.

Not that guy, but I don't believe you either.

>high performance memory safe language with comfy syntax?
doesn't exist
they all have shit syntax

Go

I had 2GiB of RAM on that machine; this happened somewhere in the mid-to-late 2000s.

After I found out about the problem I decided to turn off the machine exactly because I was afraid it'd corrupt my filesystem or other apps. Luckily I had backups, but I didn't want to risk the trouble. This machine was mostly used - if not *only* used - for development of that particular app.

Clojure

Crystal and Nim syntax is quite comfy.
Nim is getting uglier over time, but it's still miles better than C++.

>this happened somewhere in the mid-to-late 2000s.
Slightly more believable if this was a system without ASLR/PIE, but still, buggy memory would have caused a shitload of different symptoms you haven't mentioned, so I still don't believe you. Also, if the machine was used for development, I would expect that you'd compile on it too, or at least frequently copy files to it, which would have provoked the bug.

Hello xie. How is your fight against linux patriarchy going. BTW, these are real tits.

Attached: tits.jpg (520x340, 79K)

>crystal
Fuck Ruby syntax
>nim
Fuck Python syntax and no case sensitivity

I don't recall if it had ASLR/PIE enabled, but I want to say yes. Either way, that was 2003 hardware.

Yes, the usual code - compile - run - test (with crashes when testing) cycle was very very frequent. Well, I don't know what else to say, believe what you want, but this happened to me. And to this day I'm still convinced it was a bad RAM module. Maybe I was very lucky it was just that specific area? I don't know.

Either way it was a very enlightening experience, I never thought a hardware problem could leak to the software side like this.

please use this so I can pretend it's not useless
paste.pound-python.org/show/Vox3RKLD08CXMVbXRV50/

>syntax
out of all the issues out there Jow Forums somehow manages to bitch about the most superficial problem at all times.

>4
While loop if the options and menu depth are limited and your loop doesn't exceed ~1 monitor page; otherwise write a state machine (a small while-loop sketch follows after this post).
>5
What are you trying to do? Can you not use actual Python calls for this?
>6
return with no arguments returns None. Here are two ways to do it properly:
def z1(x):
    return 2*x

def z2(x):
    z = 2*x
    return z
Use the former for simple one-liners, the latter for occasions where the assignment is more complex.
>7
for already iterates over your elements automatically, so the n=n+1 is useless. Also, the dots all show up at once because stdout is line-buffered; pass flush=True (or call sys.stdout.flush()) so each dot appears as it's printed. Here's what you are probably trying to do:
def dots(numDots, sleepTime):
    for n in range(numDots):
        time.sleep(sleepTime)
        print(".", end="", flush=True)   # flush so the dot shows immediately
Side note: use spaces. n = n + 1 or n += 1.
Avoid magic numbers, use method arguments or at least named "constants" (I'm not sure Python has actual constants but you could define all-caps variables and make it a rule to never write to them beyond the initial assignment).
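A minimal sketch of the while-loop menu mentioned under >4 (the options and actions here are made up):

def menu_loop():
    options = {"1": "show status", "2": "roll a die", "q": "quit"}
    while True:
        for key, label in options.items():
            print(key + ") " + label)
        choice = input("> ").strip().lower()
        if choice == "q":
            break                                # leave the loop; the program continues after it
        elif choice == "1":
            print("status: everything is fine")  # placeholder action
        elif choice == "2":
            print("you rolled a 4")              # placeholder action
        else:
            print("unknown option, try again")

# menu_loop()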

take your choice of any lisp

A multi-threading question: there are many ways to implement double-buffered data. I can use tryAcquire on a lock to determine which buffer to write to/read from, or use an atomic bool. Which way is best?

Addendum:
Why even make a function that does nothing but print dots and waste time? What exactly is the point of doing this? I get that it's a waiting animation, but all you're waiting for in your example is for the interval to pass... which is completely pointless. Unless you're doing some multithreaded thing, and even then you'd be better off using wakelocks.

>9
Not sure; scipy (or numpy, which it builds on) is probably better optimized for that.
Try not to iterate over the elements in plain Python loops.
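If the numbers are all the same type, a numpy array is the usual answer at that size; a small sketch (numpy is a third-party dependency, and the 10 million random values are just a stand-in for your data):

import numpy as np

rng = np.random.default_rng()
values = rng.random(10_000_000)        # one contiguous, typed block of memory

# vectorized operations instead of Python-level loops
total = values.sum()
above = values[values > 0.9]           # boolean-mask filtering
scaled = values * 2.0                  # elementwise arithmetic
print(total, above.size, scaled.mean())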

Is there anything more painful than C++?
>manually rewriting functions/methods in the header and impl file, making sure they match by hand. Make sure you don't forget about the header file if you changed anything in the impl file
>manually add files to CMake, if you change the name of the file you have to find the corresponding file name in your CMakeList
>Manually manage your resources
>Millions of pitfalls in templates and generics
>Millions of pitfalls in move semantics and r value refs
>Millions of pitfalls in iostream and who knows what
>Write ~5 different constructors for each class, make sure you don't forget about overloading operator=
>Manually write scripts to download, resolve dependency circle and deploy dependencies for your program/library
>No testing framework, make sure you use gtest or catch2 to do trivial testing
>Make sure you run your program through ASAN and Valgrind to see if your resource is leaking or if you are using an invalid reference
>All this work and you don't have a docgen tool. Instruct CMake about the doxygen config you set up somewhere in your project. Doxygen-generated documentation looks like a page straight out of the late '90s
>Even though the language is one of the biggest to date, still lacks some very basic libraries like formatting a string. So you have to install boost. Don't forget to specify your boost version in CMake and where and how to download it in your build script.
>Be surprised by that commit someone pulled in your repo that uses some template black magic you never knew about
>Surprise others by pushing that one hack/trick that consists of template magic no one else in your project knows about
>Enjoy compile times of around 2 hours to test your debug build
>Something went wrong? I'm sure your 6 pages' worth of STL error messages will help you.

inb4 >C is better
lol no it isn't. I'd rather write ASM

Attached: 1535935817857.png (1280x720, 258K)

saved

>manually rewriting function/methods in header and impl file
>manually add files to CMake
>All this work and you don't have a docgen tool
Sounds like something that a good IDE with plugins would be able to do; this is not something the language should be concerned with.
>Be surprised by that commit someone pulled in your repo that uses some template black magic you never knew about
git gud scrub
>Enjoy compile time of around 2 hours to test your debug build
Think first, THEN write. Cuts down on number of compiles.

>the current year
>he's STILL using a language without modules

>lisp
>high performance
>comfy syntax

>black magic
this is why i stay away from c/c++

Attached: 1537745334464.gif (480x271, 976K)

This is why Nim is the patrician choice

But that's what makes programming fun!
As long as you don't have to debug someone else's black magic, of course.

Do you know what a "library" is?

>manually rewriting function/methods in header and impl file, make sure they match by hands. Make sure you don't forget about the header file if you changed anything in the impl file
The overwhelming majority of functions are static/file-local.
If you forget the header file you get a compiler error.
I develop the implementation first and make sure I have a decently solid use case, then make the header if someone needs it.
And besides, it's way more work updating all the call sites than the header file anyway, so who cares about that little extra work.

>manually add files to CMake, if you change the name of the file you have to find the corresponding file name in your CMakeList
Who the fuck uses cmake?
Unity builds with simple bat/batch files work for 99.999999999% of all use cases; only if you have million-line code bases ported to a dozen platforms do you need sophisticated build systems.
Why make things hard for yourself?

>Manually manage your resources
A known and valid tradeoff.

>Write ~5 different constructors for each class, make sure you don't forget about overloading operator=
Or write 0.

>Manually write scripts to download, resolve dependency circle and deploy dependencies for your program/library
Dependencies are evil in every language, they should be painful to use so you keep them to a minimum.

Lisps are reasonably fast and have very comfy syntax.

Author of that post here. I only care about languages that have a recent GTK3 binding. Nim has one, called gintro.
Problem is that it has no API reference. Which means the API is useless. Which means the language is useless. I've come to notice that gintro doesn't even build with the current stable Nim 0.18. Useless.

>how about the language
Well it mostly solves all the issues in the post, but it has been a decade and a half and the language devs failed to come up with v1.
Nim has no destructors; it's been postponed to v2.
Last I checked they are trying to rework their error handling.
Nim is forever bound to be a prototype language.

>Dependencies are evil in every language, they should be painful to use so you keep them to a minimum.
Not in Rust. Too bad the language itself sucks major ass.

>no classes
There is no denying that C++ is an OO language; might as well use idiomatic C++ rather than join the pseudo-intellectual C++-without-classes cult. You cannot avoid classes in C++, even if you want to.

>Unity builds with simple bat/batch files
I don't use a non-POSIX OS. Supporting Windows is none of my concern either; I pretty much tell people to fuck off and use WSL every time I get issues regarding this.

>why not shell scripts
Ever tried building a project with a non-trivial dependency? No? Then you'll never know the joy of not rebuilding needlessly, which cuts down compile time dramatically.

>inb4 >C is better
>lol no it isn't. I'd rather write ASM
As someone whose job requires writing a lot of ASM: errr... no, you absolutely would not.

Feels like I'm in heaven when I get to use C.

so what's your language of choice when performance is the priority?

What is a good C++ IDE for Linux? Currently using Qt Creator, but I hate setting projects up for it.

Honestly? Don't know.
In fact I've given up on my general performance concerns. I just want a blazing first startup time and cost free FFI.

This is why i use Rust

>i use Rust
prove it

>Not in Rust. Too bad the language itself sucks major ass.
Yes in Rust. Thousands of "easy to use" libraries of unknown quality that may not even work properly; it's a nightmare.
>There is no denying that C++ is an OO language, might as well use idiomatic C++ than joining the pseudo intellectual C++ without classes cult. You cannot avoid Classes in C++, even if you want to.
Yes you can.
Templates and constexpr as meta programming is a nice extra over C.
Can make nice little 'DSLs' for yourself that you would need external code generators for in C.

>Ever tried building project a non trivial dependency? No? You will never know the joy of not rebuilding needlessly which cuts down compile time dramatically.
Then the problem is the non-trivial dependency - adding another non-trivial dependency (a huge, complicated build system) isn't a fucking solution.
But modern computers can compile tens of thousands of lines in a fraction of a second, so even non-trivial dependencies aren't really an issue these days.
If it actually hinders your edit-compile-debug cycle then you can easily just write some ad hoc scripting to not compile that big piece all the time and do a "duality build".

>Is there anything more painful than C++?
C

>>Make sure you run your program through ASAN and Valgrind to see if your resource is leaking or if you are using an invalid reference
Null pointer dereference is probably the most common runtime error in JVM-based languages, and modern C++ has smart pointers.

>What are you trying to do?
check the state of some things on my local machine. the output of subprocess is to be saved temporarily, and analyzed.

>Can you not use actual Python calls for this?
I don't know, can I?

>the n=n+1 is useless
well, I had no idea. today I learned something new.

>Why even make a function [...]
well, I simply wanted something that explicitly pointed out that time passed, and then reference that as part of a story.

this is all very helpful. thank you so much.
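On the subprocess question (>5): the output shows up because the child process inherits the terminal's stdout; capturing it keeps the screen quiet and hands you the text to analyze. A minimal sketch (Python 3.7+; the df command is only an example):

import subprocess

# capture_output=True routes the child's stdout/stderr into the result object
# instead of the terminal; text=True decodes them to str
result = subprocess.run(["df", "-h"], capture_output=True, text=True)

disk_report = result.stdout            # nothing was echoed to the screen
if result.returncode != 0:
    print("command failed:", result.stderr)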

jetbrains.com/clion/