/dpt/ - Daily Programming Thread

Old thread: Lisp is the most powerful programming language.
What are you working on, Jow Forums?

Attached: lisp.png (2880x1800, 562K)


auto main() -> int {
auto car = new Car();

return 3;
}

I like C!

now that I'm done revolutionizing C++ documentation forever with , I'm going to remember how to post.

what the fuck are you talking about?

please send help to me girls

Are there any decent web frontends that aren't JavaScript? I just want something cohesive and easy to understand, like Apple's UIKit. I don't want to fuck with outdated technologies like HTML, JS, CSS, etc. Applications on the web should be like applications on the desktop; this shit is retarded.

Attached: 56910288_2263906140595113_4901206960194989976_n.jpg (480x480, 30K)

...

Thanks. I'd actually like to hear from people here though. I'm assuming web developers will have some kind of Stockholm syndrome.

Incr_dom by Jane Street; also you can write functional JS, and with CSS Grid there's no need for frameworks

embrace HTML and CSS. Dirty your hands and face to bloodshed. HTML5 is relatively sane and really represents a victory of pragmatism over idiots trying to create 'semantic webs' and SGML+++ nonsense.
Bite the bullet and learn JS. After you're productive with it, you can consider alternatives that compile to JS.

install Elm

>outdated
>js
mate we're going to have js for another 1000 years from the looks of it, better get learning

Also user, there's webasm now; you can write in C/C++/Rust/anything if you want, though it's just an MVP still

this.

I spent a bit learning CSS Grid which completely replaces every framework n0x400.1mb.site/

Also gained a new appreciation for vanilla JS when manipulating the DOM using a hash table. There's no reason to use any of those awful frameworks or even shadow DOMs if you don't want to.

Greetings and salutations. I was the tard asking about Qt's open-source licence forcing you to release your own source code if you link it statically, but isn't dynamic linking an improvement in every way? I can't find anything that makes static linking better.

As far as I'm concerned, WASM still doesn't have access to the DOM, so you still can't use it for the frontend

slightly easier deployment.
I bet the same restriction kicks in if you deploy a dynamically linked program *and* the Qt libraries it's supposed to dynamically link to.
I bet the same restriction kicks in with AppImage or similar deployment conveniences that bundle Qt and your application together.
If you can rely on an already-installed Qt library, sure, dynamic linking is fine.

the following code's while executes a few more times than it's supposed to. what gives?

#include <stdio.h>

void maxMin(int *vector, int *max, int *min)
{
    *min = *vector;
    *max = *vector;

    while(*vector)
    {
        printf("%d \n", *vector);
        if (*vector > *max)
        {
            *max = *vector;
        }
            if (*vector < *min)
            {
                *min = *vector;
            }

        vector++;
    }
}

int main()
{
    int vector[] = {-1, 2, 3, 4}, min, max;
    maxMin(&vector[0], &max, &min);
    printf("%d, %d\n", max, min);

    return 0;
}

you have a loop that relies on the array being zero-terminated.
you do not actually zero-terminate it.
you therefore run past the array until you find a 0 accidentally.
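To make that concrete, a minimal sketch (C, using the same array as the code being discussed):

#include <stdio.h>

int main(void)
{
    int vector[] = {-1, 2, 3, 4};
    size_t n = sizeof vector / sizeof vector[0];   /* exactly 4 elements, no hidden 0 */

    printf("the array has %zu elements\n", n);
    /* a while (*p) scan keeps dereferencing vector[4], vector[5], ... -- that is
       undefined behaviour, and it only stops when it happens to land on a 0 */
    return 0;
}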

Python is better fucking retard

Sure you can; it pushes a binary through the browser and runs it. Fuck the DOM, it's obsolete under wasm.

But again, it's an MVP. Lots of issues right now, but not forever; Mozilla and others are pushing for webasm to have native resource access, and it's working. Soon all app stores will be extinct, you'll just get a binary pushed to you through the browser.

Attached: peste_noire.jpg (225x225, 12K)

What does while(*vector) even mean? Do you ensure a terminating 0? Probably not.
Why is the min check inside the max check?

the min/max thing is just shitty indentation.

does it not also stop at a NULL byte? it's what we were taught today. my professor couldn't figure it out and said that it was probably to do with gcc's version lol. what can I do to 0-terminate it?

>Applications on the web should be like applications on desktop this shit is retarded
yes that's why we built electron

>NULL
a pointer to guaranteed-invalid memory. basically 0.
>byte
an 8-bit value. Not relevant to your array of ints, or to your dereference of an int*
>does it not also stop at a NULL byte?
of course not. It's not testing bytes. It's testing ints.
C string literals have an implicit 0 terminator. No C array literal does.

C style character strings have a terminating null. Other arrays dont.

you can 0-terminate it by putting a 0 at the end of your array literal.
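A minimal sketch of that fix, assuming a 0 never appears in the real data (see the objection below):

int vector[] = {-1, 2, 3, 4, 0};   /* explicit 0 sentinel; while (*vector) now stops here */
/* maxMin itself stays unchanged */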

C style strings are 0-terminated by convention (or standard?), but why would an int array be terminated by 0? 0 is an absolutely fine integer.

You should probably just pass the length of the array as a parameter.
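A hedged sketch of that version (names are illustrative; assumes the array has at least one element):

#include <stdio.h>

void maxMin(const int *vector, size_t n, int *max, int *min)
{
    *min = *max = vector[0];
    for (size_t i = 1; i < n; i++) {        /* no sentinel needed: the length is known */
        if (vector[i] > *max) *max = vector[i];
        if (vector[i] < *min) *min = vector[i];
    }
}

int main(void)
{
    int vector[] = {-1, 2, 3, 4}, min, max;
    maxMin(vector, sizeof vector / sizeof vector[0], &max, &min);
    printf("%d, %d\n", max, min);           /* prints 4, -1 */
    return 0;
}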

oh, forgot to add that. it's not meant to make sense I guess, our professor just wanted to make sure that we understood how pointers and memory addresses work, so he asked us to make a function that receives a vector and two variables and rewrites their values with the minimum and maximum values inside the vector

I'm not sure if I follow to be completely honest. will book chapters about pointers from the cover those memory concepts? it's what I'm about to do, since he

but that would only work if I were to change the array, right? I don't think we were supposed to make modifications to the array here, just iterate it

u will need that null terminator if u want to use anything like `strlen`, which is used by a lot of the standard library.

yes that requires changing the array. Since it's not terminated and since you aren't given the length, there is no way to loop over it except by hard-coding the length.

ah, it was also an exercise of learning alternatives to that

about pointers cover* i'm tired sorry

well optimistically your professor gave you a broken task so that you could discover that it is broken.
pessimistically your professor is not aware that he gave you a broken task.

I am not a C programmer, but I think C and C++ work similarly here.
if you say something like
const char* text = "Hello";

it will automatically insert the 0 (null) terminator.
But with other arrays it doesn't work like that. Think about it logically.

If you had an array of integers (whole numbers) and then you have a 0 in that array, why would it stop? 0 is a whole number that you use all the time. 0 as a terminator for character strings kind of makes sense, but it does not for more math-related types like int. How do you want to do math without 0?
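A quick illustration of the difference (sizes assume a 4-byte int, which is typical but not guaranteed):

char s[] = "Hi";        /* 3 bytes: 'H', 'i', '\0' -- the terminator is added for you  */
int  a[] = {1, 2, 3};   /* 3 ints:  1, 2, 3        -- no extra 0 element is appended   */
/* sizeof s == 3, sizeof a == 3 * sizeof(int) */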

>I'm not sure if I follow to be completely honest. will book chapters about pointers from the cover those memory concepts? it's what I'm about to do, since he

Pointer in this context pretty much means array.
A pointer is an arrow to a point in memory that holds the important values. And then you point the pointer to the next position, in other words you go to the next array element.

But your code, logically, pretty much says: if you find a 0 at pointer (array) position x, just break, that memory is crap anyway.
And that is true for C-style strings, but not for any array. C just copies the numbers that it is told to copy; after the array that you defined is random memory. Basically there are bytes that Fortnite used in that memory location, and C will not overwrite them with anything. Maybe you think everything that you didn't personally initialize will be 0? Not really, it will be random. That's how the machine works.
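In code, "pointing the pointer to the next element" is just pointer arithmetic; a small sketch with an explicit length so the loop knows where to stop:

#include <stdio.h>

int main(void)
{
    int vector[] = {-1, 2, 3, 4};
    size_t n = sizeof vector / sizeof vector[0];

    for (size_t i = 0; i < n; i++)            /* index form */
        printf("%d\n", vector[i]);

    for (const int *p = vector; p < vector + n; p++)
        printf("%d\n", *p);                   /* pointer form: p++ moves to the next element */

    return 0;
}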

*ahem*
C++ doesn't have these issues

imagine if arrays in C stored the length instead of using null terminators and having to pass the length in every function where it's used
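That idea is easy to roll yourself; a minimal sketch (the struct name is made up):

#include <stdio.h>

/* a "fat" array that carries its own length */
struct int_array {
    size_t len;
    int   *data;
};

int main(void)
{
    int values[] = {-1, 2, 3, 4};
    struct int_array a = { sizeof values / sizeof values[0], values };

    for (size_t i = 0; i < a.len; i++)   /* no sentinel and no separate length parameter */
        printf("%d\n", a.data[i]);
    return 0;
}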

I'm finally learning some intermediate Haskell. It's amazing how when I need to change the structure of a program everything seems to fall perfectly into place. If I forget to handle a new case somewhere the type system warns me that my code is wrong, and tells me exactly what I missed.

Attached: sedandawk.jpg (1280x720, 125K)

wtf.
#include <cstdio>
#include <vector>

void maxMin(int *vector, int *max, int *min) {
    *min = *vector;
    *max = *vector;

    while (*vector) {
        printf("%d \n", *vector);
        if (*vector > *max) {
            *max = *vector;
        }
        else if (*vector < *min) {
            *min = *vector;
        }
        vector++;
    }
}

auto main() -> int {
    std::vector<int> vector = {-1, 2, 3, 4};
    int min, max;
    maxMin(&vector[0], &max, &min);
    printf("%d, %d\n", max, min);
}
surely this only accidentally works on my machine.
surely there's not a guaranteed 0 after the contents of a vector.

They do if you know your malloc implementation or roll your own.

Attached: animu.png (149x148, 58K)
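Spelled out, the "roll your own" idea from that post might look something like this (a sketch only; real code would also worry about alignment and overflow):

#include <stdio.h>
#include <stdlib.h>

/* allocate n ints and stash the length just in front of the returned data */
static int *alloc_ints(size_t n)
{
    size_t *p = malloc(sizeof(size_t) + n * sizeof(int));
    if (!p) return NULL;
    *p = n;
    return (int *)(p + 1);
}

static size_t ints_len(const int *data) { return ((const size_t *)data)[-1]; }
static void   free_ints(int *data)      { free((size_t *)data - 1); }

int main(void)
{
    int *a = alloc_ints(4);
    if (!a) return 1;
    printf("length: %zu\n", ints_len(a));   /* 4, recovered without any sentinel */
    free_ints(a);
    return 0;
}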

ppl that don't get webasm: you don't need a "DOM". It's obsolete. webasm pushes a binary that runs on your local machine. There's no 'document' tree.

Just wait until we can only buy Amazon Dumb Terminals, and all our programs are pushed through a browser with native access to all local resources. Every Amazon DumbBox will run proprietary signed CPU code too, so you can't modify it.

>changing a std::vector to int*
Y I K E S
C programmers everyone.

>intermediate Haskell
>I love this thing from SML
Robin Milner bless you, child.
You'll know when you've reached 'intermediate Haskell' when you realize that you've completely reintroduced one of the problems that FP-style programming was intended to avoid.

Attached: Milner.gif (480x480, 108K)

it could also be that I didn't interpret what he said very well. but I'm afraid that you could be right. either way, he said he's not teaching us C and the exercise was just to help us tell how well we know it by ourselves, but if he really doesn't know what's behind all this then I'm afraid he won't turn out to be the best Data Structures teacher

this is what I came across just now:
>Please note that 0 in the above C statement is used in pointer-context and it’s different from 0 as integer. This is one of the reasons why the usage of NULL is preferred because it makes it explicit in code that programmer is using null pointer, not integer 0. Another important concept about NULL is that “NULL expands to an implementation-defined null pointer constant”. This statement is also from C11 clause 7.19. It means that internal representation of the null pointer could be non-zero bit pattern to convey NULL pointer. That’s why NULL always needn’t be internally represented as all zeros bit pattern. A compiler implementation can choose to represent “null pointer constant” as a bit pattern for all 1s or anything else. But again, as a C programmer, we needn’t worry much on the internal value of the null pointer unless we are involved in Compiler coding or even below the level of coding. Having said so, typically NULL is represented as all bits set to 0 only.
so I think I will keep reading this until I have complete knowledge of what is what. I appreciate your help; I do think I'm close to fully understanding it, I'm just not knowledgeable enough to be sure I know what difference this paragraph is referring to

this is the result of gcc compilation. not cpp though

Attached: Screenshot from 2019-08-23 22-20-39.png (1366x768, 112K)

You're gay.

Python fixed Lisp's stupid syntax and provides a massive standard library.

I am far from an expert at Python, but I have done a couple of semi-serious projects in the language and will try to recall specifically what I didn't like.

- Everything you write will be open source. No FASLs, DLLs or EXEs. The developer may want control over the level of access to prevent exposure of internal implementation, as it may contain proprietary code or because strict interface/implementation decomposition is required. Python third-party library licensing is overly complex. Licenses like MIT allow you to create derived works as long as you maintain attribution; GNU GPL and other 'viral' licenses don't allow derived works without inheriting the same license. To get the benefits of an open-source culture you also inherit the complexities of licensing hell.
- Installation mentality: Python has inherited the idea that libraries should be installed, so it is in fact designed to work inside unix package management, which carries a fair amount of baggage (library version issues) and reduced portability. Of course it is possible to package libraries with your application, but it's not conventional and can be hard to deploy as a desktop app due to cross-platform issues, language version, etc. Open source projects generally don't care about Windows; most open source developers use Linux because "Windows sucks".
- Probably the biggest practical problem with Python is that there's no well-defined API that doesn't change. This makes life easier for Guido and tough on everybody else. That's the real cause of Python's "version hell".

- The Global Interpreter Lock (GIL) is a significant barrier to concurrency. Due to signaling with a CPU-bound thread, it can cause a slowdown even on a single processor. The reason for employing the GIL in Python is to ease the integration of C/C++ libraries. Additionally, the CPython interpreter code is not thread-safe, so the only way other threads can do useful work is if they are in some C/C++ routine, which must be thread-safe.
- Python (like most other scripting languages) does not require variables to be declared, as (let ((x 123)) ...) in Lisp or int x = 123 in C/C++. This means that Python can't even detect a trivial typo - it will produce a program which will keep working for hours until it reaches the typo - THEN go boom and you lose all unsaved data. Local and global scopes are unintuitive. Having variables leak after a for-loop can definitely be confusing. Worse, binding of loop indices can be very confusing; e.g. "for a in list: result.append(lambda: fcn(a))" probably won't do what you think it would. Why the nonlocal/global/auto-local scope nonsense?
- Python indulges messy horizontal code (> 80 chars per line), where in Lisp one would use "let" to break the computation into manageable pieces. Get used to things like self.convertId([(name, uidutil.getId(obj)) for name, obj in container.items() if IContainer.isInstance(obj)])
- Crippled support for functional programming. Python's lambda is limited to a single expression and doesn't allow statements. Python makes a distinction between expressions and statements, and does not automatically return the last expression, thus crippling lambdas even more. Assignments are not expressions. Some useful higher-order functions (reduce, for one) were moved off into functools in Python 3. No continuations or even tail call optimization: "I don't like reading code that was written by someone trying to use tail recursion." --Guido

- Python has a faulty package system. Type time.sleep=4 instead of time.sleep(4) and you've just clobbered the sleep function for every module in the process, with a trivial typo. Now consider accidentally assigning some method to time.sleep, and you won't even get a runtime error - just very hard to trace behavior. And sleep is only one example; it's just as easy to override ANYTHING.
- Python's comprehension syntax, borrowed from the SETL language and mathematical set theory, is non-uniform and hard to understand and parse, compared to simpler languages like Lisp, Smalltalk, Nial and Factor. Instead of the usual "fold" and "map" functions, Python pushes "comprehension" syntax, which has an overwhelmingly large collection of underlying linguistic and notational conventions, each with its own variable binding semantics. Generating Python code automatically, or driving it from a CLI, is hard due to the so-called "off-side" indentation rule (aka Forced Indentation of Code), also found in the math-intensive Haskell language. This, in effect, makes Python look like an overengineered toy for math geeks. Good luck discerning [f(z) for y in x for z in gen(y) if pred(z)] from [f(z) if pred(z) for z in gen(y) for y in x]
- Python hides logical connectives in a pile of other symbols: try seeing the "and" in "if y > 0 or new_width > width and new_height > height or x < 0".
- Quite quirky: triple-quoted strings seem like a syntax decision from a David Lynch movie, and double underscores, like __init__, seem appropriate in C but not in a language that provides list comprehensions. There are better ways to mark certain features as internal or special than calling them __feature__. self everywhere can make you feel like OO was bolted on, even though it wasn't.

- Python has too many confusing, non-orthogonal features: mutable objects can't be used as hash keys; expressions in default arguments are evaluated when the function is defined, not when it's called. Why have both dictionaries and objects? Why have both types and duck typing? Why is there a ":" in the syntax if it almost always has a newline after it? The Python language reference devotes a whole sub-chapter each to "Emulating container types", "Emulating callable objects", "Emulating numeric types", "Emulating sequences" etc. -- only because arrays, sequences etc. are "special" in Python.
- Python's memory management is based on naive reference counting, which is slow; CPython bolts a cycle detector on top, but you should still expect subtle memory leaks when cycles and finalizers mix. In effect Python complicates even simple tasks, like keeping a directory tree with symlinks.
- Patterns and anti-patterns are signs of deficiencies inherent in the language. In Python, concatenating strings in a loop is considered an anti-pattern merely because the popular implementation is incapable of producing good code in such a case. The intractability or impossibility of static analysis in Python makes such optimizations difficult or impossible.
- Problems with arithmetic: no real Numerical Tower (rationals are exiled to the fractions module, complex numbers are a bolted-on type), and in Python 2, 1/2 produces 0 instead of 0.5, leading to subtle and dangerous errors (Python 3 finally made / true division).
- Poor UTF support for a long time, and unicode string handling is somewhat awkward (the 2-to-3 split largely revolved around it).
- No outstanding feature that makes the language, like the brevity of APL or the macros of Lisp. Python doesn't really give us anything that wasn't there long ago in Lisp and Smalltalk.

Those are mostly windows problems though. I don't even touch that os anymore thanks to 95% of my Steam games being Linux compatible now.

this is what happens when you "want the easy path" and build enterprise software on top of it
C/C++/Java/Rust might be a pain in the ass to deal with, but at least you have more control over what the fuck you are doing
this is why Go is so good: it's a good balance between messy loose languages and the tight (and often cumbersome) rules of low-level languages.

Python is more easy path than Go and therefore still better. Python and JavaScript are just so low effort they will always win out.

Do you guys know any good Intel Assembly peephole optimizers, or am I better off just writing my own?

>Python is more easy path than Go and therefore still better
better in what sense? i don't get using Python for anything other than prototypes, because there's so little control over what you can do with what you currently have
it's only good for making things that aren't there yet quickly, but once you have it... you just have to deal with its quirks and how slow it is (relative to other options)

Shut up retard

Python is fast if you're not slow

Disagree, Go is prob the 'easiest' lang going.
Python is junk and always has been, just look at their joke updates from python2 to 3 that broke their entire library ecosystem. It's peak clown language.

Anybody reading this who knows nothing about programming: learn a functional language. It's even easier than Go. It looks hard at first with all the weird syntax and the silly category theory names for parameterizing an interface (module), but honestly it's the simplest way to program something, and if it's strictly typed and compiles, you're good to go: no piles of tests or writing entire oracles (programs solely for testing) like Golang and Python and other languages require.

not an argument
python is a flawed language geared toward the lowest common denominator, nothing more

Python's "quirk" compared to other languages is that it usually works as expected which is ideal. Performance doesn't really matter for the vast majority of programs because Python still finishes faster than a human can detect. Python saves me the developer time and effort compared to everything else and is therefore my favorite programming language.

>a programming languages named after a disability
no thanx

>Python
>2019 identity: "a language with a massive standard library"
>Python
>related in any way to Lisp
utterly disgusting. Norvig's going to hell for spreading such lies.

This. Inexperienced programmers rely on the C family's speed because they write shitty O(n^2) algorithms.

Everyone take note: This is what an effective critique looks like.

>>Python has a faulty package system
indeed
>No outstanding feature
Its relative readability and brevity were an outstanding feature for early adopters. Today it may be more about the libraries.

Attached: clapping.png (480x480, 162K)

No, fuck you, you're wrong.

Attached: id_kill_you_to_save_python_nigger.jpg (474x474, 26K)

I thought it was fucking stupid myself for a number of reasons. Clearly the faggot fuck face who posted this hasn't had any industry experience what-so-ever, and obviously he's just a Windows fanboy who couldn't get his dick sucked if he held a gun to their child.

you know
people will write shit code, then act like the language they wrote it in is to blame for why it's unreadable.
The language may have contributed, but it's not the language's fault. you get this on both sides of the C vs Rust thing.

>This is what an effective critique looks like.
That's because he's a Go shill and jewgle wrote all that for him.

It's long winded and fucking retarded

LANGUAGE!!! That's rood!!!

Sometimes people are naughty, get over it.

Attached: sexygirl.jpg (706x1022, 119K)

set colorcolumn=80, do you use it?

Please stop trying to make python programmers look stupid.

Python is already being replaced by US schools because it's probably the absolute worst language to teach programming with. Evidence: this paper describing all the problems with Python, cs.brown.edu/~sk/Publications/Papers/Published/pmmwplck-python-full-monty/ Getting scope analysis right in Python is a PhD exercise, it's that difficult.

TA's have to spend hours helping students with their shitty Python environments too, and even after all that you can't do basic compsci things in Python, like induction, without there being some idiotic language implementation detail in the way of properly reasoning about a simple program.

Evac and nuke it from orbit, it's the only way to be sure. Any language can just FFI a Python library and run an interpreter, leaving out all the retarded cuckery that's actually inside regular Python.

I think he's funny.

sounds similar to the feelings that motivated the creation of Elm

What's a better alternative?

HOW THICK ARE YOU?
IT'S NOT COMPLEX. EITHER STORE LENGTH SOMEWHERE, OR PUT A SPECIAL THING AT THE END OF YOUR ARRAY THAT SAYS "THIS IS THE END". IT DOESN'T JUST KNOW.

Shut up pussy nigger

>replaced
with what? pray tell

aaaa! u sed hte n wurd!!! haths against the code of condoms! mods, get him!

nigger

Good languages like Brainfuck. Kid's gotta learn to be a man and shove the milestone dick of turing completeness up his ass instead of making useless things like cryptography key generators.

Pyret. It jacked the only good part of Python (the simple syntax; everything else is junk), runs right in your browser, and teaches an OCaml style of programming. Here, learn some: papl.cs.brown.edu/2018/index.html

in python how do i read a file and delete lines where the length is less than a certain length?

what are you even doing? why not just use indexes instead of doing pointer arithmetic like a sane person

Open the file and stream it line by line with whatever Python uses to stream files.

kept = []
for line in f:                    # f / min_length come from your own code
    if len(line) < min_length:
        continue                  # too short: drop it
    kept.append(line)             # long enough: keep it and keep reading

You owe one million dollars

expert

pointer arithmetic is cuter desu

do you qualify for neetbux with how retarded you are?

well it looks like it's a bit too complicated for him, what with him being retarded and all

I made a one-liner just for you.
delshortlines = lambda fname, length : '\n'.join([l for l in open(fname).read().split('\n') if len(l) >= length])

print(delshortlines('test.txt', 3))

u retards ofc i know the logic im asking what the functions r to delete

Got basic functionality for my anime torrent manager working

Attached: Screen Shot 2019-08-23 at 10.58.15 PM censored.png (2880x1800, 1.29M)

Why is everyone who hates on Python always a meme-lang shill?

ok so it's down syndrome, same difference
it's almost like a person with at least an IQ of 70 would just google what the functions are

ok google it and tell me if u can find it smartass
u cant even do this kys

you forgot pointer to beginning and pointer to end