PYTHON IS FUCKING GARBAGE

I just found out why my multithreaded Python program was only being pegged to one core. Each thread has to fight over the Global Interpreter Lock, because only one thread at a time can access the interpreter's objects/memory, which effectively makes your program singlethreaded.

Why the fuck would the developers put this in if it's not real!?

I knew I should of just wrote it in C.

The developers need to be taken out back and beaten with a billy club, cause clearly their mamas didn't raise them right.

PROTIP: Stay the fuck away from this horrible steaming pile of shit they call Python.

If you disagree with me, you can go fuck your dad.

Attached: main-qimg-28cadbd02699c25a88e5c78d73c7babc.png (602x577, 40K)

Other urls found in this thread:

stackoverflow.com/questions/3044580/multiprocessing-vs-threading-python
cython.readthedocs.io/en/latest/src/userguide/parallelism.html
news.ycombinator.com/item?id=1033411
pastebin.com/bs5QKdkb
pastebin.com/VHWAqeQz
pastebin.com/anccZ3ZU
twitter.com/AnonBabble

>python is fucking garbage

Gee fuck a language that makes whitespace matter is a shitshow.

WHO COULD HAVE THUNK IT

Attached: Code_2019-04-18_15-02-24.jpg (1921x1159, 214K)

Python is a very good language when performance doesn't matter

No one asked you pajeet

but bro, it JUST WERKS

true but multithreading is a meme anyway. use processes it just werks

I have a conspiracy theory that Python is promoted by HP, Dell, Supermicro and Huawei so shops have a reason to buy their 8 socket, 100TB RAM monstrosities for millions.

This is a well known limitation of the python interpreter and it’s present in many other scripting languages (like lua) too.
You can work around it with the multiprocessing library, but that’s only practical for jobs you can batch. The code executed in C libraries can also run in parallel, which is why Python is tolerable for data science.
The GIL shouldn’t be an issue if you use Python for what it was intended to be: a scripting language. It only became a problem when people started trying to write servers and games in the language.
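For the batch case, something like this works (a minimal sketch; `burn_cpu` is just a stand-in for whatever OP's threads were doing):

```python
from multiprocessing import Pool

def burn_cpu(n):
    # Pure-Python CPU work: with threads this serializes on the GIL,
    # but each multiprocessing worker is a separate interpreter with
    # its own GIL, so the four jobs really do run on four cores.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(burn_cpu, [100_000] * 4)
    print(results)
```

The tradeoff, as the post says: arguments and results have to be pickled across process boundaries, so this only pays off for jobs you can batch.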

i dont know people who use C

go back to plebbit

so? developer time is much more expensive

you seem to not realize that the one on the far left is javascript.

Your fault for using Python to begin with
I've already seen videos about this exact issue like 5+ years ago
It's simply a joke/hipster language

>

The threading module uses threads, the multiprocessing module uses processes. The difference is that threads run in the same memory space, while processes have separate memory. This makes it a bit harder to share objects between processes with multiprocessing. Since threads use the same memory, precautions have to be taken or two threads will write to the same memory at the same time. This is what the global interpreter lock is for.
source: stackoverflow.com/questions/3044580/multiprocessing-vs-threading-python
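The "precautions" part is easy to demo. A quick sketch: two threads bumping a shared counter through a lock. Even with the GIL, `counter += 1` is several bytecodes, so without the `with lock:` the two threads can interleave between the read and the store and lose updates:

```python
import threading

counter = 0
lock = threading.Lock()

def bump(times):
    global counter
    for _ in range(times):
        # Lock around the read-modify-write so the threads
        # can't interleave between the load and the store.
        with lock:
            counter += 1

threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 200000
```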

use right tool for right purpose, brainlet.

Pretty much all popular scripting languages have a GIL or an equivalent, though I admit they drive me up the fucking wall. Python threads are good for IO bound stuff and stuff that spends a lot of time in C modules. Other than that, check out the "multiprocessing" module for real concurrency with Python.

What's sad is, I liked perl and ruby a lot better than python, but python's just so easy that it won out, and now the other languages are dying a slow and painful death.

all hail js
all hail node
the savior of software engineering

youtube.com/watch?time_continue=30&v=dDRDova2uro&ab_channel=Dynomite54

terry tried to warn us of our future mistakes, we didnt listen

cython.readthedocs.io/en/latest/src/userguide/parallelism.html

yeah i'm looking at python on the right

looool

over the years people have invented many solutions to GVR queerness

in your case, if the problem is mathematical in nature, just grab numpy with a multithreaded backend,
vectorize your algo and use numba
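For example (a sketch; whether the backend actually threads depends on your numpy build), the same reduction done in Python bytecode vs. one vectorized numpy call:

```python
import numpy as np

xs = np.arange(1_000, dtype=np.int64)

# Python-level loop: every multiply/add runs as interpreter bytecode under the GIL.
slow = sum(int(x) * int(x) for x in xs)

# Vectorized: a single call into compiled C, which releases the GIL and,
# with a multithreaded backend, can use several cores.
fast = int(np.dot(xs, xs))

print(slow == fast)  # True
```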

Threads in most languages are assumed to be capable of executing in parallel, and it's left up to the programmer to handle the complications.
Multiprocessing is less efficient than threading because a process context switch requires remapping memory, while a thread context switch does not.

did you read the last line?
>use right tool for right job
if python doesn't work for your project, use language which does
programming language is just a tool, hating on a language because it can't do what you want to do for your project is retarded

or, u know, just use numba.

If you use python for anything else than scripting shit then you're doing it wrong.

Lua doesn't *really* have a GIL
See this thread: news.ycombinator.com/item?id=1033411
Also, Lua has excellent coroutines

But they always make me diamonds where else can i see some animations?
Pretty much anything Studio Ghibli or Miyazaki is going to BD's thread to annoy him some more Unlabelled pics of bitches and pot.

those all give the wrong answer
what the fuck

This. Don't use python to write multithreaded programs that need to take advantage of multiple cores. Use something like Go or C.

KEK

>purposely write code in such a way that the language does poorly for (You)s on a mongolian throat singing forum
limit = 10_000

def prime(n):
    # Returns n if n is prime, 0 otherwise, so results can be summed directly.
    for i in range(2, n):
        if n % i == 0:
            return 0
    return n

end = sum(map(prime, range(2, limit)))

print(f'sum of primes below {limit} = {end}')

Coroutines provide concurrency, but not parallelism. Python provides a similar construct now with async.
The workaround in Lua is more efficient than Python's, but it's still not quite the same as parallel multithreading.

guido is rolling in his grave

He's dead?

Ya FP killed him. Now he's a zombie working on static typing for the language.

You're right. This algorithm was used in a previous thread comparing js to c. I just copied the algorithm without modifying it to not change the test.

This was the thread.

Also your algo is shit. You should only loop up to sqrt(n). Funny to see Python haters unable to write 15 decent lines of code.

Other thing you should consider: how long to write your program? If it took you 40 seconds less in Python, then it's the winner. Your program is intended to be run once, so who cares if it takes more time to run?

Actually I'm not sure the algorithm is incorrect.

I wrote none of the code. I just ran it to compare the times.

Also I can write code just as fast with javascript which runs much faster.

:bigthink:

dipshit

What's the point of comparing badly written programs? Python is not made for number crunching, that is well known. With your program, testing if 10000 is a prime number leads to 9900 useless comparisons, since there is no point going past sqrt(10000).

Long story short, before trashing any language, learn how to design efficient algorithms.

>That guy who doesn't know how to make his python multithreaded

But the other language implementations do the same thing, changing the python implementation in a way that's better for python would defeat the purpose of checking how it plays out using that specific algorithm. So at least with that algorithm, python is slow.

Btw running that with pypy reduced it to ~4 seconds, much faster. Unfortunately pypy is not compatible with some libs.

Yes fellow user, a programming language created by a Netherlander is without a doubt a jewish conspiracy and being promoted by the big jewish owned corporations

Attached: 1547101423689.jpg (400x365, 17K)

What kind of retard uses Python for something CPU intensive?

I already write and prefer to write in C, but a coworker advised me that rewriting our shit python codebase into C was a waste of time and to multithread it to speed it up.

faggot who started this thread

All languages have their specificities. If you want a benchmark, then you need to compare programs correctly written in each language, not simply compare an algorithm.

Here is an example of 3 algorithms to compute the sum of the first 10000 prime numbers: pastebin.com/bs5QKdkb

One is the dumb implementation, another is smarter by going up to n/2, and the third one is the smartest by going up to sqrt(n).

On my machine:
is_prime_shit; Total: 496165411; Time spent: 0:00:18.667423
is_prime_half; Total: 496165411; Time spent: 0:00:09.712150
is_prime_sqrt; Total: 496165411; Time spent: 0:00:00.119127

0.1 seconds to perform this task efficiently. Even if C is still much faster (around milliseconds), does it still make sense to choose C over Python? If you need to run this task millions of times, yes. If you run it once in a while (like once every 10 seconds), then no.
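The pastebins may rot, but the sqrt version is trivial to reconstruct. A sketch that reproduces the same total as above (helper names are mine, not necessarily the pastebin's):

```python
import math

def is_prime_sqrt(n):
    # A composite n must have a divisor <= sqrt(n), so stop the loop there.
    if n < 2:
        return False
    for i in range(2, math.isqrt(n) + 1):
        if n % i == 0:
            return False
    return True

def first_primes_sum(count):
    # Sum of the first `count` primes, matching the totals above.
    total, found, n = 0, 0, 2
    while found < count:
        if is_prime_sqrt(n):
            total += n
            found += 1
        n += 1
    return total

print(first_primes_sum(10_000))  # 496165411
```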

Python is designed for literal retards, don't use it if you want to do something other than abuse libraries. Lisp is better
It's not so outlandish of a theory when you consider that Sun, HP, SGI and similar LOVED UNIX and X.org; no better excuse to sell more hardware than the software getting shittier and heavier every second

you do realize it is a deliberately bad implementation, just to test the running time of the languages, right?

>not implementing a multithreaded sieve with multiple wheels and L2 cache optimization

>Why the fuck would the developers put this in if it's not real!?
To facilitate intelkike shilling about "muh single core"

>lisp is better
see, i always see this claim being thrown around yet not a single example of industrial strength lisp can be found.

>should of
SHOULD HAVE
You actual retard

Lol Python. How the fuck are you going to get a job with that? Learn C++ you absolute morons.

You do realize benchmarking shitty algorithms is not relevant, right?

w-we could be friends if you wanted anonkun :3

who are you? do you matter?

Ugh, don't be a dense fucking moron. Yes, benchmarking shitty algorithms *is* relevant because you are comparing the performance of a *particular* algorithm. The point is that, in real world applications, you are going to come across computations that are very similar to the shitty algorithms being benchmarked. Therefore you would want to compare using many different shitty algorithms. I'm repeating this in order to get it through your thick skull, because seemingly you have never taken a CS course or done much programming in an academic or professional setting.

Now that i think about it

>NodeJS is faster
>NodeJS has almost the same libraries as Python
>NodeJs has static typing with TypeScript
>NodeJs is easy as fuck like Python

Why don't sciencemonkeys use Node instead?

Attached: 1551615793926.png (273x326, 193K)

Yeah. Python is a language for the retards who are the reason why our modern supercomputers perform worse than the shitty machines of the 90s.
>Who cares about performance, who cares about memory usage, all I care about is my developer experience!

they have had Julia for months now and haven't switched to it.
Hell, people over here still use fucking Matlab of all things..

They are simply monkeys who just want the computer to spit out results of their models, and not waste time learning anything that goes out of their comfort zone they acquired in their undergraduate years.

>all I care about is my developer experience!
typical MEMEMEMEMEME! millenial.
they are used to mom & dad making everything easy for them

Python isn't supposed to be used as your single language; I am pretty sure it's actually decent when paired with C or C++, and easy to do.

What the fuck is Julia

basically a faster matlab.

I use assembler, jump + move can do everything

No reason to switch to Julia if all of your libraries are still in Python.

I use R and it is miles ahead of any other language when it comes to statistical analysis. get fucked pyjeets and self proclaimed """""""""""""""""""""""""""""""""""""""""""data scientists""""""""""""""""""""""""""

>Being on a programming board
>Using shit like R

At least Python is universal, but R is just matshit for codelets. If you are here, you could probably implement what you do in R in a normal language

So your point is that you should start with an efficient language because people are going to write shitty algorithms anyway? Funny that this works in Python but not in C:

pastebin.com/VHWAqeQz
pastebin.com/anccZ3ZU

The second one is force-killed after a few seconds, I guess you know why. However, the first one runs smoothly for hours. Does that mean C is shittier than Python, then?

Or maybe your point is that it's better to change the language rather than improving the algorithm?

Then dude, I think you have never set foot in the professional world, because it's never gonna work this way.

Professionally, I worked with C, C++, Cobol, Java, Python and JavaScript in academia and the private sector for more than 10 years, and there is one common thing among all my jobs: the programming language was NEVER the problem.

Shitty algorithms (O(n²)) were the problem.
Bad designs/architectures were the problem.
I/O rates were the problem.
Inefficient SQL queries were the problem.

Poor choice of language? No, never the problem.

>not knowing difference between multithreading and multiprocessing in puthon

Yikes

This niggs thinks he can write his own implementation of MCMC

>muh beloved puthon can do all
shit of all trades, master of none.

>codelet
might be, but if I wanted to learn a programming language I would go for a real language like C or C++, and not waste my time with garbage that causes problems like in the OP.

Attached: PCMR.png (438x632, 10K)

Attached: pythonlogo.png (1152x1150, 191K)

wtf i love python now

>tfw creating a Raspi Zero + e-ink screen project that needs to poll a CalDAV server once a minute for changes
>tfw the two tools used for polling the server and creating the calendar output are written in python because there was no obvious alternative
>tfw the Raspi Zero spends ten seconds every minute at 100% CPU and runs at a toasty 45°C at all times, meaning that I likely won't be able to wall-mount it without damaging the glue on my wallpaper

I fucking despise python. I tried using nuitka to compile this shit into a binary, but the Pi Zero actually doesn't have enough RAM to compile all of the modules into a standalone executable. And compiling just the original script does barely anything, because python is a dependency shitshow where a 5kb script needs to pull in hundreds of megabytes of dependencies and interpret them dynamically to make a fucking web request.
And the Zero uses a v6 ARM, while the more powerful Raspis that I own have v7 ARM, so I would need to actually cross-compile this shit to make it work.

>using raspberry pi for anything else than putting it between your ass cheeks and posting pictures on the internet

Attached: 1490541342964.png (112x112, 7K)

Use Node.js

Or try to understand the bottleneck since you might face the same with Node.js...

When something goes wrong, the first step is to understand WHY it goes wrong, not restart everything from scratch expecting it's gonna work better.

Translation:
I'm too dumb to write anything and the only thing I can find is a retarded python module.
And now this pajeet is mad at the language.
The entire point of the rpi is that it's a shitty little computer for cheap.
Of course it doesn't run everything, especially not poorly written python code.

The thing is, Node (and pretty much everything else) is much better optimized than Python, so maybe it will work without completely burning the device's CPU

The Dutch are the biggest non-jew Jews in the world.

How many Jow Forums users are actually Markov Chains or simple neural network outputs?

No rust test

Attached: 1507176855244.jpg (1600x900, 464K)

>which effectively makes your program singlethreaded.
no, its threading is trash, but it's not effectively single-threaded.
you realise that multithreading exists even on a single core, right?
your threads simply timeshare
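and the GIL gets dropped around blocking calls, which is why threads still help for IO-bound stuff. a sketch: two sleeps overlap instead of stacking:

```python
import threading
import time

def wait(seconds):
    # time.sleep releases the GIL while blocking,
    # so both threads can be asleep at the same time.
    time.sleep(seconds)

start = time.monotonic()
threads = [threading.Thread(target=wait, args=(0.2,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start
print(f"two 0.2s sleeps took {elapsed:.2f}s")  # ~0.2s, not 0.4s
```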

>should of
Learn English first.

I'm not going to argue about which one is better optimized, but what user is trying to achieve doesn't seem to require many resources. He is not doing CPU-intensive computations, just some basic network requests. Any language should do the job on a Rpi Zero.

Maybe user is simply making stupid requests, like trying to fetch the entire calendar at once: that would give the same shitty results on Node.js. Starting from there, the first step is to understand why it goes wrong. It's quite easy to spot where your program spends most of its time with tools like PyFlame or py-spy. The second step is to understand why those pieces of code are resource-hungry, and optimize them.

I bet you didn't even use the multiprocessing module.
Fucking amateur.

A Common Lisp implementation is both more concise and faster.
You don't have to compromise.

Multithreading in programming languages that don't have anything from functional programming is fucking trash anyway

This, but unironically.

Attached: 1533960301582.png (1075x1518, 1.34M)

> python shills lied, I cried
It sucks it went that far, I use python but already knew about this. Take this as a lesson on why you can't trust freetards/lefties.

Attached: earthlings cant compete.jpg (786x458, 41K)

>slow language is slow
The problem here isn't that python exists, but rather you miscalculating your project requirements. Here's a hint: if the device costs less than $50 it's probably not the fastest thing in the world. In other words it's your fault for not using a faster language (which is a fucking huge selection).

>wah I dont know what a green thread is and why it could be useful

Threading in a scripting language is hardly likely to be useful. Just spawn more interpreters and pass back messages.

Surprised you made it that far. I got to the part where I learned switch statements didn’t exist and alt+f4’d so fast. Meme language.