Python Hate

Remainder:
Python is for brainlets, if you code in Python you are a simpleton.
The only reason for anyone to use it is if they are scared of writing anything other than pseudo code.
If you code in Python please fuck off.

Attached: 1464296835221.jpg (240x240, 4K)

You lost me at "writing pseudo code"...

As a scripting language it's a good tool though.

you know, back when I started and I was a naive little first-semester cs-let, I used to think that everything was bloated, and that everything could be done better with low-level languages. I mean, you (and I back then) weren't wrong, but the reality of the matter is that man-hours are much more expensive than compute hours. Over time, I learned to accept and appreciate abstraction, and recognized the salt of adolescent programmers as exactly that: bitterness at their inability to quickly build good-enough software that gets the job done every time. Instead, they distract from their ineptitude by philosophizing about how this and that is bloat, and how programs could be optimized by integrating every module directly, when in fact the gains would be marginal at best.

As any adolescent, you're not gonna listen to my purported wisdom anyway, but just consider the following: computers these days are so ridiculously fast that in 99% of all cases you only need to care about time complexity; everything else is pretty much irrelevant. Your hate on high-level languages and constructs is misguided, purely because you probably misunderstand the underlying problem that software is trying to solve: it's not about telling the computer how to solve a problem. It's about asking the computer to solve a problem. There's a nuance, and it will become increasingly pronounced over time.

Python is a very good language for short quick programs where performance isn't a major issue.

>talking about brainlets
>remainder

>Hating tools
lmao

Attached: 1517612137639.gif (400x225, 1.96M)

Reminder:
Python bashers are brainlets who can't even spell "reminder" correctly.

"Remainder" is what you get when you use the % operator.

I call the output of the modulo function "the modulo", as in "you take the modulo x of y"

people seem to know what I'm talking about, so should I use the word "remainder"? I think the modulo and the remainder are slightly different in certain edge cases.

the operation is the modulo, not the result

the operation is division, my friend.

Attached: Screenshot_20180608-005351.png (1080x1920, 555K)

Modulo is correct. Remainder is used in division but it's the same thing.

I'm fine with Python hate except
>muh whitespace
People who can't even C-x h TAB need not apply

Attached: 1525760211477.png (417x578, 195K)

But muh jython though

My main beef with Python is that working with others' code tends to be a huge pain. Part of it is the "import everything" trend, along with people's tendencies to not explain their code well.

I end up not being surprised that there are few/almost no authoritative guides to Python but, instead, a collection of singularly-focused, amateurishly-written (that is, amateur relative to expository/academic writing) guides that instill a "Python is fun/useful now!, and your peers who are using/learning it beside you love it now!, and oh yeah, don't worry about what'll happen in the next few months/years when you or someone else has to refactor this."

Holy shit, this. Developer time is expensive. This should be apparent to anyone who actually works in development.

Also "Python is for brainlets" is a meme. A couple of the smartest guys I know (both PhDs, one an engineer and the other a microbiologist) use Python frequently. Sometimes the engineer uses C to get certain tasks done as quickly as possible, but the high level of abstraction allows them to focus their energies on actually solving real-world problems.

Performance is only half of the problem. Python just not being a good language is the other half.

Call me crazy, but Python and Ruby really look a lot like bastardized Elisp at times. Ruby moreso

Attached: 1524609998742.png (769x1111, 308K)

What makes it a bad language?

It's dynamically typed, for one. How are you going to be confident that you don't have bugs if your types aren't checked until execution?

what kind of moron is confident that he doesn't have bugs

Someone who actually knows what they're doing

static typing matters far more for optimization than it does for bug-safety. Implicit type conversion however is dangerous, see JSFuck.

python 3 has type hinting now
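For reference, a minimal sketch of what "type hinting" buys you (the function name add is made up for illustration; the hints are checked by an external tool like mypy, not by the interpreter):

```python
# Type hints are plain annotations: a checker like mypy reads them,
# but the CPython interpreter ignores them entirely at runtime.
def add(x: int, y: int) -> int:
    return x + y

print(add(2, 3))      # 5
print(add("a", "b"))  # "ab" -- still runs; only a static checker complains
```

So hints give you the tooling benefits of static types without changing runtime behavior, which is exactly why both sides of this argument can claim to be right.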

No

What the fuck are you talking about? Optimization? I'm talking about language features here, not performance.

It isn't, though. There are compilers that compile faster to binary code and then load that to memory.

TROLL LINE
anyone posting under this line has been fucking trolled

============================


and above the line too

t. undergrad

there is nothing intrinsically wrong with dynamic typing

shit man, memory lane much?

Attached: 605.jpg (512x512, 47K)

Yes there is, for a multitude of reasons.

Um, assert all the types if it's that important to you. [spoiler]It isn't [/spoiler]
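"Assert all the types" might look like this in practice (a sketch; the function scale and its messages are invented for illustration):

```python
# Emulating a static type check with runtime asserts: the checks only
# fire when the function is actually called with bad arguments.
def scale(values, k):
    assert isinstance(values, list), "values must be a list"
    assert isinstance(k, (int, float)), "k must be a number"
    return [k * v for v in values]

print(scale([1, 2, 3], 2))  # [2, 4, 6]
```

Note this is exactly the "poorly emulating a real type system" pattern criticized elsewhere in the thread: the errors surface at runtime, not before.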

sounds like you ain't doing much then

I wonder what that makes javascriptlets.

> elisp
God no.

Any serious program can be expected to have some bugs, many of which can be caught by a compiler or by analysis tools like lint or valgrind. If you wrote a bug-free fizzbuzz on the first try, congratulations, but that isn't the same as a large project coordinated with a team of other programmers.

Python's a massive piece of shit.
>Dynamic typing
>Slow as fuck
>Indent-scoping
>GIL shit

Python is actually a great language

>GIL shit
Have you actually run into issues with the GIL, or are you just reciting points you found in a blog?

Just write unittests for everything.
Also I'm not sure what the correlation is between dynamic typing and bugs.

And this is why we have shitty Electron apps that run like shit

Just trained a CNN with 13.8 million parameters in Python and achieved 93.32% accuracy classifying images into 10 different classes. You mad, OP?

A good type system can guarantee certain classes of bugs will be caught at compile time.

You could write unit tests and asserts to ensure your typing is sane, but what you're really doing by writing these tests is poorly emulating a real type system. If you take responsibility for your types with a standard approach then automated tools can check your work much more effectively, and refactoring tools can perform far more complex manipulations because they can have higher confidence in the intent of your code.

Strong typing is an incredible boon when writing and maintaining large systems.

I don't like how python implements OOP but besides that it's an ok language. I don't like it when people use it for medium to large sized projects. But for short scripts or proof of concept type projects it's decent.

This, but Ruby is my love, so.

If Python is for brainlets
then solve this:
S = "abcdefgh"
change "d" to "x"

unit tests with ~100% coverage

I have a rocket scientist uncle that regularly codes in Python.

C is great for device drivers and anything that needs speed (encryption for example). Most things are I/O limited and in those instances python will almost always be an acceptable choice in language.

Python is easy to read, easy to debug and easy to code in. It will get the job done faster and with less headaches in the majority of situations. If all you care about is "ERR LOOK HOW SMRT I AM I CODES IN ASSMEBEMLER FOR EVARYBING" then nobody is ever going to want to work with you and you will fail in the real world.

s = "abcdefgh"
s = s.replace('d', 'x')
print(s)

LOL

Attached: laughing lemon4.png (600x580, 792K)

and here it is in C "code", you're welcome lad
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;string.h&gt;

/* count non-overlapping occurrences of findstr in string */
static int str_count(const char* string, const char* findstr)
{
    int count = 0;
    const char* p = string;
    while ((p = strstr(p, findstr)) != NULL) {
        count++;
        p += strlen(findstr);
    }
    return count;
}

char* str_replace(char* string, char* findstr, char* replaceWith, int replaceAll)
{
    int findLen = strlen(findstr);
    int replaceLen = strlen(replaceWith);
    int count = str_count(string, findstr);
    if (count == 0)
        return string;
    if (replaceAll == 0)
        count = 1; /* clamp before sizing, or the buffer can come up short */
    char *ret = calloc(1, strlen(string) + (replaceLen - findLen) * count + 1);
    if (ret == NULL) {
        puts("calloc error");
        return "err";
    }
    char *tmp = ret;
    do {
        char *insertPos = strstr(string, findstr);
        int prevLen = insertPos - string;
        memcpy(tmp, string, prevLen);         /* copy the text before the match */
        tmp += prevLen;
        memcpy(tmp, replaceWith, replaceLen); /* splice in the replacement */
        tmp += replaceLen;
        string += prevLen + findLen;          /* skip past the match */
    } while (--count);
    strcpy(tmp, string); /* the rest of the string */

    return ret;
}

s[3] = 'x'

fixed

You mean s = list(s); s[3] = 'x'; s = ''.join(s). Strings are immutable, mate.
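Spelled out, the two usual workarounds for immutable strings (index 3 is where 'd' sits in "abcdefgh"):

```python
s = "abcdefgh"
# Strings are immutable, so s[3] = 'x' raises TypeError.
# Option 1: slice-and-concat builds a new string around the change.
via_slice = s[:3] + "x" + s[4:]
# Option 2: go through a mutable list, then join back.
chars = list(s)
chars[3] = "x"
via_join = "".join(chars)
print(via_slice)  # abcxefgh
print(via_join)   # abcxefgh
```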

you have to pick the right tool for the task.
python is often but not always your guy.

what is bash

Python is decent, especially if you're going to write a lot of code that you're going to use only once and then discard it (machine learning, prototyping).
There are some annoying things about the language:
>dynamic typing, partially solved with type annotations
>the clusterfuck that is the import system
>only one line lambdas
>lackluster enums
>no function overloading
>having to write self every time
>pip fails to install libraries about 50% of the time
>retarded way of calculating default parameters
But those are just mildly annoying. The worst thing is the community surrounding it, full of some kind of autistic cultists. Every time you ask a question and the answer contains "pythonic" or "unpythonic", the answer is pretty much worthless. And while duck typing isn't bad per se, if an answer contains muh dick typing, it's often retarded anyway, because they're somehow obsessed with it.
There are also multiple straight up retarded conventions. How do you check that an iteration does not have any more elements? You can't, you have to catch a StopIteration exception because exceptions are apparently the best way of controlling the flow of your program. Sure, this kind of approach sometimes makes sense (if the condition can change between checking it and execution), but usually it's just retarded.
And even though there are type annotations, creators of libraries don't bother to actually use them. And stringly typed code is pretty much the norm, even though enums existed even in Python 2. I guess writing clean and usable code is unpythonic.
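On the StopIteration complaint: there is actually a sentinel escape hatch built into next(), so the try/except isn't mandatory. A minimal sketch:

```python
it = iter([1, 2])
# Iteration normally ends by raising StopIteration; passing a default
# (sentinel) value to next() avoids the exception-as-control-flow dance.
print(next(it, None))  # 1
print(next(it, None))  # 2
print(next(it, None))  # None -- exhausted, no exception raised
```

The caveat is that the sentinel must be a value that cannot legitimately appear in the stream, which is presumably why the exception is the general-purpose convention.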

You can write a short quick program with many other languages.

>string concatenation in a loop does quadratic work
>people defend this

Better upgrade to 32GB, bud.
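For anyone who doesn't get the concatenation complaint: repeated += copies the whole accumulated string on each step, while ''.join() sizes the result once and copies each piece once. A small sketch of the two patterns:

```python
pieces = [str(i) for i in range(1000)]

# Quadratic pattern: each += may copy everything built so far.
slow = ""
for p in pieces:
    slow += p

# Linear pattern: join computes the total size, then copies each piece once.
fast = "".join(pieces)

print(slow == fast)  # True -- same result, very different scaling
```

(CPython does have an in-place optimization that often hides this, but it's an implementation detail you shouldn't rely on.)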

>Python is for brainlets
And that's a good thing. Thanks to python I can write simple scripts and use it instead of R for my statistical needs.

Attached: 15276937231412-b.jpg (750x1331, 96K)

Are you angry because you spent hours debugging your faggy fizzbuzz to fix memory leaks while pythonistas write their enterprise tier software in simple english and get paid? :^)

Can you raise me pls

>computers these days are so ridiculously fast that in 99% of all cases you only need to care about time complexity
This is a load of bollocks. I wrote a script to do some simple translation tasks in ruby. It was slow as shit. I rewrote the innermost loop in c and it performed literally 10 or more times faster.

Scripting + C is the masterrace desu.

>reminder
>python is popular
>therefore it sucks
Translated OP for the lot of you to save you time

Python is shit, OP is right about this one.

OP is shit, I'm right about this one

>Muh runtime
You don't need blazing speed to run scripts; Python does the job fine. Anyone who suggests that we always need to use compiled languages is no better than the brainlets who insist on programming exclusively in JS.

>It's dynamically typed, for one.
It's also strongly typed you dumb faggot. Stop talking about shit you don't understand. It's like you can't even have a single thread without some faggot larping

Why did I not hear of this before? I am going to convert the JS on one of my clients' site to JSFuck now to see how it goes.

Attached: Screenshot_20180608-111955.png (720x1280, 105K)

Those languages aren't the right tool for the job.

I write software for SpaceX test infrastructure

In python

And then you join the true master race that is rewriting slow functions in the Python C API and script with those reusable components from Python.

>So dumb he doesn't even understand the use case for Python

>to be a REAL programmer you MUST waste time fixing segfaults for 7 hours and MUST waste as much time debugging as possible because then you'll be a REAL programmer!!!11!1!!

can confirm
pic related for proof

this

Attached: just-use-shell.png (942x952, 103K)

I'm just wondering who thought all that fucking whitespace is readable at all

>replacing all d with x
No shit you are writing in Python.
C:
s[3]='x';
>s = ''.join(s)
>''.
waaaaat

uh honey no, people with actual knowledge of how computers work and an understanding of C don't have segfaults. Using anything but C and ASM makes you a programmer, not a software architect

Some retards who use Python on a MacBook looking to impress their hipster girlfriend.

being simple to use is not a negative you dumb faggot
python has plenty of negatives but ease of use is not one

That's the last sentence in his post u spastics

Confirmed retard

Interpreter is not a library
Half baked asyncio
GIL
No functional
Dynamic typing
import typing
Startup time
Half assed OOP

These come off the top of my head.
Perl and Lua have been engineered better

fuck you brainlet

Truth be told, GIL sucks if you want to work with CPU-bound threads. There are ways to get around it by utilizing multiprocessing, but... Well, it's a workaround, and it still shows some serious limitations in Python.
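The multiprocessing workaround in question looks roughly like this (a sketch; the worker function burn is made up for illustration):

```python
# CPU-bound threads all contend for the GIL in CPython, so the usual
# workaround is processes: each Pool worker gets its own interpreter
# and its own GIL, letting the work run on separate cores.
from multiprocessing import Pool

def burn(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(4) as pool:
        results = pool.map(burn, [100_000] * 4)
    print(results)
```

The cost is that arguments and results are pickled across process boundaries, which is why this pays off for chunky CPU-bound tasks but not for fine-grained ones.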

>u spastics
>

Attached: Selection_20180608_16:48:20.png (428x102, 12K)

Gtfo summer vac kids

Attached: Selection_20180608_16:52:47.png (364x129, 11K)

Attached: Selection_20180608_16:53:55.png (365x97, 9K)

Maybe he meant to write
"GTFO of summer vacation, kids."
instead of
"GTFO, summer vacation kids."
We'll never know.

This is why the center booster crashed.

based

Python is best for its use case (cross platform scripting)

don't thread on me

Attached: opengraph-icon-200x200.png (200x200, 8K)

>Assuming retards and lolis of Jow Forums know how to code

t. butthurt brainlet who can't comprehend that different jobs require different tools

>>replacing all d with x
s = s.replace("d", "x", 1)

Attached: 1514601290785.jpg (438x499, 36K)

[tool] is for brainlets
brainlet detected