This language is a fucking joke

- It has gotten lots of incompetent normies into """programming""" and now they are trying to shove it down everyone's throat, including fucking embedded shit like microcontrollers
- It's dynamically typed and hence complete cancer, because you can't tell whether something is an object, a primitive type or a function
- Because it's dynamically typed, passing arguments is a huge pain in the ass, since you can't know what type of variable you have to pass to make the function work! If you're writing a library, you have to use "type hints"??? WTF?
- "Pythonic" code is a bad joke where you write so fucking much in so few lines that it takes years to read somebody else's work. Compare it with something more verbose like Java or C++ where you can get an idea of someone's code after skimming through it a couple of times
- It uses fucking INDENTATION to denote nesting, because braces were "too ugly". So now if you want to rewrite your code to look a little bit differently, well turns out, YOU CAN'T! You have this shit shoved down your throat
- It won't fucking run if you mix tabs and spaces, because it's too fucking hard for the interpreter to sort it out
- It's slow as shit, even JVM languages beat it by fucking MILES
- The fucking 2.7 vs 3 debacle, which happened 10 YEARS AGO and people are STILL arguing on whether "print" is a statement or method

There's probably loads more shit if I sat down and learned it properly, but why the fuck would I do this to myself? I'm much better off writing something in C# than this bullshit.
I can't fucking stand it when I'm writing something in a proper language and some retard will come along and tell me "oh hurr durr why not write it in PYTHON? it's sooooo much better"

Attached: python-7be70baaac.png (512x512, 8K)

Other urls found in this thread:

youtu.be/qCGofLIzX6g
youtube.com/watch?v=ZGyx4GuGEj4&t=76s

are there any recommended books for learning c++ or c# as a beginner?
seems most c++ materials were written prior to 11
I don't know how important it is but it seems to be brought up a lot that 11 brought important things?

C++11 changed everything. Move semantics, type inference, lambda expressions, constexpr, uniform initialization syntax - it's a completely different language.
Newer versions do add extra stuff but it's the kinda thing you can pick up in a day. It's not a complete paradigm shift like C++11 was.

are you an absolute beginner when it comes to programming?

yeah my last time attempting to learn programming was back in 2005 or so but I went off the technology path.
neat, thank you for enlightening me.

you sure are dumb

>call OP a faggot and call it done
>no arguments presented
wow your mom must be proud

Attached: 1529770027854.jpg (576x512, 214K)

the language is opinionated and its opinions are bad

>indentation

Attached: vomtiblood.jpg (600x600, 67K)

>- It has gotten lots of incompetent normies into """programming""" and now they are trying to shove it down everyone's throat, including fucking embedded shit like microcontrollers
Fair
>- It's dynamically typed and hence complete cancer, because you can't tell whether something is an object, a primitive type or a function
Everything is an object, which shows you know nothing about the language
>- Because it's dynamically typed, passing arguments is a huge pain in the ass, since you can't know what type of variable you have to pass to make the function work! If you're writing a library, you have to use "type hints"??? WTF?
Never had this problem
>- "Pythonic" code is a bad joke where you write so fucking much in so few lines that it takes years to read somebody else's work. Compare it with something more verbose like Java or C++ where you can get an idea of someone's code after skimming through it a couple of times
Pythonic doesn't mean golfed code; it in fact means readable code that follows the language's idioms
>- It uses fucking INDENTATION to denote nesting, because braces were "too ugly". So now if you want to rewrite your code to look a little bit differently, well turns out, YOU CAN'T! You have this shit shoved down your throat
Literally not a problem if you are a real programmer and properly indent your code in any language
>- It won't fucking run if you mix tabs and spaces, because it's too fucking hard for the interpreter to sort it out
It's bad practice anyway
>- It's slow as shit, even JVM languages beat it by fucking MILES
It's fine
>- The fucking 2.7 vs 3 debacle, which happened 10 YEARS AGO and people are STILL arguing on whether "print" is a statement or method
3 won, get over it

BTFO
I am based and redpilled
/thread

no you

Python is very good for stuff like scripting, automating and organizing other programs that do the "heavy lifting".
Only a fool restricts himself to a single language.

not all functions take arguments, lad

a lot of them do

>"""programming"""
why did you comment that word out?

Attached: DliNseFVAAEQaIi.jpg (1898x2048, 261K)

any of you got any advice on learning materials? I guess I should put this in the stupid thread or the programming thread, but clearly all the intelligent people are here.

but not all

>- It has gotten lots of incompetent normies into """programming""" and now they are trying to shove it down everyone's throat, including fucking embedded shit like microcontrollers
You'd say the same thing about BASIC back in the day.
>- It's dynamically typed and hence complete cancer, because you can't tell whether something is an object, a primitive type or a function
There are no primitive types in python and every object stores type information. Use type() and isinstance(), and raise TypeError when needed.
>- Because it's dynamically typed, passing arguments is a huge pain in the ass, since you can't know what type of variable you have to pass to make the function work! If you're writing a library, you have to use "type hints"??? WTF?
Type hints are exactly for that you brainlet.
>- "Pythonic" code is a bad joke where you write so fucking much in so few lines that it takes years to read somebody else's work. Compare it with something more verbose like Java or C++ where you can get an idea of someone's code after skimming through it a couple of times
Good joke. I assume you haven't seen the amazing DSLs that can be spawned through templates.
>- It uses fucking INDENTATION to denote nesting, because braces were "too ugly". So now if you want to rewrite your code to look a little bit differently, well turns out, YOU CAN'T! You have this shit shoved down your throat
Style debates waste everyone's time, and projects need a style.
>- It won't fucking run if you mix tabs and spaces, because it's too fucking hard for the interpreter to sort it out
Be consistent, your codebase SHOULD NOT have mixed indentation.
>- It's slow as shit, even JVM languages beat it by fucking MILES
Yep, if you are writing critical performant code just use Java.
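The runtime checking mentioned a few lines up (isinstance plus TypeError) looks roughly like this; the function and the accepted types are made up for illustration:

```python
def scale(value, factor):
    # Runtime check: reject wrong types early with a clear error,
    # instead of failing somewhere deeper in the call stack.
    if not isinstance(value, (int, float)):
        raise TypeError(f"value must be a number, got {type(value).__name__}")
    return value * factor

print(scale(2.0, 3))  # 6.0
```

It's not static typing, but it turns a confusing downstream failure into an error at the call boundary.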

learncpp.com

well, i don't care about the functions that don't take arguments, the fact of the matter is that functions that DO take arguments are a pain to deal with

Retarded and bluepilled

Python is DOPE u just not using it right

>quick programs
>extra time to jerk off when done programming
>acting like dynamically typed is a con?
>your life is a meme

t. babby who only knows statically typed languages

Attached: anan.jpg (600x400, 28K)

>- The fucking 2.7 vs 3 debacle, which happened 10 YEARS AGO and people are STILL arguing on whether "print" is a statement or method
This was settled years ago, and 3.4+ is the rule now. No new codebase targets 2.7.

Pretty shit rant OP, you could have brought up actual language problems but instead chose to whine about why can't you keep your shitty code style.

>There's probably loads more shit if I sat down and learned it properly, but why the fuck would I do this to myself? I'm much better off writing something in C# than this bullshit.
kek

maybe if you learned more about it you would see the benefit of some of these things you complain about, but you seem already crippled beyond saving so never mind

Attached: 1534195994167.png (645x729, 107K)

cringe but redpilled

name one thing python does better than c#

I still come across people who swear up and down that they will never give up on 2.7

that's why functions without arguments are superior.
therefore is correct

Anyone that thinks dynamic typing is a good idea in 2018 is a fucking moron. There are no exceptions and Python's new shitty type hints are just a pathetic bandaid over a gaping wound.
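For reference, the hints being argued about are just annotations on the signature; a minimal sketch (the function and names are made up):

```python
def repeat(word: str, times: int) -> str:
    # The annotations change nothing at runtime; checkers like mypy
    # and IDE autocompletion are what actually consume them.
    return word * times

print(repeat("ab", 3))  # ababab
```

Whether that counts as a "bandaid" or a real type system is exactly what this thread is fighting about.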

how can you sit here and defend dynamically typed languages when you have to call methods on the object in order to determine its type, and if something goes wrong it won't be caught at "compile" time (i.e. when the text is translated to Python bytecode, the .pyc files)

>you could have brought up actual language problems
go ahead and enlighten me then

did I say anything about c#?

dynamic typing is better than a weaksauce type system like in POOlangs

the bare minimum for a usable static type system in 2018 is HKTs

>that's why functions without arguments are superior
>implying that it's in your power whether a function will take arguments or not
>imagine being this retarded

ok so if i'm going to write a sha256 function, how can i write it so that i don't have to pass arguments to it? oh what's that? i have to write it so that it receives arguments no matter what? well, would you look at that
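The point above can be made concrete with the standard library's hashlib; `sha256_hex` is a made-up wrapper name:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    # The thing being hashed has to come from somewhere:
    # it is the argument. No input, nothing to hash.
    return hashlib.sha256(data).hexdigest()

print(sha256_hex(b"abc"))
```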

if you can't keep your variables in check in a python project you are using the language wrong, it has local variables and that's enough

> No new codebase targets 2.7

All new Python projects at Google are still using 2.7. Python 3 is only used for a few open source libraries and that's an afterthought to supporting 2.7. Even Guido is forced to write Python 2.7 code for his work at Dropbox lmao.

we're not talking about new projects, we're talking about already existing projects that were written before they fixed all the broken shit with the initial releases of python 3 (remember, it took a few years before 3 was even usable)

Simple. You just don't forget the rules to the game you're playing and debug as necessary :)

>learncpp.com
Is this well regarded? Again, I have been out of the computer loop for ~14 years and back then people considered most online tutorials to be trash. still got my copy of Accelerated C++ but apparently it's completely outdated.
thanks for the reply.

>my language is shit
>don't worry, i'll just spend more time debugging it to get my project going

Mutable default arguments, poor lambdas, type hints require if TYPE_CHECKING on circular imports.
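The mutable-default-argument footgun listed above, in a minimal sketch (function names are made up):

```python
def append_bad(item, bucket=[]):
    # The default list is created ONCE, when the def statement runs,
    # and then shared by every call that omits the argument.
    bucket.append(item)
    return bucket

def append_good(item, bucket=None):
    # Standard workaround: None as a sentinel, fresh list per call.
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_bad(1), append_bad(2))    # [1, 2] [1, 2] -- same shared list
print(append_good(1), append_good(2))  # [1] [2]
```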

I like python but 2.7 is still a real issue that has to be dealt with often.

>if TYPE_CHECKING

Holy shit Python is awful.

Google's open source projects target python 3.6 and backport to 2.7.

k.

Portability.

don't forget the beautiful
if __name__ == "__main__":
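For anyone new to it, that idiom guards script-only code so it doesn't run as a side effect of an import; a minimal sketch:

```python
def main():
    print("running as a script")

# __name__ is "__main__" only when the file is executed directly;
# when the file is imported, __name__ is the module name instead,
# so main() does not fire as an import side effect.
if __name__ == "__main__":
    main()
```

Ugly or not, it's what lets one file be both a library and a command.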

you rewrite the function, baka.

how the fuck do you write a sha256 function without allowing the end user to pass an input to it? what the fuck is it going to hash, thin air?

Aside from 'data' related roles, hobbyists and teaching, Python isn't used that widely.

The only reason Python is used in data related roles is because of its simplicity to wrap both Java and C++, giving a bridge between the two codebases that most regular programmers won't see.

There's absolutely no reason to use a language with implementations that are as slow as Python. Nowadays there are plenty of languages that are as expressive that have extremely fast implementations. If software requires Python, Ruby, the JVM or other slow bloated trash you simply shouldn't install it.

The actual worst things about python are not even on your list.
>no multithreading, use multiprocessing instead, even though it's 2 times harder to debug, makes your program use enormous amounts of memory, and you can't pass anything that's not picklable (such as lambdas)
>the import system is retarded and loves to shit itself
>community is retarded and every answer to any question you ask is bound to contain something about muh pythonic or muh dick typing
>libraries don't even use type hints so that you have to read their shitty docs, and worst of all, use magic strings as parameters instead of enums or anything else which is sane
>the community has accepted bad libraries as the gold standard (e.g. pandas, matplotlib) so nobody tries to make anything better

wasn't the committee working on 4.0, which wasn't compatible with 3, even while they were still fighting over 2 as legacy? lmao

Like?

Okay, have fun doing your data analysis in Java or C++ buddy...

>normies into programming
If you're good then competition is irrelevant.
It only makes you seem better.
>dynamically typed and hence complete cancer
Brainlets like you are cancer.
>Pythonic
It means easy to read code and promotes a similar code style.
And seriously, Python is pseudocode.
If you cannot read it then consider changing field.
>indentation
You're indenting anyway so you could as well use it to your advantage.
>won't run if you mix tabs and spaces
Then don't. Are you so thick that you cannot do that?
>slow as shit
Indeed. It is supposed to be used for scripting.
When performance is needed you're supposed to write a low level library to which you'll provide a high level interface. That's how numpy and other scientific libraries work.
This division was hypothesized by Tcl's creator, hence the name Ousterhout's Dichotomy.
>even JVM
That is not a fair comparison. The JVM is fast. A program running on the JVM is typically within a factor of 3 of C's speed.
>2.7 vs 3 debate
There is no debate. Python 3 is where everyone is at.
Support for Python 2.7 exists only because some are too incompetent to port their code.
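The "high level interface over low level code" split described above is visible even without numpy: CPython's built-in sum() is implemented in C. A rough sketch (timings will vary by machine):

```python
import timeit

data = list(range(100_000))

def py_sum(xs):
    # Same algorithm, but every iteration runs in the interpreter loop.
    total = 0
    for x in xs:
        total += x
    return total

# Built-in sum() is C under the hood, playing the role of the
# "low level library behind a high level interface" described above.
loop_t = timeit.timeit(lambda: py_sum(data), number=20)
c_t = timeit.timeit(lambda: sum(data), number=20)
print(f"interpreted loop: {loop_t:.3f}s  builtin sum: {c_t:.3f}s")
```

numpy pushes the same idea much further by keeping whole arrays on the C side.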

Basically every statically typed language in existence is at least several orders of magnitude faster and more memory efficient than Python. A better answer would depend on what you're trying to do and what platforms you need to support.

>>no multithreading
wat?

I guess something that fits most of the stuff python is used for, while also being as normalfag-friendly. It's your argument that there's no reason to use it.

You do know that all that "data analysis" gets translated into production ready code in, you guessed it, Java, C#, or C++... right? The model with tabular data is made in python and then the real programmers rewrite it in the real langs.

Really wish Python never won out over Perl. Perl is the superior language by far.

>mind blown
And that's what python is meant for. Efficient normalfag-friendly scripting that usually interacts with more efficient systems.

They're both shitty dynamically typed cluster fucks that need to be put down.

pozzed and blackpilled

ok so a couple general questions
1) how many/what are the programming languages?
2) what is each language best suited for/used for commonly?
3) what sort of programs would you need to be able to program in each language, such as compilers, debuggers, source code editors, interpreters, etc?
also please answer in this format
language > uses/used for > necessary "tools" best suited for said language
please and thank you

Doing data analysis in Python is so clunky compared with R.

Exactly. Python belongs on /sci/ with Matlab and R.

Or here, seeing how it's programming nonetheless.

>- It uses fucking INDENTATION to denote nesting, because braces were "too ugly". So now if you want to rewrite your code to look a little bit differently, well turns out, YOU CAN'T! You have this shit shoved down your throat
Indentation should never be a matter of choice. It must be mandatory. That's the only point where python is right.

It would be absolutely fine if it stayed in the realm of scripting and didn't get pushed into full on programming.

Might as well have threads on pipettes here since they're technology.

It is used for scripting (who would have guessed), console utilities, web development and user interfaces, and so is Jow Forums itself.

relevant
youtu.be/qCGofLIzX6g

That's a retarded analogy and you know it. But sure, if there was some reason to have a thread on pipettes I say go for it. As you say, they are technology.

Considering we've got watch threads pipettes won't be out of place.

watch threads should be moved to /fa/ desu

People that use dynamically typed languages shouldn't really even be considered programmers. They're just gluing libraries together that are written by their superiors. It's like having the same name for a world class chef and the local 14 year old McDonald's fry guy. In fact, it's worse because the fry guy deserves far more respect than a Python programmer.

I still like it. Fuck you

>The model with tabular data is made in Python

I wonder why

Is Jow Forums moving too fast for you or something? Why do you care if there are threads you don't like on the slowest board on the entire site?

There are like hundreds of different languages.
From the general purpose ones to the memes.
All you need to program are butterflies or emacs.

Attached: 1515761902014.png (740x406, 151K)

relevant

youtube.com/watch?v=ZGyx4GuGEj4&t=76s

> They're just gluing libraries together that are written by their superiors.
Yeah, because C, C++ or Java guys never use libraries, of course. I guess the only real programmers are the asm guys.

Because it just werks (except when it doesn't). Problem is that data scientists work with dummy data that is tabular (non-relational). They're more concerned with creating the model than actually having production ready code. So then the python mess is passed onto a systems engineer and they say "WTF is this supposed to do?" and end up rewriting most of it from scratch.

I'm pretty indifferent towards it to be honest. There is a time and place for python and in those moments it shines

Multithreading in Python gives you literally no performance increase unless you're blocked by IO, because of the global interpreter lock (i.e. they were too lazy to write the interpreter properly). In any other case you have to use multiprocessing to get any performance gain.
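A minimal sketch of the multiprocessing route for CPU-bound work (the worker function is made up); with threads this same loop would serialize on the GIL:

```python
from concurrent.futures import ProcessPoolExecutor

def busy(n):
    # CPU-bound work: under threads this would serialize on the GIL,
    # while separate processes each get their own interpreter (and
    # their own copy of the data, hence the memory cost mentioned above).
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        print(list(pool.map(busy, [100_000] * 4)))
```

Note the workers must be picklable top-level functions, which is exactly why lambdas can't be passed.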

That webcomic (whatever the fuck it's called, I can't remember) never fails in not bringing a smile to my face. It's the perfect blend of smug and predictable.

The problem with this argument is that there are large scale complicated libraries written in Java, C and C++. In Python, basically everything even remotely complicated/interesting is written in C or C++ because Python is far too slow and inefficient to write anything non-trivial.

>It has gotten lots of incompetent normies into """programming""" and now they are trying to shove it down everyone's throat, including fucking embedded shit like microcontrollers
not an argument, just tell them to fuck off or remind them all the shit they import is C

>- It's dynamically typed and hence complete cancer, because you can't tell whether something is an object, a primitive type or a function
yes you can, type(X) returns this. It's still dumb how you can change references after initialisation, I'll admit.

>- Because it's dynamically typed, passing arguments is a huge pain in the ass, since you can't know what type of variable you have to pass to make the function work! If you're writing a library, you have to use "type hints"??? WTF?
def func(var):
    if type(var) not in (var1, var2):  # list the types you accept
        raise TypeError(var)
    do.shit()
but yeah it's dumb

>- "Pythonic" code is a bad joke where you write so fucking much in so few lines that it takes years to read somebody else's work. Compare it with something more verbose like Java or C++ where you can get an idea of someone's code after skimming through it a couple of times
Ok this one is valid, no one knows what the fuck they're doing in python.

>- It uses fucking INDENTATION to denote nesting, because braces were "too ugly". So now if you want to rewrite your code to look a little bit differently, well turns out, YOU CAN'T! You have this shit shoved down your throat

>- It won't fucking run if you mix tabs and spaces, because it's too fucking hard for the interpreter to sort it out
Use an IDE instead of notepad, user; they all convert tabs to 4 spaces

Why are you using Python for the level of high performance where that matters? That's not Python's intended use.

hardcode it.
normally with other languages you would have to recompile the program, taking a long time.

python doesn't need to compile, so in the end you're saving yourself time.

> we should accept shitty performance for no good reason

My (now econ professor) friend Michael loves that comic so fucking much. He also loves Big Bang Theory and Firefly. I notice that a certain crowd loves this normie "nerd" shit.

Well, you should have made that argument instead of the shit argument you made then. Pajeet cobbling together C libraries in C or Pajeet cobbling together C libraries in Python isn't that different really.

Dynamic typing is lame. The compiler should know as much information as possible. Type checking, borrow checking, explicit trait/type class/interface instantiation, static_assert... THAT'S THE GOOD STUFF BABY

But Python isn't meant for performance. Why are you using a tool not meant for performance for performance? Were you dropped on your head as an infant? Use C or something that's intended for your use. You seem to think Python is meant to replace, or in some way contend with languages like C. It's not and only a ultra-brainlet would think so.

What even are the issues porting 2.7 to 3? The main difference that I'm aware of is the change of strings from ascii to unicode. Aside from that dicts and itertools seem to behave slightly differently.
Surely it wouldn't be too hard to refactor changes like these.

Attached: amIDisabled.png (500x375, 190K)
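The headline 2-to-3 differences asked about above are small enough to show in one snippet (Python 3 semantics; the commented lines behaved differently under 2.7):

```python
# Python 3 semantics; each commented line changed versus 2.7.
print("print is a function now")        # was a statement: print "..."
assert isinstance("text", str)          # str literals are unicode now
assert "text".encode("utf-8") == b"text"  # bytes is a separate type
assert 7 / 2 == 3.5                     # / is true division; // floors
assert 7 // 2 == 3
assert list(range(3)) == [0, 1, 2]      # range is lazy, like 2.x xrange
```

Mechanical changes like these are what the 2to3 tool automates; the unicode/bytes split is the part that usually needs real thought.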

Superior choice

Attached: vita-nuova-1.png (800x799, 37K)

When something takes an hour and is trivially made parallel, there's no reason not to do it (e.g. preprocessing data for meme learning). With 5 minutes of additional work I would get my results 50 minutes earlier. Unless the interpreter is too shitty to handle it, then enjoy multiprocessing and using literally 8 times as much memory.

>Surely it wouldn't be too hard to refactor changes like these.
It's not.