Perl vs Python

Which one will be better for:
1. System automation and administration
2. Networking
3. Web scraping
I've heard a lot that python is more modern and has many more libraries (modules)?

Attached: Python-vs-Perl.jpg (420x280, 13K)

I haven't used perl personally, but python looks comfy

My time to shine, as I am an expert at both and use both daily.

Python, but not for the reason you think.

Perl gives you 1500 ways to do everything, with no strict guidance on how to do them. This inevitably means that your code will look different from other people's code. It is very difficult to pick up someone else's project and work on it.

Python gives you a lot of flexibility, but not nearly so much. If people just follow the big PEPs you'll be fine.


It's worth mentioning that one of the reasons Java remains popular is that it really forces a specific coding style that everyone shares.

Attached: zycrssws8xj21.jpg (2741x3070, 716K)

I made this thread yesterday. What I like about Perl, at least from what I've read so far, is that Perl version 1 code will still run with your Perl 5 interpreter, or at least it should.

Run a Python2 script with Python3, and you're going to have a bad time. npm install a year old nodejs project, and you're going to have a bad time.
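
A concrete example of the bad time (python2's print statement is gone in python3; file name made up):

# hello.py -- runs fine under python2, SyntaxError under python3
print "hello"

# the python3 spelling, which python2.7 also accepts
print("hello")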

Common Lisp

Perl is dying, python is going to save you shittons of time.
>web scraping
neither of them applies. If you want to do modern web scraping, you should learn pyppeteer. If you go with Selenium, you will get some results but will always struggle with modern websites.
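
For reference, a minimal pyppeteer sketch (the url is a placeholder):

import asyncio
from pyppeteer import launch  # pip install pyppeteer

async def main():
    browser = await launch(headless=True)
    page = await browser.newPage()
    await page.goto('https://example.com')  # placeholder url
    html = await page.content()  # the DOM after the js has run
    await browser.close()
    print(html)

asyncio.get_event_loop().run_until_complete(main())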

*puppeteer
written in javascript. Pyppeteer is a dead python fork of it.

>Perl vs Python
>Which one will be better
LUA. Harder, Better, Faster, Stronger.
Smaller.
/thread

>1.System automation and administration
perl
>2.Networking
perl
>3.Web scraping
python (scrapy)

does perl even exist anymore?

t. "JavaScript ninja"

Scala

nah, python "data scientist".

they are both old, deprecated dropout bullshit that are more trouble than they're worth

>>web scraping
>none of them apply.
Oh I fucking disagree, you don't need a chromium instance in the background, you just need python + requests and to be clever about how you recreate the request flow to scrape what you need.
I work at a company specializing in data scraping, and I've yet to see a website that absolutely needs Selenium to work with. Sure, it's easier, but then you'd need quite a monstrous infrastructure if you wanted to scrape millions of pages a day.
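
To be clear about what "recreating the request flow" means, a rough sketch (domain, endpoint and field names are all made up):

import requests

# one Session keeps cookies across requests, like a browser would
s = requests.Session()
s.headers.update({'User-Agent': 'Mozilla/5.0'})

# step 1: hit the landing page so the server sets its cookies
s.get('https://shop.example.com/')

# step 2: call the same json endpoint the page's own js calls
r = s.get('https://shop.example.com/api/products', params={'page': 1})
for item in r.json():
    print(item['name'], item['price'])  # assumed field names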

>python + requests
how are you going to scrape js-only websites, let me ask?

What's the best linux distro for web scraping and data science work?

Please go to a js-only website and open dev tools (inspector). Tell me what you see, user.

fuck.
now there's all kinds of hacker shit on my mac. will the fbi come for me now?

Directly from the API the js-only website uses to pull the data; it's even easier because you don't have to deal with html.

how is that related? How are you going to scrape dynamically fetched data, data that gets pulled in only through javascript in the webpage? Will you dig through the source of each script to see where it comes from and then hack whatever hashes/nonces/defenses they have?

You also can't get around firewalls like the one udemy has, made specifically to detect web scrapers and parsers.

>js-only website
the fucks that?

I don't even know. I assumed he meant a react/angular website, where viewing the source code is just a bunch of bullshit.

As much as I hate python, python is better than perl 99% of the time

The only time I really use perl is when I need to execute shell commands regularly. Python can do it too; it's just easier and faster in perl. I could also just write a bash script, but bash scripting is shit.
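
For comparison, the python side of "execute shell commands", stdlib only (perl's backticks do the same in one token):

import subprocess

# perl: my $out = `ls -la`;
out = subprocess.run(['ls', '-la'], capture_output=True, text=True)
print(out.stdout)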

>the source code is just a bunch of bullshit
the current state of webdev right there
as was foretold

a bunch of JS

Perl

Attached: boomersme.jpg (550x545, 31K)

you -> firewall -> server
The firewall is a webpage that runs some checks via javascript: it inspects some hardcoded variables in your browser, your screen size, whether your user-agent was faked, and then decides how human you are. If you have javascript disabled, you're told you need javascript enabled to visit the page. If you aren't human enough, you get a google captcha in your face.
Big daddy websites have that kind of firewall, and selenium/PhantomJS get detected easy af. Cloudflare's firewall has the same mechanism, but cloudflare is easy to bypass.
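
Which is why a faked user-agent alone gives you away: everything else you send has to match the browser you claim to be. A sketch of a consistent header set for requests (values are illustrative, not a bypass):

import requests

headers = {
    # the rest of the headers must plausibly match this browser
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:68.0) Gecko/20100101 Firefox/68.0',
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Language': 'en-US,en;q=0.5',
}
r = requests.get('https://example.com', headers=headers)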

this

>1.System automation and administration
haskell
>2.Networking
haskell
>3.Web scraping
haskell

Attached: 1562064798149.png (1000x1300, 786K)

As someone who loves Bash scripting and hates Python, should I look into Perl?

Perl isn't even in the question these days. This would have been more apt in 2006.

how do you hate python but love bash? they're basically the same shit

are you high or something?

Mainly the forced indentation and the syntax in general, but also smaller things like the missing case/switch statement and the extra layer needed to call external tools.

are bioperl/pdl used anywhere?

>Perl gives you 1500 ways to do everything, with no strict guidance on how to do them. This inevitably means that your code will look different from other people's code. It is very difficult to pick up someone else's project and work on it.
>It's worth mentioning that one of the reasons Java remains popular is that it really forces a specific coding style that everyone shares.

Not pythonic

>I work at a company specializing in data scraping, and I've yet to see a website that absolutely needs Selenium to work with.
What do the clients ask for?

perl vs. awk do you mean

this is how professional web scraping is done.
browsers should be used only in extreme cases.

i would like to see an example of a site with protection that can actually detect selenium or headless chrome

how can you like bash and hate python
holy shit people are weird. Just install pycharm bro it will change your mind

>lua
>faster than anything

I still have some bioperl scripts running, but it's basically abandonware. Biopython is very solid now, and even as a perl lover I kind of prefer it these days.

maybe that user has only used it for pandoc filters. There, lua has an advantage because it works on the AST directly instead of round-tripping it through JSON the way python filters do.

Bump

By not being an idiot and seeing which ajax calls carry the data I'm interested in, or whether it's defined in some included/inline JS object, and extracting that data from the code as JSON. Have you read what I wrote?
It's all about recreating all the necessary requests and having good XML/JSON parsers. Much less resource-consuming than constantly driving Selenium/Puppeteer (and usually better; Selenium sends some data that bot blockers pick up easily).
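
The inline-JS-object case looks roughly like this (window.__INITIAL_STATE__ is a common convention, not a guarantee; domain is made up):

import json
import re
import requests

html = requests.get('https://shop.example.com/item/123').text

# lots of js-rendered sites embed their initial data as one json blob
m = re.search(r'window\.__INITIAL_STATE__\s*=\s*(\{.*?\});', html, re.DOTALL)
if m:
    data = json.loads(m.group(1))
    print(list(data.keys()))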

Usually prices for various items from e-shops and marketplaces (Amazon, eBay etc.), and we're also expanding into other stuff (stock info for brick-and-mortar shops, banner position monitoring etc.).

>1.System automation and administration
POSIX Shell and utilities
>2.Networking
POSIX Shell and utilities
>3.Web scraping
POSIX Shell and utilities together with curl, jq, xsltproc.

1. Perl
2. Perl
3. Perl

Are they asking for data from day x or a live feed?

Thx btw

Attached: C__Data_Users_DefApps_AppData_INTERNETEXPLORER_Temp_Saved Images_1561060734013m.jpg (682x1024, 89K)

>how can you like bash and hate python
Let me just quickly output the json response of a website's API to a file

Bash:
curl api_call > temp.json

Python:
import urllib.request
import json

answer = urllib.request.urlopen(api_call).read()
answer = json.loads(answer)
with open('temp.json', 'w') as f:
    json.dump(answer, f)


Mind you, I realize Bash's limitations, but if I can use it, then I definitely prefer it over Python.

anything you brainlet

now manipulate that response by adding, removing and modifying key-value pairs and write it back to a file. that's when you'd use python. to dump an output to a file, by all means use bash.
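
i.e. something like this (file names and keys made up):

import json

with open('temp.json') as f:
    data = json.load(f)

data['fetched_at'] = '2019-07-04'          # add a key
data.pop('irrelevant_field', None)         # remove one
data['count'] = data.get('count', 0) + 1   # modify one

with open('out.json', 'w') as f:
    json.dump(data, f, indent=2)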

you can also make your python script print to stdout and pipe it to a file, exactly like with curl.

Attached: 0000.jpg (512x564, 41K)

she looks like Tanner Mayes

I want to learn Perl just because I like regex :)

Yeah, bioperl has good database manipulation and PDL is excellent for physics.

That's when you use jq

>using js for automation

Attached: 1434179922692.jpg (674x673, 63K)

How does one learn that? I know basic scraping but i have no idea how to look at js functions

you don't look at javascript in most cases
you just look at the http communication
just press f12 in the browser; that's usually all you need. Sometimes it's good to use a proxy like Burp Suite for complicated sites like facebook, where you can search through all the requests in a better way
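
Concretely: find the request in the Network tab, then replay it yourself (url, params and header below are placeholders):

import requests

# copied by hand out of the devtools Network tab
url = 'https://example.com/ajax/search'
params = {'q': 'foo', 'page': '2'}
headers = {'X-Requested-With': 'XMLHttpRequest'}  # if the site checks for it

print(requests.get(url, params=params, headers=headers).json())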

JavaScript is based language tho

>jq
>js
u wot

jq is written in glorious C though.

>this whole post

Attached: haskell-chan.png (1920x1600, 697K)

It's not that rare. I do test automation, and my client has a bot detection service, Akamai I believe, which easily detects most bots. I have to use a special http header to bypass it.

In this pic yes but with tits

Rose Monroe

Currently make 190k/yr and write copious amounts of perl. Lol

>System automation and administration
Shell scripts. Bash doesn't change; python libraries do, and they cause the worst dependency hell of any language
>Networking
Dunno
>Web scraping
Shell scripts. pup is a great HTML parser written for shell scripts. It takes way less code to do basic scraping with bash than with python

i thought he misspelled js because Jow Forums is full of twink web developers. looked up jq and i suppose it's alright, but why use that when most linux distros come with python by default, which can do that and much more?

>why parse JSON with a DSL specifically for parsing JSON instead of loading it into a python interpreter and writing out an entire script to do what you want

it's F A S T E R

>LUA
zero libraries

What's the best language to make a bot interact with a website? Selenium fucking sucked the last time I used it.

this desu