*yawn*

*yawn*
Compiling is for suckers. How do you autists stand this shit?

Attached: Untitled.png (986x718, 303K)

by having a decent machine.
my Threadripper 2990WX compiles AOSP in about 15 minutes.

Languages like sepples and rust were a fucking mistake. C programs compile way, way quicker.

What's the point of compiling from source if your CPU is already too fast to notice the gains?

>what's the point of compiling
to allow a program to be executed by the computer.

I can just grab a binary

every time I run tests at work it takes over a minute of compiling, and that's with incremental compilation on a huge cluster of machines.
I fucking hate this industry

>incremental compiling
fire your sysadmin

why

any time two different CPUs (or machines) need to communicate with each other, that communication inherently adds latency and makes the task take longer to complete.
for a task like compiling, it's best to use one machine with an obscene amount of CPU power in a single box, so it doesn't waste time talking to other machines.
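
something like this is all the "distribution" you need on one big box - a rough python sketch, not a real build system, and it assumes a C compiler named cc on PATH plus a made-up sources/ directory of .c files:

# rough sketch: fan compile jobs out across local cores only, no network involved.
# "sources/" and the cc invocation are placeholder assumptions, not a real project.
import glob
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def compile_one(src: str) -> None:
    # each job runs on this machine, so the only cost is CPU time, not latency
    obj = Path(src).with_suffix(".o")
    subprocess.run(["cc", "-O2", "-c", src, "-o", str(obj)], check=True)

# one worker per hardware thread; the threads just wait on subprocesses, so the GIL doesn't matter
with ThreadPoolExecutor(max_workers=os.cpu_count() or 1) as pool:
    list(pool.map(compile_one, glob.glob("sources/*.c")))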

by using python

It finished compiling and this Firefox runs 5x faster than the prebuilt binary. What is this hocus pocus bullshit?

the latency between the machines is negligible since they're connected directly to each other, and it's nothing compared to the cost of sequentially compiling millions of lines of macro- and template-heavy C++.
plus there is no single CPU that can replace a cluster of thousands of machines
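
quick back of the envelope (the numbers are made up but plausible, just to show the scale):

# illustrative assumptions only - not measurements from any real build farm
rtt = 0.001          # ~1 ms round trip between build machines on the same network
per_file = 5.0       # seconds to compile one heavy template/macro C++ translation unit
files = 50_000       # translation units in a big C++ codebase

one_box = files * per_file / 64                 # one 64-core machine, perfect scaling
cluster = files * (per_file + rtt) / 5_000      # 5 000 remote workers, each paying the latency

print(f"one 64-core box:      ~{one_box / 3600:.1f} hours")
print(f"5 000-worker cluster: ~{cluster / 60:.1f} minutes")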

yeah why get a better ISP when you can just compile

Attached: 1561076637041.jpg (400x400, 29K)

My ISP is fantastic, I'm talking about general performance like scrolling or hardware acceleration.

>webkit takes 24hrs to compile
Don't build bloated software

It took me 2 hours to build the entire Firefox browser with -Ofast and -march=native on an i3 530.

2 hours is too much

You only need to do this once a week max, plus it's just a shitty i3 530.

How the hell do the devs even debug that shit when a build cycle takes >15 min?
I just don't even bother building a web browser anymore and just use an older version of SeaMonkey forever, or until a new Slackware comes out.

I have no idea. On top of that, they also use 10 different Turing-complete languages. Give me a break.

>Compiling is for suckers
>being this clueless
how do you clueless fuckers even live? not everything ships precompiled for your convenience. running a makefile is not complicated.

>still using make
time to take your meds grandpa

It's not as simple as just running make. Off the top of my head, you can't build Chromium, Firefox, entire Linux distros, the Linux kernel... You need some sort of configuration step, which sucks.

> unable to understand the deep mysteries of using linux to build things
this board is truly fucked.

running make isn't hard, but writing and maintaining a proper makefile is horrible. been there, done that. just use bazel.

bazel is pretty bad as well (so is GN). Google just took their makefiles and built a build system that could support the messed-up use case of their huge projects, instead of restructuring the projects. So the bazel and GN designs end up with a lot of bloat that isn't necessary for new projects.

well.... it is a necessary evil. The only way to avoid compiling entirely is to write machine language directly - either assembly or raw 0s and 1s. Both are bad choices, so better to write your hello world in 10 characters of python than in 10,000 1s and 0s, or 100 characters of assembly.
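
the python end of that trade-off, for reference - no ahead-of-time build step at all, the interpreter does the translation every time you run it:

# save as hello.py (name is arbitrary) and run it directly with: python3 hello.py
print("hello world")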

cmake is an abomination unto this world; make works just fine

I don't compile things written in trash languages like C++.

why use build tools at all? you fuckers sure LOVE your build tools.

Well, the reason they did that is so they don't end up with ten thousand copies of the same dependency at different versions because some teams won't bother to update.
Since they have a test system running continuously, if you upgrade a dependency you have to make sure it breaks no one. And if it does, you have to fix it.
The upside is that everyone gets the same versions at the same time (so new features at the same time) and everyone can share their libraries with everyone else (instead of reinventing the wheel over and over again).
The downside(?) is that you have to be constantly testing and making sure your tests are comprehensive, since if they're not, you're going to break everyone all at once further down the line. I'm convinced a monolithic repo is the best solution for a large company with a distributed workforce and teams all doing their own thing.