Readability vs. Performance

I just finished the Clean Code book and the author seems to claim that easy to understand code is better than fast code.
I believe using tricks to improve performance is ok as long as you document them properly.
What do you think?

Attached: 51oXyW8WQwL._SX387_BO1204203200_.jpg (389x499, 33K)

Readability should always be favored over performance, except in cases where code is performance-critical. In that case, you should clearly document what your code is doing.

Now, cue 1000 butthurt Jow Forumseysers telling me that everything should be written in C and be super optimized

It entirely depends on what you're doing.
If you're doing embedded systems programming for aerospace shit, then you should obviously favour performance.
If you're doing enterprise web shit then you know another 100 pajeets will already destroy performance so you should favour clean and readable code that any other retard could modify easily.

C is slower than C++ btw, raging autists like suckless faggots are deluded college dropouts.

/thread

Performance is overrated. Most applications do not need to be well optimized, because even in slow-as-molasses languages like Python, individual operations still take only microseconds. Write code that is understandable, but don't be reckless and start putting O(n^2) shit everywhere.
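To put the O(n^2) thing in concrete terms, here's a toy Python sketch (made-up example, nothing from the book): same dedupe logic twice, but the list version rescans everything it has seen on every iteration while the set version doesn't. Both read fine; only one survives a big input.

def dedupe_quadratic(items):
    # O(n^2): "x not in seen" rescans the whole list on every iteration
    seen, out = [], []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    # O(n): same shape, same readability, but set lookups are O(1) on average
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out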

You can make understandable code perform but good luck doing it the other way around.

It's hard to know what needs optimization early on. You don't want to optimize code you might delete.

Sometimes you could spend time optimizing a tight loop only to later realize you can cache the whole thing anyway.
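Rough Python sketch of that trap (fib here is just a stand-in for any pure function you'd be tempted to hand-tune): once the readable version is memoized, whatever you squeezed out of the loop stops mattering for repeated calls.

from functools import lru_cache

def fib_tuned(n):
    # the hand-tuned loop you might keep micro-optimizing
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

@lru_cache(maxsize=None)
def fib_cached(n):
    # readable recursive version; the cache makes repeat calls near-free
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

assert fib_tuned(30) == fib_cached(30) == 832040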

I think it has a lot to do with the type of work you do. I work on business software using mostly C# and SQL. For us, performance isn't so critical that we need to pay attention to how many milliseconds of difference there is between two implementations. Additionally, .NET compilers and the SQL Server query optimizer are usually smart enough that I don't need to worry about micro-optimizations. You also have to take into consideration the fact that programmers' time is often costly if you work in a business - so costly that in my situation it actually makes sense from a business perspective to write readable code over performant code (assuming the difference is small, which it more often than not is), because users won't notice a 10 ms difference, but if it saves a total of just a couple of hours of work, it's a smart decision to choose readability over performance. Every time you have to work on spaghetti code, you usually have to re-familiarize yourself with it if you haven't looked at it in a few days, which is a waste of time you wouldn't have with code whose purpose is immediately clear from reading it.

Obviously if I was writing industrial control software in C or C++ where saving milliseconds or even microseconds is worth the added code complexity, it'd be a totally different conversation.

gcc -O3

readability traded for performance, problem solved

This. And it makes retarded algorithm puzzles in interviews especially hilarious. 90% of coding skill is being able to write large software that doesn't become completely unmaintainable at a few thousand lines.

>90% of coding skill is being able to write large software that doesn't become completely unmaintainable at a few thousand lines.
this times a trillion

This. Webdev has already proven that the monkeys who use most software don't even care about performance. And there is undoubtedly more demand for consumer-facing software in this world than for backend/systems work.

100%

Very often, people don't even have a good understanding of what is fast or slow anyway, especially when writing in high level languages.

Sure, don't be an idiot who uses exponential algorithms everywhere, but apart from that you should prioritize readability.

The one exception is when writing library code that many other people will use without necessarily knowing what it's doing under the hood. Hell has a special place for people who write extremely slow libraries. The APIs you expose should make it easy to avoid calling things that are algorithmically expensive.
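For example, something along these lines (hypothetical Python library, every name made up, and the backend is assumed to expose a query(ids) -> dict method): if the only entry point is a per-item lookup, callers will inevitably hammer it in a loop, so hand them a batch call that makes the cheap path the obvious one.

from typing import Iterable

class AccountStore:
    def __init__(self, backend):
        self._backend = backend  # assumed to expose query(ids) -> dict

    def get(self, account_id: str) -> dict:
        # fine for a one-off; an expensive round trip if called in a loop
        return self._backend.query([account_id])[account_id]

    def get_many(self, account_ids: Iterable[str]) -> dict:
        # one round trip for the whole batch
        return self._backend.query(list(account_ids))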

Attached: klossy.jpg (869x1776, 226K)

I run into the absurdly slow lib issue when working with anything written by Oracle-heavy Java developers. It shouldn't take four hours to crunch 10,000 accounts through a billing system wtf.

It's only 90% of the work if you're not completely crippled by an inability to solve simple problems quickly.

This. Regardless of how fast Java the language is, Java + ecosystem is often extremely slow, because of developers who don't care about performance. This often ends up being more important than language performance in typical use.

maybe you haven't written big software yet

These. At work, I have the unfortunate pleasure of maintaining a 30-year-old medical billing software suite that has gone through about a dozen business mergers and selloffs and been developed by over a hundred different people of varying skill. It's an unreadable mess of spaghetti code that I would give anything to scrap and start fresh if given the choice.

Is there some way you could redo it very incrementally?
Could you make a sane overarching system that reused bits and pieces of the existing mess in an isolated way?

Exactly
>"Premature ejaculation is the root of all evil."
-D. Knuth

something is off about that quote

It should be a balance between the two. Favoring only readability is a sign that the programmer is incompetent.

Trying to balance between the two is getting neither.

Readability should always come first. As the saying goes, you spend 90% of your time reading code and only 10% writing it.
Performance is more often than not a non-issue, as only a relatively small portion of the code will actually be critical. I mean, why would you care about squeezing as many characters per second as possible out of a printf anyway? The same doesn't hold for memcpy and co., though. There, fuck readability; those functions need to squeeze every last drop of performance out of the processor.

>C is slower than autistic-level C++ that takes 100 times longer to compile btw
FTFY

Don't use python and you'll be fine.

What is a case of performance being sacrificed for readability? Can't really think of any.

I've led and hired for teams working with and creating 100K+ line codebases. I wouldn't want someone who couldn't write clean code, and I wouldn't want someone who couldn't solve basic problems.

If you can't solve simple problems, then notably you can't debug anything and can't contribute to the more complex parts of the code very well.

Also, there is more problem solving going on than you might think in making clean code. Working out how a human might approach your documentation and making sure they're led to the right place is a bit of problem solving. Determining how to divide up functionality between modules requires problem solving. Finding descriptive, easy-to-understand names requires problem solving. Finding information about what you're working on requires problem solving. Determining the most fragile parts of your code in order to insert additional logging and error handling requires problem solving. Determining the best way to write tests definitely requires problem solving.

You can know that you have to document code, test it, organize it, and simplify it. However, if you can't really do any of that on your own, you're useless.

Similar to how IQ is the best general predictor of success we have, I suspect gauging problem solving is the best way to determine the aptitude of a software developer. Now, algorithm puzzles are not the only way to do this, and definitely shouldn't be the only tool in the interviewer's arsenal, but they are a decent tool.

That explains why software is so slow nowadays. Programmers don't care about performance because it's "impossible" to make compromises. It says more about the programmer than the codebase they're working on and the programming language itself.

>C++ code is 1000 levels of template expansions that leads to kilobytes of error messages for a simple mistake
C still wins, brah.

I own this book too.
It's mostly about consistency of project structure, naming conventions, documentation when actually necessary, etc. Clean code and fast code are not mutually exclusive.
Fun read, 8/10 would recommend.

Attached: image (1).jpg (557x564, 52K)

You make your point clearly.
Since you're a clear thinker, I have to ask: Any book recommendations?

Is anyone else on earth going to be reading your code? If not, then readability doesn't matter at all. Now, if people are reading your code, the time you spend making it more readable is 1000x more expensive than the 0.1ms longer it may take to run. So is spending a few days on "readability" worth it if anyone working on your code only takes 5-10min longer to understand it? Most likely not. If it works, it's good code; nothing else really matters. I've seen a lot of "pretty" code that doesn't work for shit. And a lot of bad code that breaks a lot of "rules" about formatting works perfectly.

You shouldn't undervalue readability and code that is easy to understand, but I would prioritize performance where it actually matters. What I mean by that is that in the real world your application's performance bottleneck may not necessarily lie strictly in the computational part that you optimize.

For instance, if you're using a database or some other form of slow I/O extensively, you may not actually be hitting any bottleneck in your code's execution time: the DB query is so slow that any improvement you make to the code around it won't have an actual, measurable impact on real-world performance. In cases like this I would favor simpler, easier to understand code over a faster version.

Basically, you need to be aware of what your application is doing and how critical each piece of code is to the end result in terms of performance. There will be areas where having the fastest implementation literally doesn't matter, as such you should focus on other things that do matter.
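Toy Python illustration of that point (the "query" is just a sleep and the numbers are invented): the round trip dominates, so micro-optimizing the local loop buys you nothing you could measure.

import time

def fake_db_query():
    time.sleep(0.05)                 # pretend this is a ~50 ms round trip
    return list(range(1_000))

def process(rows):
    return sum(r * r for r in rows)  # microseconds of actual CPU work

start = time.perf_counter()
total = process(fake_db_query())
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"total={total}, elapsed={elapsed_ms:.1f} ms")  # dominated by the sleep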

>Is anyone else on earth going to be reading your code?
your future self will always be reading your code

Shamefully, I've read nothing but Jow Forums recommendations and useless books recommended by startup executives.

If you're looking for something a little different and you don't already know much about chaos theory, Chaos: Making a New Science is good, although I suspect there are better chaos theory books.

This, I have fucked myself over many times

>If no, then it readability doesn't matter at all
You may end up regretting that down the line when you have to implement some major change on a piece of code you wrote 2 years ago and haven't touched since.

>chaos theory
Interesting, thanks.

this

Attached: unknown.png (741x314, 27K)

It seems a very careful struggle, ey?

The faster the code gets, the harder it seems to be to read and program in. The slower the code is, the easier it is to make the syntax easy.

If only there was a fast language where you could redesign the syntax in the middle of your code and add your own syntax on the fly thus allowing you to write your code however you would like.

Attached: 7ff.png (640x480, 184K)

>lisp

> And a lot of bad code that breaks a lot of "rules" about formatting works perfectly.

The rules are there to make the code easier for collaboration, documentation, and automation. You would know that if you've worked in a team environment before.

if you're creating code for an end-user that isn't you, readability don't mean shit

I prefer to write two functions: the optimized function and the easy-to-read function. Then there are unit tests to validate that their behaviors are the same. That + comments and you have a pretty understandable approach. The primary downside is the extra maintenance effort, since you have two implementations that must match behaviourally.

How about using the understandable version to do a little generative testing on the fast version?
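Something like this rough Python sketch (both functions are made-up stand-ins; a library like Hypothesis does the same idea more rigorously): the readable version acts as the oracle, and random inputs shake out any place where the fast one diverges.

import random

def sum_of_squares_readable(xs):
    return sum(x * x for x in xs)

def sum_of_squares_fast(xs):
    # pretend this is the hand-tuned version
    total = 0
    for x in xs:
        total += x * x
    return total

for _ in range(1_000):
    xs = [random.randint(-10_000, 10_000) for _ in range(random.randint(0, 50))]
    assert sum_of_squares_fast(xs) == sum_of_squares_readable(xs), xs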

uncle bob's trilogy is a piece of art. the most influential books after sicp.

kys

why?

performance is readability
assuming you are smart and anyone who reads your code is smart
they will immediately understand the purpose of the most performant code because they would have solved the problem the same way.
if this is not the case, either
>you are dumb
>they are dumb
Or both

If they are dumb, discard them.
If you are dumb, discard yourself.

>Implying everybody that you'll work with has an IQ >130
Simply employing geniuses isn't the answer. A team of retards and competent managers can build skyscrapers. Software is no different.

Competant managers aren’t a thing, and of course you can employ only geniuses.
You just have to find them, and also ensure that the way you discriminate against retards isn’t blantently stated “no retards”

Depends on what you mean by "trick".
Overall, you should favor readability and maintainability over special optimizations. However, knowledge of proper architecture (that includes language design), data structures and algorithm usage isn't that kind of optimization.

That being said, Clean Code is a cult which is stating the obvious at best and being opinionated at worst.

>Competant
Competent
Hail spelling & grammar.
There seems to be a wave of people that switch e and a for whatever reason.

>whatever reason.
On second thought, it could be dyslexics rotating them letters.

The free market brainwashed us into believing that we should make things easier to modify after clients/managers change their minds for the 345th time, or that we should make things easier for the company when they replace us with a cheaper pajeet.
That is the sad truth of readability, modifiability and maintainability. And this reality is foreign to innocent fresh grads who usually had fixed requirements and a taste for performance. Code written that way is OK, it works and it is fast. It can perfectly be wrapped into the black box of complexity. Can't read it? There is no need to, and also you are a brainlet.
It is corporate programming that is wrong.

FUCK CORPORATE CRONY CAPITALISM

Attached: 1508517367555.jpg (682x1024, 95K)

New CPUs come out every 1-2 years. New, better code doesn't.

His last book is just a mashup of the other books. Still incomplete and ambiguous.

>of course you can employ only geniuses.
>You just have to find them
MM sweaty that's bullshit. enjoy spending $5m a year on 3 programmers

no

Have you profiled your code first?
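Even the stdlib profiler will do; minimal sketch below (do_work is a made-up stand-in, and the context-manager form assumes Python 3.8+).

import cProfile
import pstats

def do_work():
    # stand-in for whatever you suspect is slow
    return sorted(str(i * i) for i in range(200_000))

with cProfile.Profile() as prof:
    do_work()

stats = pstats.Stats(prof)
stats.sort_stats("cumulative").print_stats(10)  # show the top 10 offenders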

>of course you can employ only geniuses.
>You just have to find them
You're retarded. How many geniuses do you think there are in the world, how much would it cost to hire them, and what is the absolute limit on the output of a single genius? Quality doesn't trade linearly against headcount: you need four times better units to make up for half as many units, and sixteen times better units to make up for a quarter as many units. A team of 10 programmers needs to be ten thousand times better than a team of 1000 programmers.

Assuming you have a job, you're likely writing JavaScript. If that's the case, you should really write your source code as legibly as possible, then use an uglifier/minifier to condense it as much as possible.

I guess you could if you had nothing better to do and agreed to work on it on your own free time.

>I just finished the Clean Code book and the author seems to claim that easy to understand code is better than fast code.
Good programming languages don't trade performance for readability.
Code should be optimized at compile time.

This. We just tell clients and resellers that their hardware must meet certain minimum standards or else we don't support them. Fuck the cheapasses who hire their brother-in-law as their IT manager and insist it's okay to keep running Server 2003 with 512MB of RAM for an office of 20 employees, then call in asking why it takes 10 minutes for our application to open.

>claim that easy to understand code is better than fast code.
When you have to deal with a fuckhueg legacy C++ codebase that has been beaten to shit by interns, pajeets, seagull consultants, and other shitcoders, you will understand.

Attached: 1409021535165.jpg (500x738, 234K)

Just work with tiers.
If it's stuff that doesn't require actual performance, such as GUI shit, make SURE the code is as readable and easy to use as possible.
If it's processing shit, like stuff that runs per pixel or scans a shitton of data, you do what you need to do, but comment it well.

>once had an O(n^9) complexity task
>was legitimately the only way to do it

Attached: z7gub14d03k01.jpg (600x580, 71K)

Basically everything in that list is a difficult problem related to scaling a codebase. And there are more, like handling changing requirements and borderline sabotage by coworkers for a myriad of reasons. Social problems. Problems that autistic puzzle solvers often have a really hard time handling.

Compared to all of that, actually solving the original problem is like 10% of the work. Hmm, this sounds familiar.

I seriously doubt that.

nope, legitimately was; I was working inside a really shitty API framework and short of doing caching at every boundary condition it was the only way

To be fair it wasn't really n^9 but more n*o*p*... etc; two values would basically always be under 5, another under 36 or 18, one fixed at 8192, and another around 250 in the typical use case. The rest would probably be around 30 to 40.

It would only run once every five minutes, operated in its own thread, and the operations done inside the loop had a load of short-cut conditions that'd typically cut the average case a ton. The worst case was still giant, though.

Attached: 1486663586163.jpg (600x534, 45K)

premature optimization is bad and you can always spot someone that is new to software when they do it or insist on it. These people apparently have never learned what a profiler is and feel the need to waste a ton of time optimizing things that have no relevance.