Is it true?

Attached: 623464235.png (604x431, 26K)

simply don't write program

He is the co-creator of C so it makes sense he would say something like that.

No, debugging is not always harder than writing the code in the first place.

Yes. If you get too cute with your code the first time around, going back to it for debugging or adding features becomes a total pain in the ass.

>co-creator of C

Except he's not. He's the co-creator of the reference manual which came years later.

Pretty much. Too many people conflate clever and elegant code. Clever code is often just a hack that is bound to break and likely difficult to test. Elegant code is clean and simple while getting the job done. Elegant code may not be as fast as clever code, but it’s definitely more valuable unless you are in the exceedingly rare case of requiring the absolute fastest code possible.
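A toy contrast, with a made-up example:

    // "clever": swap without a temporary -- and silently broken if both
    // references alias the same int (a ^= a zeroes it out)
    void swap_clever(int& a, int& b) { a ^= b; b ^= a; a ^= b; }

    // elegant: obvious, correct for all inputs, and the compiler emits
    // code at least as good anyway
    void swap_plain(int& a, int& b) { int t = a; a = b; b = t; }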

>exceedingly rare case of requiring the absolute fastest code possible.
Do you realize that web and desktop programs have been a minority of all the code written yearly for decades now?
Most of the code written on this planet is highly efficient, fast C/C++/whatever, because otherwise you would not have real-time electronics in cars; forget about smartphones, forget about drones, forget about sub-5-second networking latency.
Development is not, and should not be, something accessible to everyone, especially when there are lives at stake; it's a really bad mindset to push that it's OK to write slow code because it's easier to understand.
I build geometry libraries for a living and I can certify that C++ templates are the only tool available in 2018 that lets me achieve near-optimal performance for high-dimensional geometry problems (around 16 dimensions as I write this); something like the sketch below.
Writing code is easy; writing good code is an art, and not everyone is cut out for it. You should accept that not everyone can do it.
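To be concrete, a minimal sketch of what I mean (a toy dot product, not my actual lib): the dimension is a template parameter, so the compiler knows it at compile time and can fully unroll and vectorize the loop.

    #include <array>
    #include <cstddef>

    // N is a compile-time constant, so the loop can be completely
    // unrolled and vectorized; no runtime dimension checks anywhere
    template <typename T, std::size_t N>
    T dot(const std::array<T, N>& a, const std::array<T, N>& b) {
        T acc{};
        for (std::size_t i = 0; i < N; ++i)
            acc += a[i] * b[i];
        return acc;
    }

    // usage with 16-dimensional points:
    //   std::array<double, 16> p, q;
    //   double d = dot(p, q);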

t. codelet with no job who hasn't seen the java molochs that support 90% of services in the world

"not absolute fastest" does not mean "slow".

>writting
stopped reading there

I agree, but you have to admit there's a valid argument on the other side too: as long as you're meeting your timing requirements and writing code that reasonably reduces processor overhead, you shouldn't sacrifice readability to save a few cycles, unless of course it's critical.

this is also why perl is unreadable, it's entirely based on cleverness.

That statement is a non-sequitur.

K&R is the C spec, so yeah he co-created C along with whoever implemented it prior.

>imagine not writing your project side by side with sets of tests
But seriously though, what did he mean by that? What if your most clever piece of code is designed to do one particular job and that's it? Why wouldn't you be able to debug it?
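Even a dumb side-by-side test catches the moment your cleverness stops matching reality. Sketch, with a made-up clamp() as the thing under test:

    #include <cassert>

    // the function under test (made up for illustration)
    int clamp(int x, int lo, int hi) { return x < lo ? lo : (x > hi ? hi : x); }

    int main() {
        assert(clamp(5, 0, 10) == 5);    // in range: unchanged
        assert(clamp(-3, 0, 10) == 0);   // below: pinned to lo
        assert(clamp(42, 0, 10) == 10);  // above: pinned to hi
        return 0;
    }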

There are some koans about it. In 6 months you won't remember how clever you were, even if you can debug it now.

>write a spec
>that means he invented it

I bet you think Apple invented round corners too.

creating doesn't mean inventing, inventing doesn't mean creating. C is what it is because of K&R.

EBIN BURN :DDDDDDDD

Literally who is that and why should I care what he thinks.

>C is what it is because of K&R.
And how is that good?

Sounds like a pajeet.

Old Unix man from Bell Labs. You should care what he thinks because he's been proven right for decades.

It's better than the alternatives, probably.

that's why i just keep rolling random bit configurations, hoping it does what i want

To a certain degree, yes. "Clever" isn't really the correct word (he really means "convoluted") but I get why he uses it. Idiots think they're being clever when they write esoteric code just for the sake of entertaining themselves but they're just being obnoxious.

Clever code isn't necessarily convoluted, though. Relying on knowledge outside of the software to get things done in a way that isn't explicit is classic clever C/asm.
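Like this sort of thing (toy example): it works, but only because of a fact about ASCII that the code never states.

    // "clever": flips to lowercase by setting bit 0x20 -- correct only
    // because ASCII lays letters out that way, and silently wrong for
    // anything that isn't an uppercase letter
    char to_lower_clever(char c) { return static_cast<char>(c | 0x20); }

    // explicit: the assumption is right there in the code
    char to_lower_plain(char c) {
        return (c >= 'A' && c <= 'Z') ? static_cast<char>(c - 'A' + 'a') : c;
    }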

>Relying on knowledge outside of the software to get things done in a way that is not explicit
That's also extremely convoluted. If you REALLY need to do that shit to avoid unnecessary data duplication or something, then you'd better (a) document the shit out of the behavior you're exploiting and (b) make sure that behavior is an explicit, documented contract of the external component and not just "a thing I noticed it does."

It's not necessarily convoluted, maybe esoteric. Implicit behaviors are usually blind shortcuts.

Absolutely. Debugging is at a minimum 2x as hard. Don't write clever shit; make it obvious. It also helps those who read the code after you.

I could write a giant-ass single expression, but that's a horrible idea. Always break it up. Don't make massive class hierarchies; no one will understand them. Keep file length low.
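What that looks like in practice, with a made-up formula:

    #include <cmath>

    // one-liner version:
    //   return std::sqrt((a*a + b*b) / (1.0 + std::exp(-(c - d) / e)));
    // broken up, every intermediate has a name you can read and a value
    // you can inspect in a debugger
    double score(double a, double b, double c, double d, double e) {
        double sum_sq = a * a + b * b;
        double gate   = 1.0 + std::exp(-(c - d) / e);
        return std::sqrt(sum_sq / gate);
    }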

>proven right
>buffer overflows
>better than the alternatives
>buffer overflows

Attached: this_nigga.jpg (225x225, 6K)

>don't use makefiles
there are no good build tools; a small subset of make is good, though.

The solution to writing bad code is to not write bad code. Buffer overflows are working as intended.

>several decades later
>even the best C programmers still commit buffer overflows on a daily basis
>greatest security issue in the world
>no other programming language has this problem
>"haha it's just your fault for writing bad code man, the language did nothing wrong letting you write beyond the buffer you explicitly told the language to allocate lmao"

Attached: get_a_load.jpg (1086x652, 80K)

he's right.
>co-creator of C
top fucking kek.
>creating doesn't mean inventing, inventing doesn't mean creating
wow. that's a stupid thing to say after being busted for not knowing anything about what you're writing. well done, /g/enius.

>Buffer overflows are working as intended.
thanks, NSA, for your valuable contribution.

It's a faulty premise. C lets you do what you want and nothing more. Every single language or program ever written has this problem. It's not a security issue, because the language was intended to be used correctly, i.e. by someone who knows both their data and their language. The "best C programmers" would either be writing code that works as intended (by them) or not actually be the best. If their best isn't even good, that's entirely on them.

Generally, though, C programmers write things that are very specific in use case and have precisely controlled input. C doesn't scale the way some people seem to think it does.

>if you get buffer overflows it's because you want them
This excuse manages to be even worse than your previous one. Stop defending the indefensible!

I'm not sure you even know how to read

the secret to not writing bad code is not writing code. everything has already been accomplished, why bother?

Attached: 1541996492.png (441x174, 40K)

Source code is only bad if it isn't at least average.
I've seen fucking garbage shit still manage to print money.
It's not about being good, it's about not being a fucking giga-pajeet.

/g/ memers make it sound like the world is filled with 130-IQ successful white men with 7" dicks. In reality, most people you'll be dealing with are 110-IQ mutts with 5" penises or even vaginas, who write garbage code that just barely does the job, yet they are still totally worth employing, because it gets the job done.

Yes but there is plenty of room for mediocre programmers as well. Maybe even more so.

>participated in the creation of a terrible os and language that have stagnated computing for 30 years
>Worth listening to anything he has to say
The absolute state of cniles.

No, if you get buffer overflows it's because the designers of the C language wanted you to have no protection (for low-level speed reasons). If you go in raw and fuck it up, you write protections against these things yourself. You can't add that to the lang/std because it costs speed, and someone better than you can get it right without it. C isn't for bad programmers; it's for good, correct software.

No, a buffer overflow happens when there's an oversight. It's often overlooked or not audited correctly. Any programmer worth their salt can prevent buffer overflows, but no one is perfect and mistakes happen. Besides, we have stack canaries now, which make traditional exploits a hell of a lot harder to pull off.
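For the curious, the canaries are just a compiler flag away on gcc/clang (sketch):

    g++ -fstack-protector-strong prog.cpp -o prog

A smashed frame then aborts at runtime instead of silently corrupting the stack.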

>connect positive to ground
>fuse blows
waaah why does nature let me break my circuit? I need retard protection!!!1!1

I'm pretty sure he's talking about using shit like bitwise operators to do simple mathematical operations like modulus, which most compilers will optimize to the point where the difference in performance is negligible, but the cognitive drain is high.
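E.g., assuming unsigned values and a power-of-two divisor (the only case where the trick is even valid):

    // the "clever" one and the obvious one compile to the same AND
    // instruction on any modern compiler
    unsigned mod16_clever(unsigned x) { return x & 15u; }
    unsigned mod16_plain(unsigned x)  { return x % 16u; }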

Depends whose code you're debugging. If it's pajeet spaghetti then it's often easier to rewrite from scratch, for sure.

>Write code you (((hope))) will work
>Writing that code requires lots of thinking
>Code doesn't work
>Have to apply an equal or greater amount of thinking to reach a solution
It is by definition more difficult

>Writing code with the maximum amount of thinking
>Doesn't work
>Can't debug it because you can't think any harder
It just makes sense™

fucking this
>tfw the company would lose too much money if they let me go, because the higher-ups made horrible decisions over the last decade and I'm basically the only one who has somewhat of an overview
sure, some of my co-workers went too far and got canned, but now the company is stuck with me.
several attempts by new hires to port the codebase ended horribly, so they intend to drag out the heat death as long as possible by handing out a nice bonus every couple of months.

tl;dr: sales people in charge of a tech company ruin everything

Smart people don't write """clever""" code, I guess

>C is nature

Attached: brainlet.png (464x450, 156K)

Smart people write clever code only to find out later on that they're not as clever as they thought they were, and all their co-workers/teammates are much, much less clever than that.