Confession of a C/C++ programmer

I've been programming in C and C++ for over 25 years. I have a PhD in Computer Science from a top-ranked program, and I was a Distinguished Engineer at Mozilla where for over ten years my main job was developing and reviewing C++ code. I cannot consistently write safe C/C++ code. I'm not ashamed of that; I don't know anyone else who can. I've heard maybe Daniel J. Bernstein can, but I'm convinced that, even at the elite level, such people are few and far between.

I see a lot of people assert that safety issues (leading to exploitable bugs) with C and C++ only afflict "incompetent" or "mediocre" programmers, and one need only hire "skilled" programmers (such as, presumably, the asserters) and the problems go away. I suspect such assertions are examples of the Dunning-Kruger effect, since I have never heard them made by someone I know to be a highly skilled programmer.

I imagine that many developers successfully create C/C++ programs that work for a given task, and no-one ever fuzzes or otherwise tries to find exploitable bugs in those programs, so those developers naturally assume their programs are robust and free of exploitable bugs, creating false optimism about their own abilities. Maybe it would be useful to have an online coding exercise where you are given some apparently-simple task, you write a C/C++ program to solve it, and then your solution is rigorously fuzzed for exploitable bugs. If any such bugs are found then you are demoted to the rank of "incompetent C/C++ programmer".

Attached: 7A06909A-AF2A-45A8-988E-35DF040FCADC.jpg (1536x2048, 388K)

yeah no you're not

Security bugs are usually a product of complexity: it's trivial to write a "safe" FizzBuzz; writing a "safe" encryption library, on the other hand, is non-trivial.

Consider the fact that TLS is implemented in C.

Just the other day I saw a commit on sway that replaced something like this:
calloc(1, n * sizeof(var));
Seriously, there is no shortage of imbeciles in CS

One thing is faults like buffer overflows; another is logic errors, such as the duplicated "goto fail" line in Apple's TLS code (Heartbleed itself was a missing bounds check).

OH GODDD OH FUCCK OH OH OHHHH IMMM COMMINNNGG AAHHHHHHHHH

AAAAAAAAHHHHHHHHHHH I'M GOING TO FUCKING COOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOM

> Nice copypasta
> Poor bait though

What's wrong with it? Except for the parentheses around var

more like confessions of a cumbrain

nmemb, size

so it should be
calloc(n, sizeof(var));

n * sizeof(var)
can overflow, whereas if you call calloc like so:
calloc(n, sizeof(var));
calloc does the multiplication itself, detects the overflow, and returns NULL instead of handing back an undersized buffer.
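To make that concrete, here's a minimal sketch of the difference (the sizes are made up, think of n as coming from untrusted input):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    size_t n = SIZE_MAX / 4 + 2;   /* absurd element count, e.g. parsed from attacker input */
    size_t elem = 4;               /* sizeof(uint32_t) */

    /* n * elem wraps around to 4, so malloc happily hands back a 4-byte
       buffer while the caller thinks it has room for n elements. */
    void *bad = malloc(n * elem);

    /* calloc performs the multiplication itself, detects the overflow,
       and returns NULL, which the caller can actually check for. */
    void *good = calloc(n, elem);

    printf("malloc gave %p, calloc gave %p (expected NULL)\n", bad, good);
    free(bad);
    free(good);
    return 0;
}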

Overflow is also an issue, but it isn't the real reason; memory alignment is.

Elaborate please

Some architectures, Nvidia GPUs for example, have memory alignment requirements, meaning that memory accesses must be aligned to 8, 16, 32, 64 or 128 bytes. If you allocate an array of four unsigned 32-bit integers, the difference may be whether that allocation ends up aligned to 64 bytes or to 128 bytes.
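If you genuinely need a specific alignment, you can ask for it explicitly instead of relying on whatever malloc/calloc happen to return. A minimal sketch, assuming a C11 libc that provides aligned_alloc (the 128-byte figure is just an example):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* aligned_alloc (C11): the size must be a multiple of the alignment. */
    uint32_t *buf = aligned_alloc(128, 128);   /* 128-byte aligned, room for 32 uint32_t */
    if (!buf)
        return 1;

    /* prints 0: the address is a multiple of 128 */
    printf("address %% 128 = %zu\n", (size_t)((uintptr_t)buf % 128));

    free(buf);
    return 0;
}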

COW MILK

Attached: 1543430641339.png (666x819, 160K)

kys avatarfag, see you in three days. think over your life

This pasta reminds me of the TV shop commercials where they first show frowning people performing basic tasks with grayscale footage and then happy people using their unnecessary and probably non-functional solution in living color. It's also about as convincing

Who?

but that's done for performance not for security.

>trying on underwear or swimwear
um what
you are not supposed to do that

Attached: 1539540438363.png (407x402, 87K)

>not grabbing a panties some thot brought back from the fitting room
pleb

Attached: 1552157084341.jpg (387x291, 23K)

Would you recommend Rust?

yodaiken.com/2018/06/07/torvalds-on-aliasing/
Daily reminder that it's impossible to use C correctly; even Linus doesn't understand the standard and advocates ignoring it, basically inviting UB and subtle bugs:
> Don't tell me "the C standard is unclear". The C standard is _clearly_ bogus shit (see above on strict aliasing rules), and when it is bogus garbage, it needs to be explicitly ignored
> The standard simply is not *important*, when it is in direct conflict with reality and reliable code generation.
> I've said this before, and I'll say it again: a standards paper is just so much toilet paper when it conflicts with reality. It has absolutely _zero_ relevance. In fact, I'll take real toilet paper over standards any day, because at least that way I won't have splinters and ink up my arse.
Now, you can either support him, basically turning gcc with a particular set of compiler options into the de facto C standard, but then C is no better than other languages without a standard and you're a hostage of the horribly maintained project that gcc is. Or you can argue against him, but then you have to somehow show that competent C programmers even exist, because the most famous C programmer in the world is clearly incompetent.
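For anyone who hasn't hit the rule Linus is ranting about: strict aliasing says an object may only be read through a compatible type (or through a char pointer), so the classic pointer-cast type pun is undefined behaviour, and the kernel side-steps the whole issue by building with -fno-strict-aliasing. A minimal sketch (function names made up, assumes unsigned and float are both 32 bits):

#include <stdio.h>
#include <string.h>

/* UB: reads a float object through an unsigned pointer, violating strict
   aliasing; the optimizer is allowed to assume this never happens. */
static unsigned bits_ub(float f) {
    return *(unsigned *)&f;
}

/* Well-defined: memcpy copies the object representation byte by byte. */
static unsigned bits_ok(float f) {
    unsigned u;
    memcpy(&u, &f, sizeof u);
    return u;
}

int main(void) {
    printf("%x %x\n", bits_ub(1.0f), bits_ok(1.0f));
    return 0;
}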

I always grab stuff furthest behind on the rack or shelf
less likely that somebody has been doing stuff with it

>C/C++
No such thing

The fact that we still have to worry about such irrelevant 70s-tier shit details shows how little actual progress in CS we have had since then.

Attached: 7AD76ACF05A54A96853F8BCA0F58E954.png (759x1180, 817K)

Coming up with new languages is easy; rewriting the code we use daily, and getting people to adopt your new codebase filled with bugs that we already suffered through and fixed in existing codebases? Not so much

haha it must suck to be a roastie

>t. obliviousy wears briefs with ballsweat of some chad who wore them for a selfie

the jokes on you, my mom buys my clothes from costco and I make her wash them before I ever wear them.

Hardware platform implementation details are not really CS

Linux has its own standards, like every other language and every other program that's big enough to have external contributors.
The problem with those standards is that sometimes they're too big and mandate shit that should just be per project: Java has too much shit, C has too much legacy, and Rust has faggots.

>I've been programming in C and C++ for over 25 years. I have a PhD in Computer Science from a top-ranked program, and I was a Distinguished Engineer at Mozilla where for over ten years my main job was developing and reviewing C++ code. I cannot consistently write safe C/C++ code.
roleplayer detected

It's his project; he can do whatever he wants with it. Do you also screech like this every time you see a project that refuses to accept patches containing features the maintainers don't like?

lol

>I have a PhD in Computer Science
That's always a good sign that you can't code for shit

who the actual fuck takes pics in a changing room with the security devices still on? why do women do this shit? it doesn't make any fucking sense. is she retarded?

Maybe she wanted someone else's opinion on how the swimsuit looked on her?

There are security cameras in the changing rooms where you're from?

ACTUAL C++ programmer here, I literally get paid six figures to write C++.
t. Rust faggot
Whoever uses malloc/calloc while writing things in C++ is not a serious programmer.

>I was a Distinguished Engineer at Mozilla
Dial 8.

Programming is hard because when you make stuff, then make some other stuff, and then try to connect them, it turns into a fucking Rubik's Cube: when you change one thing, it also changes several other things, and you have to keep all of that in mind. And since your mind isn't of infinite capacity, you start fucking your own head with your own code, trying to keep track of all the things that happen and affect what's happening within the program.

how did you learn? Have any tips for people interested in the language?

>Whoever uses malloc/calloc while writing things in C++ is not a serious programmer.
Yeah it's a good thing sway is written in C, retard

What do you even mean, you fucking leper? If you use calloc in a C++ (not a C) program, you have no fucking idea what you're doing and you should stop.
I'll go further: if you use new or delete anywhere other than core functions (the ones hidden away by the time you reach the API you're developing), you probably also don't know what you're doing and you should stop.
To be honest, it took years, but I was very motivated because I could see the difference between writing C and C++: in C I would spend an entire day at university implementing an algorithm, whereas in C++ I could crank out multiple solutions in a single sitting thanks to the (then) STL.
I'm not going to pretend C++ is easy to learn, but you can take the language step by step. There is no one book I can truly recommend; many of the highly regarded ones are obsolete. C++17 is a game changer. For example, did you know you can use multiple return values now?

#include <utility>

std::pair<int, int> get_point() {
    return {3, 4};
}

int main() {
    auto [x, y] = get_point();   // structured bindings (C++17) unpack the pair into x and y
}


I will think about your question more and return with recommendations that are modern.

>C/C++
Opinion immediately discarded.

The post you replied to was talking about sway, which is written in C, hence the calloc() usage. are all sepplesfags this retarded?

> you can use
Can I use it effectively? Can I write easy-to-comprehend code with it? Are there any other caveats? Will space aliens come and fuck me in the ass at some very specific case of using multiple return values?

What does CS mean?

Cock sucker

>The post you replied to was talking about sway, which is written in C
And thus is entirely irrelevant to the OP which is about C++. Do you have autism?

the fact that you alloc'ed some random small thing should worry more than alignment if you care about speed

You could just act like an adult and own up your mistakes you know

counter strike

Attached: 1562381864465.jpg (3823x2773, 1.02M)

If memory isn't correctly aligned then you have no guarantees about what happens when you write into that buffer.

COOMING

Attached: 1503376133296.png (827x1300, 260K)

I just wish that Mozilla would run out of money to fund shills already.

>C++17 is a game changer
Yeah, I've been really interested in getting serious about learning the language, but there's so much out there that I just don't know where to start. It doesn't help that I can't use it at my current job and none of my colleagues are familiar with C++

Yes, but generally C generated by proof-checking tools with cleanroom software development practices, not the kind of idiomatic C you'd write by hand.

It's written in C for performance necessities. The language is a security liability.

dumb Rust shill cumbrain

More architectures crash if you dont align the memory correctly

>only 1 hit for this image
>it's this thread

You best provide source

she's not very pretty at all, just go jerk off to some other chick.

>she's not very pretty at all
I think she is

I don't believe your story about being a mozilla employee, but otherwise, what you write is true.
All C/C++ programs are FULL of just the kinds of errors that are only really possible in those languages.

And then you have the dunning-kruger Jow Forumstard whose pride and joy is some memory-leaking fizzbuzz, a simple socket client/server chat copy-pasted from Beej's guide, and some pointer arithmetic project that doesn't really do anything, and who assumes he now knows what it feels like to write correct, safe C code.
As if. Even the biggest C projects, filled with the most competent C programmers, with mandatory audits by tools like Valgrind and AddressSanitizer and peer review by some of the top C programmers, are still FULL of memory leaks, double frees, buffer overflows, stack overflows etc.
C simply isn't a good language. C is like going into war unarmed and without armor because you think the bonus agility will make you dodge all attacks you've just opened yourself up to.

>Two papers by Edward Nuhfer et al. in Numeracy (2016, 2017) reveal problems with the graphic introduced in the 1999 Kruger and Dunning paper.[16][17] Subsequent researchers used (y−x) versus (x) scatter plots and related variants for nearly two decades. Nuhfer et al. show that many publications that used these approaches seem to have erroneously interpreted mathematical artifacts (such as random noise) as the products of human behavior. Their papers use instruments of known reliability to reevaluate self-assessment measures from the perspective of signal and noise. They show how the mathematical problems inherent in the Dunning-Kruger type of graph can be overcome by other kinds of graphing that diminish the effects of noise or employ categorical data from known novices and experts. While many would have done so by chance, the authors show that roughly half the subjects were reasonably accurate in their self-assessments.[16][17]

>The authors' findings refute the claim that people are generally prone to greatly inflated views of their abilities, but support two other tenets of the original Kruger and Dunning research: (1) that self-assessment skill can be learned, and (2) that experts usually self-assess more accurately than do novices. The researchers noted that metacognitive self-assessment skill is of great value, and that it can be taught together with disciplinary content in college courses.[16][17]

Nibba not even the autistic swedish code monkeys that work for NASA try to write c++ code that never crashes.

>Felt cute
>Take pic and send it on Snapchat/instagram/Twitter for thottery points

The security tag on the piece of clothing.
>Calloc
Holy fuck what year is it

>I'm a highly accomplished post-doc with 25 years of experience working at Mozilla
>Let me shitpost on Jow Forums about how nobody can write safe code!
LARP: The Thread

>its not a good language because it has bugs!
>nothing I write ever has bugs!
Memory leaks are almost always lost heap allocations, which are just as common in GC'd languages because you left a dangling reference in some shittily implemented generic container. Everybody who programs regularly experiences memory leaks, and anybody who claims otherwise is a fucking liar. That isn't unique to C or C++. Neither are bad pointers; Java is notorious for null pointer exceptions blowing up the entire fucking program.
None of the problems people bitch about with C/++ are exclusive to it.
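To be clear about the term: a "lost heap allocation" just means the last pointer to a live allocation gets dropped. A minimal C sketch (names made up); the GC'd-language analogue is a reference forgotten inside some long-lived container:

#include <stdio.h>
#include <stdlib.h>

/* Returns a heap buffer the caller is expected to free. */
static char *make_greeting(const char *name) {
    char *buf = malloc(64);
    if (!buf)
        return NULL;
    snprintf(buf, 64, "hello, %s", name);
    return buf;
}

int main(void) {
    for (int i = 0; i < 1000; i++) {
        make_greeting("anon");   /* return value dropped: every buffer leaks */
    }
    return 0;
}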

>b-b-buh pointers are haard waaaah

That's not even remotely how openssl was written.

Bad practice but this is why I don't commit until I'm ready to make a PR. I make several backups each day though

>his extent of knowledge about writing safe C doesn't go beyond shitty static analyzers, valgrind and "peer review"
nice brain
before you start talking about how safe C is being written in the real world, try participating in at least one such project

That is why you use Java instead. You start by taking a course at Durgasoft.

There have been plenty of advances in new languages. It's just C that is forever stuck in the 1960s.

>C has too much legacy, and Rust has faggots.
So Rust is superior to C, but Jow Forums will weep on your shoulder if you use it?

they glorify faggots and trans so probably not. the problem is that those types of people are a bit unstable and they end up tripping everyone up.
it's like a man screaming in the street for no apparent reason, would you approach him?

OP comes back to double down

c++ is breddy gud

This makes sense
It has a plastic pin sensor on its side there; it sets off an alarm if you take it out of the store.

a lot of new stuff is too hard for code monkeys, so they stick with the broken tools they know.

It depends on what programming language he's screaming about.

i wish we could change
the assignment operator to :=
and comparison to =
i spend too many hours of my life fixing
if(a = 5)...
pascal was right about it.
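For anyone who hasn't been bitten by it yet, a minimal sketch of the bug (gcc and clang will flag the first if under -Wall via -Wparentheses, which is the usual mitigation today):

#include <stdio.h>

int main(void) {
    int a = 3;

    if (a = 5)       /* assigns 5 to a and then tests 5: the branch is always taken */
        printf("always taken, a is now %d\n", a);

    if (a == 5)      /* the comparison that was actually meant */
        printf("a really does equal 5\n");

    return 0;
}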

that std::pair snippet is everything that is wrong with modern C++. why the fuck are x and y references? i know it is safe but it shouldn't be, because x and y should not be references, but they are. to me it feels like a ref to an rvalue.

Here's some words of wisdom for you comment posting idiots. You people think your opinions are so important and that you possess some natural-born expertise. But take it from ME, a REAL expert, when I say your comments are even more stupider than you people making them. Now I have a lot of experience, so when I say something, it counts. That's because I'm very important and I know what I'm talking about, unlike feeble-minded you. I see you want to retort by posting a reply, do you? PSHAW!!! You can't post a reply to me because you're speechless, and you're too AFRAID. Besides, no one wants to hear your stupidness anyway.

Do you know who I am? I have 9 black belts, 15 Masters degrees and a PhD in Applied Arithmetic. That's right, you know I'm way better than you, and all my fans and supporters will gladly tell you how great and awesome I am! Have you seen my power level? It's over 9000! You know what that means? It means I have more than 9000 units of POWER. It also puts my total adjusted force rating at 22000! That's more than triple, so you don't want to make me mad because anger is my middle name, and I give love a bad name, which only makes me angrier.

Remember: He who laughs last, laughs last. So, go ahead, I dare you to write a reply to my comment. I DOUBLE dare you to write a reply to my comment. But I know you won't reply to my comment because YOU'RE ALL TOO AFRAID.

What a Goner

Modern languages don't have null pointers, did you really think the solution to the problems of C is to use Java?

This is only a problem in brain damaged languages without a proper boolean type.

In normal languages it would only be an issue if the value is a boolean, in other words when you write if(a = true) instead of if(a == true)

but then you should just write if(a)

a proper boolean is a joke because modern computers don't have one, so we use 1 byte and pretend. let's say a is an int and i want to compare it to 5. that's fine, but it's easy to mistype and end up with an assignment instead of a comparison. that's because of = vs ==. i want := vs ==, which is much harder to mistype.

if(int) has to work because it is fast.

What is type safety? Everything is bytes in the end

It does not have to work; in fact it's nonsense.

use a Unicode equals sign and then parse the proper one at compile time. i don't want another symbol in my language just because people make typos

I wouldn't mind her try on some

you're still missing the point. what if we have something like
auto v = a = 5; //vs
auto v = a == 5;

modern C++ has
template <typename T>
bool safe_bool(T&& cond)
{
    return {cond};   // braced return: a narrowing conversion (e.g. int to bool) is a compile error
}
guarding against narrowing casts is optional in C++, but some people like to be slaves and don't want the choice. they think compilers should force them to write better code.

What's your favorite language and why? What do you think of Elixir, Lisp, ML variants and Scala? Do you know anything about Akka? What are your thoughts on it? Sorry for all the questions lol

I told you, modern languages don't have this brain damage

let v = a = 5; //works
let v = a == 5; //v is now a boolean


whatever you pass v into now expects it to be a boolean

if you have a generic function that works on both types somehow it COULD potentially cause a bug, but that's much less likely

in the vast majority of cases it will just error out when you use the value

If it were up to me, I would actually just prohibit multiple assignment in the first place and make comparison = only. If you eliminate all sources of ambiguity you could make it work. Unfortunately, in Rust they made assignment work everywhere, but in surprising ways

In Rust, let v = a = 5; // doesn't assign 5 to v, which is confusing af

so assignment is actually not a statement but an expression of type (), so you'd be making a binding of v to the type (), which is only inhabited by the value also written ()

if it were a statement then it couldn't be easily chained, but it would solve this issue, since then the expression and the statement would be in mutually exclusive contexts (let bindings are statements)

and I don't like tricky assignments in the middle of shit, it just doesn't read well even though I can figure it out eventually