The C blackpill:
all real C programs are full of UB.

Attached: 278px-The_C_Programming_Language_cover.svg.png (278x359, 32K)

Other urls found in this thread:

youtu.be/SNnzfD3pG1Y
drdobbs.com/cpp/why-code-in-c-anymore/240149452
deathbytape.com/articles/2015/01/30/c-vs-c-performance.html
youtube.com/watch?v=D7Sd8A6_fYU
rusty.ozlabs.org/?p=330
adaic.org/learn/materials/

int sum(int a, int b) {
    return a + b;
}
Technically yeah, given that this snippet has UB in it. So?

That's some nasty, evil shit right there. Don't do this at home, kids.

How is this ub?

>>68473739
Signed integer overflow is UB.

Having UB code is fine if the UB condition never happens or is handled.

Which is why sensible people program in Ada instead. C was shit even in the 70s when it was brand new.

C > C++
All modern software sucks because they are made in C++
We need a new language to replace C++ or simply stick with C forever even though it's not suitable for 3D gaming

How about callback casts? Every library has them, but the standard says they're not legal.

How the fuck is it not suitable for 3d?
>inb4 much oop for literally everything
Lose 85% of program performance in one simple step! CPUs hate this! Click here to find out how!

>named after "woman programmer"

Attached: 1541292902845.jpg (1280x720, 105K)

If you code in java you are a desperate whore for cash
If you code in C you are doing something good for others regardless of income

Every C program is since it doesn't target a VM.

what's UB?

undefined behavior

Here comes the dumb Cnile again who knows nothing about C++.

>If you code in C you are doing something good for others regardless of income
Right, those buffer overflows will always create demand for new security jobs.

C++ is a german blonde 24 year old multicultural whore
she is beautiful but infected and soul rotten

Here comes the web dev code monkey who has a hard on for JS and node

>caring about the name
It was commissioned by the US DoD to be a language that caught as many errors as possible at compile time and that would have easily maintainable code. They did a damn good job of it.

It's deprecated

>Lose 85% of program performance in one simple step! CPUs hate this! Click here to find out how!
OOP in itself doesn't cause any performance loss.

Attached: C++.png (200x198, 46K)

How?

It's up to the person writing the code to ensure overflows don't happen.

Please explain how that wonderful mind of yours made the connection.

No it isn't, dumb webdevfag. Ada 2012 is alive and well.

When rajesh writes garbage code yes it does cause performance loss

No one uses that in the industry kid. Have fun dicking around with gentoo and your thinkpad with crusty cum on the keyboard.

signed integer overflow is UB

It does, because the first thing an OOP retard does is to wrap everything in (virtual) accessors.

>when you think your language is good because it wastes cpu time to do checks for buffer bounds on every single write because you're too stupid to do it yourself as appropriate
That actually checks out if you think about it.

>doesnt know the nuances of a language
>doesnt know that the compiler spits out fair warning
>HURRRRRRRR UB

Even with zero-cost abstractions used (at which point it's basically C code), OOP memory layout is inherently pointer-heavy and cache-hostile, both of which incur a massive performance penalty individually, and absolutely tank execution speed when combined.

So the segment of the industry you're in is filled with pajeet tier plebs? That's nothing to brag about.

you think that the compiler will throw a warning for every addition/multiplication? Seems like you've never programmed in C

Is the industry really full of pajeets user?

How is integer overflow UB? Adding two 32-bit ints isn't undefined behavior. They might produce the wrong result, but they produce the same result every time.

>Its up to the person writing the code to ensure overflows dont happen.
50 years ago people were already smart enough to realize this is an error-prone and plain dumb idea, so they came up with high-level languages that take care of that.
But no.
Bell Labs & Co. had to come out swinging their dicks around, telling everyone that doing it manually was cool and hip, and everyone bit. C was shit from the start.

>oop is shit when rajesh writes garbage code
When rajesh writes garbage code and manages memory incorrectly, it's even worse.

>citation needed

If it's security critical, it's not wasted. And if it's performance critical, you don't use C.

No it's not. Not every language supporting OOP is Java.
You could say that padding in any compound type in any language is "cache-hostile" too, but if you actually knew a thing about multiprocessor architectures, you would realize that padding is often actually desirable, as it prevents false sharing and unnecessary cache coherence conflicts. That's why in HPC it's often even recommended to add it manually.
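For the false sharing point, the manual padding looks something like this in C (just a sketch; the 64-byte line size and the names are assumptions):

/* Two counters that different threads hammer on. Without the padding they
 * can end up on the same cache line, and every write bounces the line
 * between cores (false sharing). Padding each counter out to a full
 * (assumed) 64-byte line keeps them apart. */
#define CACHE_LINE 64   /* assumed line size, typical on x86 */

struct padded_counter {
    long value;
    char pad[CACHE_LINE - sizeof(long)];
};

struct padded_counter counters[2];   /* e.g. one per thread */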

It's only UB because the way overflow is handled is defined by the CPU architecture, and the language chose to defer judgement on what happens to the CPU.

Holy shit i found OP at the java convention! youtu.be/SNnzfD3pG1Y

-fwrapv
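For the lurkers: -fwrapv is the GCC/Clang flag that makes signed overflow wrap in two's complement instead of being UB. Minimal sketch of the difference (whether the check below actually gets folded away without the flag depends on compiler and optimization level):

#include <limits.h>
#include <stdio.h>

/* Only ever true if signed overflow wraps; with plain -O2 the compiler may
 * assume overflow never happens and fold this to 0. */
static int wraps(int x) {
    return x + 1 < x;
}

int main(void) {
    /* gcc -O2 file.c          -> typically prints 0
       gcc -O2 -fwrapv file.c  -> prints 1 */
    printf("%d\n", wraps(INT_MAX));
    return 0;
}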

Attached: _.jpg (480x350, 21K)

Why would you not use C for performance? It blows most OOP languages out of the water.

No, even if it sometimes seems like everything is about to collapse into a pile of webdevshit. I've talked to sensible programmers who work with Ada. Present tense.

>Why would you not use C for performance?
Because there are better choices, depending on your needs. Fortran is still the best for purely scientific computation, and for general-purpose HPC, C++ is a superior tool, not to mention that useful, widely used libraries in the HPC domain such as TBB just don't exist for C.

A fucking segfault

Why is C memed so hard then?

Ada is really quite cool; some of the work the SPARK guys have done is quite impressive. The pedagogical verbosity of the language arguably goes to something of an extreme, but it has many positive features going for it, and it is refreshing how seriously the Ada committee seems to take systems programming and security issues.

For example, if I recall, you can even specify the bitwise representation of a record, which is good for using records to represent protocol data and the like without relying on an ABI or on GCC's __attribute__((packed)), which is an extension.
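Roughly the C-with-GCC-extension equivalent, for comparison (sketch only; the header fields are made up and byte order still has to be handled separately):

#include <stdint.h>

/* A made-up protocol header. The packed attribute removes padding, so the
 * struct's in-memory layout matches the wire layout byte for byte. */
struct msg_header {
    uint8_t  version;
    uint8_t  type;
    uint16_t length;     /* big-endian on the wire */
    uint32_t sequence;
} __attribute__((packed));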

I have known some systems programmers who are interested in using it for their security-critical code, but for some reason this language doesn't seem to have much exposure or learning material.

Do you know of any?

In what languages would adding two integers not be considered UB?

I'm not even talking about padding, you dip. I'm talking about the way OOP primes the main loop to access memory in a fragmented, random fashion. Fragmentation causes cache misses and random access causes memory latency, both of which are very detrimental to performance. The use of pointers exacerbates the problem because, on top of the cache misses and memory latency on your main data, the cost is doubled for data accessed through pointers. Using ECS in place of OOP immediately improves performance by upwards of 10 times; in fact, achieving a 50x boost is not out of the ordinary for such a switch. The "85% drop" line was being very generous.
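Toy sketch of the layout difference (names and sizes made up; real ECS frameworks are more elaborate, but the memory access pattern is the point):

/* "OOP-style": objects allocated individually and reached through pointers,
 * so a pass over all entities chases pointers and jumps around the heap. */
struct entity {
    float x, y, z;
    float vx, vy, vz;
    /* ...plus whatever else the object drags along... */
};
struct entity *entities[10000];   /* array of pointers to scattered objects */

/* "ECS/data-oriented style": one contiguous array per component, so the
 * update loop streams through memory linearly and the prefetcher can help. */
struct positions  { float x[10000], y[10000], z[10000]; };
struct velocities { float vx[10000], vy[10000], vz[10000]; };

void update(struct positions *p, const struct velocities *v, float dt) {
    for (int i = 0; i < 10000; i++) {
        p->x[i] += v->vx[i] * dt;
        p->y[i] += v->vy[i] * dt;
        p->z[i] += v->vz[i] * dt;
    }
}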

It's the only language they know, and they're too incompetent to learn another one.

drdobbs.com/cpp/why-code-in-c-anymore/240149452
deathbytape.com/articles/2015/01/30/c-vs-c-performance.html
youtube.com/watch?v=D7Sd8A6_fYU
rusty.ozlabs.org/?p=330

In Java, for example, they wrap in a well-defined way, as in most languages. This is expensive to implement on hardware that doesn't feature the expected semantics, but it is perhaps worth the added safety: it can be hard to predict with perfect certainty where signed arithmetic overflow will occur in complex code.

That's not entirely true. C++20 will explicitly ensure two's complement signed integers, but signed overflow will still be UB to improve optimization. To actually take advantage of signed overflow you will need to cast to unsigned and back.
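In C the same trick looks something like this (sketch; in C the conversion back to int is implementation-defined rather than fully specified as in C++20, but mainstream compilers wrap):

/* Wrapping signed addition without UB: unsigned arithmetic is defined to
 * wrap modulo 2^N, so do the add there and convert back. The conversion
 * of an out-of-range value back to int is implementation-defined in C;
 * GCC and Clang document that it wraps. */
int wrapping_add(int a, int b) {
    return (int)((unsigned int)a + (unsigned int)b);
}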

In one that would destroy its integer performance to do overflow handling for you.

C is cancer that's been leading CS astray for generations.

Attached: ODA4O.jpg (830x610, 197K)

CPU time is cheap.

yeah what's UB? im not even new

>implying ada lovelace wasn't a trap
she had programming stockings

Memed by who? Random people on Jow Forums? Suckless?
No idea.

>The use of pointers
I already told you that not every language is Java. OOP in itself doesn't imply the use of pointers; the layout can very well be contiguous and "cache-friendly" with it.
What you are describing is a totally unrelated issue: pointer-heavy code (which means a lot of random "jumps" in memory) is bad for performance, OOP or not, because of, as you said, cache misses, and in the case of large datasets, page faults and disk I/O.
In fact, a lot of "disk-friendly" algorithms try to focus on executing linear passes over the data instead of random jumps, because even though the latter may be more efficient IN THEORY, the former is more efficient in practice, due to fewer misses.

adaic.org/learn/materials/

t. Pajeet

Then why is C pushed so hard? It gets pushed as much as half the stupid web dev shit.

Based and redpilled

Attached: 1539674439570.jpg (341x256, 11K)

I had seen this page already, but it links to a good Ada 95 tutorial that I missed the first time I visited it.

Thanks!

Attached: 4bf0c0e4dae3.jpg (577x433, 86K)

That's no defense of OOP: by the point you're doing all that shit, you're not really using OOP anymore.

Wow, the poo thing is real

now try writing a triple A video game with that mindset

Argumentum ad populum is a hell of a drug. So much is written in C so it must be good, right? ...right? Shit.

Being a racist fuck isn't cool, go be an edgelord somewhere else you fucking loser! Who cares if they're Indian, they're human fucking beings and some of the best coworkers we could ask for. It's time to grow the fuck up, drop the bullshit racism, and go back to Jow Forums you fucks.

let me guess, you're a lispfag

There are some important projects around that are written in C and are worth working on, for example, the Linux kernel (which still relies on some neat extensions like VLA, as even Linus is smart enough to realize that vanilla C is crippling).

What was there to defend? You accused OOP of a completely unrelated issue.

Looks like gramps woke up, guys. Try not to bash C so hard; he hasn't taken his meds, so he gets a little angry.

>still being butthurt that lisp machines crashed and burned back in the 80s
you're the real boomer here

Attached: 1422568111600.gif (302x193, 96K)

I don't see many PDP-11s around either.

When the fuck did i even mention lisp? That language is a complete fucking joke.

A REEEAL HUMAN BEEEEEEEAAAANNNN...

then it's a good thing C is portable and available literally everywhere
not sure what point you were trying to make

>I was just pretending

Attached: 1421440348416.gif (320x240, 130K)

>C is portable
Oh geez, not this old myth again. Next thing you'll call it 'portable assembly' or some such nonsense.

>then it's a good thing C is portable and available literally everywhere
So is every popular language. In fact, what was the point of "Lisp machines having crashed and burned?"

int sum(int a, int b) {
    int ret = a + b;
    if (a > 0 && b > 0 && (ret < a || ret < b)) {
        //handle overflow
    }
    return ret;
}

Ha ha ha no. Simply adding the two ints can cause UB. Your code is still potentially erroneous. Check values BEFORE addition.

ret is a temporary value, if overflow happens it'll crash and burn. If not then it's the return value so no harm done.

C is portable, most C code isn't.

I don't think you understand what undefined behaviour means.

you're fucking retarded

>being so retarded you can't write portable C code
>i-it's impossible for C to be portable
>So is every popular language.
to a certain degree
C has a big edge because it's relatively simple on the language side and easy to write compilers for, and a C compiler is among the first things that gets ported to a new arch
>what was the point of "Lisp machines having crashed and burned?"
because almost nobody wanted them and lispfags are still salty about it to this day
which is why I'm suspecting lispfaggotry every time someone bashes C or Unix for some vague CS-related reason

Attached: 1418093989721.png (700x700, 35K)

>C is portable, most C code isn't.
Then by definition it's not very portable. There are countless languages more portable than C; funny how "portability" is an oft-mentioned supposed advantage of C.

How the fuck do you handle it then? Simply using addition would cause UB.

>these are the people who wank over c

Attached: 1534728248161.png (498x520, 18K)

Still waiting for an answer. I mostly work with Python, trying to ascend to C.

As stated, you need to check against potential overflow UB before you do the addition.

>to a certain degree
Name a high-level popular language in use nowadays that is less portable than C.
>it's easy to write compilers for, which is among the first things that get ported to a new arch
Agreed, but that doesn't make it a good language. Not to mention that 99% of C code targets platforms supporting other languages too, so it's kind of a moot point.
>because almost nobody wanted them and lispfags are still salty about it to this day
Doesn't make Lisp a bad language, and if anything, it being still around despite it shows that it has its merits.
>which is why I'm suspecting lispfaggotry every time someone bashes C or Unix for some vague CS-related reason
Accusing someone of lispfaggotry is not an argument in favor of C and Unix.
Try again.

How without addition?

You can use addition and subtraction, just as long as it's in circumstances where it cannot overflow. For example, if a and b are both positive and INT_MAX - a < b, then a + b will overflow.

So it's
if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b)) {
    //handle overflow
}
Christ I might just stick to python. Fuck C.
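Yeah, that's basically it. Wrapped up as a function it would look something like this (sketch; what you do on overflow, saturate, error out, abort, is up to you):

#include <limits.h>
#include <stdbool.h>

/* Returns false if a + b would overflow, otherwise stores the sum in *out.
 * The checks only use values that cannot themselves overflow. */
bool checked_add(int a, int b, int *out) {
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b)) {
        return false;   /* would overflow; addition not performed */
    }
    *out = a + b;
    return true;
}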

based and redpilled

>Then by definition it's not very portable.
So what is your definition of portability?

>Name a high-level popular language in use nowadays that is less portable than C.
Depending on what you mean by high-level languages, all of them, by virtue of having rather large standard libraries and runtimes which would have to be ported. Honorary mention for C++, which is a nightmare to write compilers for.
>Agreed, but that doesn't make it a good language.
That's moving the goal posts.
>Not to mention that 99% of C code targets platform suppprting other languages too, so it's kind of a moot point.
I think you're underestimating how many embedded devices there are. Most of them have, at most, Java ME as an alternative.
>Doesn't make Lisp a bad language, and if anything, it being still around despite it shows that it has its merits.
And? The point was anally annihilated lispfags, not lisp itself.
>Accusing someone of lispfaggotry is not an argument in favor of C and Unix.
It was never meant to be.
Though, I don't think C, or Unix for that matter, need much defending or "arguments in favor". Their service history should be enough.

Other options include using a systems language that actually handles it properly (like Ada), or kicking the issue down the road by casting to long before adding, or using some asserts creatively, or just ignoring it. Most C programmers choose that last one.
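The "cast to a wider type" option looks roughly like this (sketch; long isn't guaranteed to be wider than int, so long long is the safer pick, and GCC/Clang also provide __builtin_add_overflow if extensions are acceptable):

#include <limits.h>
#include <stdbool.h>

/* Do the addition in a wider type, then check whether the result fits.
 * Assumes long long is wider than int, which holds on every mainstream
 * platform but isn't strictly guaranteed by the standard. */
bool add_via_widening(int a, int b, int *out) {
    long long sum = (long long)a + (long long)b;
    if (sum > INT_MAX || sum < INT_MIN) {
        return false;   /* would not fit in int */
    }
    *out = (int)sum;
    return true;
}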

C is good if you want to learn reverse engineering.