Omg C++ is so hard just use C with classes

Is there anything more pathetic than brainlets shilling the oh-so-called "sane subset" of C++/"orthodox" C++? The moment I see this I picture a brainlet boomer scared of types and templates.

Why do these people suck up to C++ when they can use C or Go? Those are more their speed.

Fucking low IQ drooling brainlets.

Attached: 1518949762367.jpg (1200x1200, 67K)

Other urls found in this thread:

en.wikipedia.org/wiki/Virtual_function
sites.fas.harvard.edu/~lib215/reference/programming/linux_functionptr.pdf
xml.com/ldd/chapter/book/ch02.html
devblogs.nvidia.com/easy-introduction-cuda-c-and-c/
youtube.com/watch?v=86seb-iZCnI
docs.nvidia.com/cuda/cuda-c-programming-guide/index.html
en.wikipedia.org/wiki/Primitive_data_type

Object oriented programming is for fucking brainlets. There, I said it.

>make heavy use of templates
>everything has to be in headers
>absurdly long compilation times

They have their place, but I'll be damned if I'm going to add 15 seconds of compile time per translation unit just to use a Boost library.

>it's totally acceptable to learn a language for 5 years before you can even call yourself "intermediate"
>actually living with Stockholm syndrome

Gee, I wonder who's the brainlet here..

Attached: C++ in a nutshell.png (1016x98, 72K)

Oh, here comes another brainlet. No, OOP is the most efficient way to obtain polymorphism. Rust's trait-based generics are yet to be proven, and that's way off in the future, because anything, literally anything, Rust tries to implement is so half-assed you can't use it to its full potential.

brave post

>C++ in a nutshell
Now do that in C, hard mode, below 100 LoC.

Fucking mouthbreather. You don't even know what you're talking about.

> muh polymorphism.
Real life isn't fucking college

>below 100 LoC.
That's not even possible in C. C's type system is a joke if you actually look at it.

>Now do that in C, hard mode, below 100 LoC.

Do what?
Implement superfluous grammar to boost my already bloated language further into oblivion?

Thanks but no thanks.

Real life isn't your fizzbuzz either, stupid NEET. Get the fuck out, adults are talking.

Next time, actually ask a C++fag to get a grasp of what you are posting, low IQ primitive C ape

LMFAO... God, I want to shoot myself in the head any time I use this fucking shit language. I literally got so fucking frustrated interfacing with a C++ program that I used Unix sockets and just sent the fucking shit I needed over to an all-C process I authored.

>I was too dumb to FFI to C++ so I made another process in C
Another C ape

Do you type out the same code over and over again? Seems exactly the kind of shit a monkey scared of C++ would say.

>sepples fag acting all tough

kek

Fizzbuzz....
Try Embedded systems and Kernel Dev faggo
> MUH C++
Get the fuck out of here

Everything is pretty much authored in C including the infantile high level languages you use to escape any difficult thinking. Polymorphism doesn't actually exist btw nigger. It's a compiler trick for lazy faggots who don't want to author specific functions nor know how to generically organize data structures and operators.

en.wikipedia.org/wiki/Virtual_function

Attached: 1527546289632.png (1834x1200, 740K)
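
Since nobody in this thread actually posts code, here is a rough sketch of what the Wikipedia link above is describing: a C++ virtual call next to the hand-rolled function-pointer table it roughly lowers to (the "compiler trick"). The layout and all names below are illustrative only, not any real ABI and not anyone's posted code.

    #include <cstdio>

    // The C++ spelling: dynamic dispatch through a virtual function.
    struct Shape {
        virtual double area() const = 0;
        virtual ~Shape() = default;
    };
    struct Square : Shape {
        double s;
        explicit Square(double side) : s(side) {}
        double area() const override { return s * s; }
    };

    // Roughly what it lowers to: an object carrying a pointer to a table of
    // function pointers. Purely illustrative; real vtable layouts differ.
    struct shape_vtbl { double (*area)(const void* self); };
    struct square_c   { const shape_vtbl* vtbl; double s; };

    static double square_area(const void* self)
    {
        const square_c* sq = static_cast<const square_c*>(self);
        return sq->s * sq->s;
    }
    static const shape_vtbl square_ops = { square_area };

    int main()
    {
        Square sq{3.0};
        const Shape& shape = sq;
        std::printf("%f\n", shape.area());          // indirect call via the vtable

        square_c csq{ &square_ops, 3.0 };
        std::printf("%f\n", csq.vtbl->area(&csq));  // the same trick spelled out by hand
    }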

I frankly didn't give a fuck. I did it for kicks so I could make it a talking point in a meeting for which I told the faggot to author his shit in C

>. It's a compiler trick for lazy faggots who don't want to author specific functions nor know how to generically organize data structures and operators.
What the fuck about polymorphism seems like magic to you? It's not laziness, it's making the compiler do the repeated work of defining the same functions over and over again. Too bad your primitive ape brain isn't able to comprehend what it means to not repeat yourself.

>Polymorphism doesn't actually exist btw nigger. It's a compiler trick for lazy faggots who don't want to author specific functions nor know how to generically organize data structures and operators.
Not all operations between inherited types are unique. Do you _actually_ write those fields for every inherited object?

>kernel dev
I see you haven't done anything complex yet.

No, what I do instead is author data structures properly and organize my program accordingly. I'm guessing you've never heard of function pointers and function registries? They're what you use when you're doing large-scale dev and interfacing with multiple layers of software whose clients need hooks into your infra. You know... like the Linux kernel.

sites.fas.harvard.edu/~lib215/reference/programming/linux_functionptr.pdf
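
For anyone who hasn't seen the pattern this user and the linked PDF are describing, here is a bare-bones sketch of a function-pointer registry in the file_operations style. Everything in it (device_ops, run_device, the null_* functions) is invented for illustration, not taken from the kernel or from the thread.

    #include <cstdio>

    // A driver fills in this table of function pointers and hands it to the
    // framework; the framework only ever calls through the table.
    struct device_ops {
        int  (*open)(void);
        int  (*read)(char* buf, int len);
        void (*close)(void);
    };

    static int  null_open(void)               { std::puts("open");  return 0; }
    static int  null_read(char* buf, int len) { (void)buf; (void)len; return 0; }
    static void null_close(void)              { std::puts("close"); }

    // "Framework" code: it has no idea which concrete functions it is calling.
    static void run_device(const device_ops* ops)
    {
        char buf[16];
        if (ops->open() == 0) {
            ops->read(buf, sizeof buf);
            ops->close();
        }
    }

    int main()
    {
        device_ops null_device = { null_open, null_read, null_close };
        run_device(&null_device);
    }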

> Unironically referencing something as niggerish as polymorphism

How about y'all niggers mix C++ and C? They are both good for different things.

>I'm guessing you never heard of function pointers and function registries? I
Lambdas and closures exist, you primitive ape. Yes, I know way more than you do.
>I author data types properly
AHAHAHAHAHA, ape, you pass function pointers around instead of using static dispatch, which is not only type-safe but also fits right into the type system.

>function pointer is an alternative to polymorphism

Attached: 8nRqoXW.jpg.png (800x729, 48K)

> I'm the kind of guy who writes a program that does the same thing over and over again in subtly different ways, and I think the only way to efficiently resolve this is polymorphism. I ignore type-check inefficiencies at runtime, and I've never worked in an enterprise environment where it's demanded that everything be as EXPLICIT as possible to avoid issues.

Cniles BTFO yet again

I see you've never used them nor understand what I'm referring to.

>I'm the kind of guy who writes a program that does the same thing over and over again
Exactly. Your primitive ape brain isn't capable of letting the compiler do the repeated work, so you use your ape hands more.

>muh "subtly different snowflake implementation"
You're an ape who doesn't understand parametric polymorphism. I bet you use void* too.

> rest of gibberish
I don't speak monkey language

xml.com/ldd/chapter/book/ch02.html

> I see you haven't done anything complex yet.
I see you're a Node.js employee at a bean-flicking website, whose dev language was developed in C and runs on a C-based operating system providing all of the complex operations that allow you to use abstractions that don't make your brain burst when writing code.

>lose argument
>u are node js XD
The clear defeat of a low IQ C ape that doesn't understand C++

C autists VS C++ niggers ITT

Attached: 1522345171681.jpg (712x533, 36K)

These exchanges are the most hilarious fucking things I ever encounter. This pic says it all:
Unironically... your dev life is concentrated on spouting off convoluted terms that ultimately do the same shit as all the greentext on the right, but you think you're somehow more intelligent? HAHAHAHHAHHAHAHA

> delegates
> inner class
> Polymorphism
> Parametric Polymorphism
> Dike centric normalized polymorphic inheritance

Are you listening to yourself, you dumbass?
All of this gay shit still runs on the same x86 ISA.
None of this gay shit exists at that level, so all you're yammering about are convenience features for brainlets who don't understand the lower-level aspects of computing. Are you seriously and unironically boasting about this? LMFAO.

If I ever sat your dumbass in front of a micro-controller with 1KB of memory, your head would explode. It's no wonder programs are filled with shit nowadays and take up gigs of space, and no, I'm not a boomer or an oldfag... I simply didn't take the easy way out in college and actually know how to efficiently use and instruct a computer at a low level. You're literally speaking gibberish with this high-level bullshit, and you're in no way, shape, or form more versed because you use features you don't even understand. You're by definition a brainlet. Because you don't have the brain to understand the features, you 'objectify' someone else's body of ideas and structure.

Akari is cute, CUTE!~

>It's another "all this shit turns into 1s and 0s anyway" excuse
God I fucking hate C apes.

>i like sugary syntax because i'm a c++ code artisan

Attached: 1528736963009.jpg (483x589, 78K)

people that spend too long on C start to become afraid of abstraction.

>Parametric Polymorphism
>"convoluted"
Genuine LOL
The power of Cniles. No wonder only stupid Cniles use it. Even their compilers aren't self hosted any longer. kek

When it comes to embedded, the people who were most proficient in my experience were Haskellers. It's not the fault of the language that its user is a brainlet who failed introductory hw classes, so don't blame the high-level languages for attracting brainlets. It's akin to the brainlets blaming C for vulnerabilities and spouting such nonsense as "C is inherently unsafe" when, apart from Ada/SPARK, it's the only language you can use on an industrial scale to write a provably safe program.

It's not an "abstraction". It's just as real as ints and floats.

> Went to one of the top schools in America for C.S
> First language taught is an OO language because it's obviously the brainlet-tier part of the degree program and serves as training wheels
> Quickly merge off into C-only coursework because it's the basis for any real software
> Get into Verilog and RTL languages
> Get into Computer Architecture and see exactly why C is so prevalent and the basis for any sound body of work
> Get into Compiler Design and it most definitely makes sense
> Get into Embedded Systems (Now everything is coming together): in limited-resource environments with gimped compilers, there are no cute compiler tricks and the platform has no memory to support the bloated results. It's all just memory and operations on memory
> Get into algorithm design
> Start doing CUDA dev (A variant of C)
> Get into HPC computing dev
> Get into Distributed computing (C)
> Robotics (C)
> Industrial Automation (C)

Meanwhile, in fucboi land you have the LARPers who went down the EZ-mode web/cloud computing coursework track spouting that they're the most capable programmers...

Unironically they think they're justified in calling people apes as they knuckle-drag their way through library/API and kiddie-language land... *makes vidya games and authors web code and thinks he's on top of the world*

I fucking love these stupid exchanges...

>Lose argument
>C ape goes onto autistic LARPing
Same shit over and over again.

Wrong. Any nonterminal is an abstraction.

When you actually understand how a computer works, you learn there's no need for most of this convoluted bullshit, and that it is in fact used because brainlets are unable to understand more complex and efficient operations. A team of brainlets using C would destroy a company in a quarter because it requires you to have discipline and know what you're doing. So you get kiddie-glove languages that chaperone distributed development and code scaling.
devblogs.nvidia.com/easy-introduction-cuda-c-and-c/
As the hardware evolves... the cutting edge remains in C.
I have no clue what you're ranting about, as the next wave of computing is heavily reliant upon C and knowing your way around hardware. I guess there's TensorFlow for brainlets, but you're ultimately at the application layer for a reason... your brain is too small to handle more complex aspects of the OSI model.

If a nonterminal is an abstraction, so is a terminal symbol.

I don't disagree. I pay and hold much respect for such individuals. I've learned a lot from them. Thank you for your contribution to the thread.
BTFO. Just goes to show how uninformed high-level brainlets are about the underlying hardware

>next wave of computing is heavily reliant upon C
youtube.com/watch?v=86seb-iZCnI

Terminal as a concept definitely is an abstraction. Terminal in a specific language is most certainly not an abstraction - thus the term "terminal".

>It’s funny to read about the F-35 Lightning II jet fighter. It may be an example of a colossal project development failure, a projected US$1.3 trillion train wreck. Why is it so bad? It has often been pointed out that it suffers from software problems. So what’s the problem? Too much software? Inexperienced programmers?
>The project for the Joint Strike Fighter was supposedly the first DoD project which allowed the use of C++. Now for those that know the background to programming languages in the DoD, you will understand what a big issue this might be. The DoD spent a lot of effort on the design of Ada for exactly these kinds of projects. So why not use Ada? Ada is extensively used by Boeing; 99.9% of the code in the Boeing 777 is Ada. Perhaps software written in Ada would be too costly? Or maybe there are too few Ada programmers. According to reports on the net, the majority of the code is written in C++, with only 5% written in Ada.
>C++ is not a bad language per se. It just suffers from bloat. In the words of Richard E. Gooch, “It seduces the programmer, making it much easier to write bloatware”. This is somewhat of a concern when developing embedded and real-time systems. In addition, as early as 2002, the Certification Authorities Software Team (CAST) identified common problems with C++ in DO-178B-compliant applications, including compile- and run-time issues (the FAA recommends using DO-178B as the vehicle to demonstrate that software is airworthy). There are issues such as dynamic memory allocation in C/C++, which is forbidden, under the DO-178B standard, in safety-critical embedded avionics code. Some of the concerns with DO-178B have to do with the OO features of languages such as C++. Examples include: the overuse of inheritance, particularly multiple inheritance, which can lead to unintended connections among classes, and ambiguity resulting from inheritance, polymorphism, and operator overloading through dynamic or run-time linkage.

Attached: f35.jpg (1200x707, 222K)

>Terminal in a specific language is most certainly not an abstraction
Why is `BigInt` an abstraction but not `int`?

docs.nvidia.com/cuda/cuda-c-programming-guide/index.html
What is it called again? CUDA C++ or CUDA C?
CUDA spawned in C and later transitioned to C++ support for deeper brainlet penetration.

Thanks for highlighting this for me.
New waves always begin in C and then peter out to brainlets later on in OO frameworks because most people aren't that intelligent and need things packaged up in easy to use frameworks

>CUDA spawned in C and later transitioned to C++
Yeah, C compiler spawned in C and later transitioned to C++ as well.

C is dying out, no need to beat around the bush. C++'s type system is far more sophisticated and more flexible to accommodate smart programmers.

Depends on what `BigInt` is. In SQL, that would be a terminal - not an abstraction. In Java, that would not be a terminal - thus an abstraction. In C, which was most likely used to implement the DBMS that interprets your SQL statements, `BigInt` is not a terminal - thus an abstraction.

>it's a nonterminal
No no, tell me the actual reason why `int` wouldn't be an abstraction. I mean, what exactly does it mean to be an `abstraction` after it's compiled?

> C++ is not a bad language per se. It just suffers from bloat.
> It just suffers from bloat.
> It seduces the programmer, making it much easier to write bloatware
Brainlets using ez mode programming languages write shitty code.
> This is somewhat of a concern when developing embedded and real-time systems.
C is a glorified pleb filter and keeps brainlets out of this challenging domain
> common problems with C++ in DO-178B-compliant applications including compile and run-time issues (the FAA recommends using DO-178B as the vehicle to demonstrate that software is airworthy).
Well well well.. the higher up the stack, the more retarded things get
> There are issues such as dynamic memory allocation in C/C++, which is forbidden, under the DO-178B standard, in safety-critical embedded avionics code
OO Brainlets would neck themselves here
> Some of the concerns with DO-178B have to do with the OO features of languages such as C++. Examples include: the overuse of inheritance, particularly multiple inheritance, which can lead to unintended connections among classes, and ambiguity resulting from inheritance, polymorphism, and operator overloading through dynamic or run-time linkage.
And the MOTHERLODE of why OO is dogshit for any serious software but is perfectly suited for brainlets...

FUCKING /THREAD

>> There are issues such as dynamic memory allocation in C/C++, which is forbidden, under the DO-178B standard, in safety-critical embedded avionics code
>OO Brainlets would neck themselves here
Why? Is it not possible to "do OOP" without dynamic memory allocation?
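
To answer that directly: yes, it's possible; nothing about virtual dispatch requires the heap. A minimal sketch (mine, not from the thread) with every object on the stack, which is roughly the shape that "no dynamic allocation" rules like the DO-178B one quoted above push you toward:

    #include <cstdio>

    struct Sensor {
        virtual int read() const = 0;
        virtual ~Sensor() = default;
    };
    struct FakeTemp     : Sensor { int read() const override { return 21; } };
    struct FakePressure : Sensor { int read() const override { return 1013; } };

    // Dynamic dispatch through a base reference, no new/delete anywhere.
    static void log_reading(const Sensor& s)
    {
        std::printf("%d\n", s.read());
    }

    int main()
    {
        FakeTemp t;       // stack-allocated
        FakePressure p;   // stack-allocated
        log_reading(t);
        log_reading(p);
    }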

That's exactly the reason. Why is it so hard to understand what an abstraction at the language level means?

C++ isn't about OOP.

So being an abstraction or not means absolutely nothing to machines? Is it just a lexical term? There's gotta be at least something, considering how the term `abstraction` is demonized. Come on.

> A layer of the OSI model is going away
> mfw dumb fag has no idea what the OSI model is..
Pull up an ISA and look up what an integer and floating-point pipeline is. Those are native data structures to the hardware, you dumb fuck. Every type that is non-native to the hardware and ISA is a goddamn 'abstraction':
en.wikipedia.org/wiki/Primitive_data_type

GET IT? Presentation layer boy

C++ is dogshit

>5 years before you can even call yourself "intermediate"
Brainlet confirmed.
It doesn't take that long to master C++. Literally all the "C++ is so hard" memes come from literal braindead drooling brainlets.

Reminder that that image is old and can be reduced much further down in modern C++.

>c++ pajeets will never recover from this thread

Attached: 1527774956736.jpg (997x1413, 979K)

>OSI Model
Why did you bring that irrelevant, obsolete model up?
>Every type that is non-native to the hardware and ISA is a goddamn 'abstraction':
Earlier you said it's not an abstraction when it's a `terminal`?

Hello, mister Dunning-Kruger

No one said C++ is hard, which is why it's one of the first languages taught at universities. It is one of the easiest, as is OO, which is why it's taught to freshmen. As you progress, you move on past OO languages into more powerful and challenging languages that can be used universally at any level. This is where most brainlets drop out of CS or head down the cloud computing/web dev/database brainlet track.

The fact that you morons are actually and unironically trying to pass C++ and OO bullshit off as intellectually superior makes me believe the broad majority of you don't even have STEM degrees, and if you do, you went to dogshit universities and down dogshit paths of CS. In this way, your idiotic comments amount to butthurt REEEing about never being able to get into the more challenging aspects of CS. Ultimately I have no clue who you think you're fooling besides yourselves. You look like total fucking morons spouting off this retarded shit.

Sepplesfag here.
All I've taken from this thread is that Cfags have absolutely no idea what they're talking about.
And it's funny because they think they're winning.

>The fact that you morons are actually and unironically trying to pass C++ and OO bullshit off as intellectually superior
Actually read the posts you respond to. Literally NO ONE is claiming C++ as intellectually superior.

Aren't you the one who calls parametric polymorphism convoluted? Why do you think you are not a brainlet?

Indeed, after interpretation, abstraction means nothing. Unfortunately, during interpretation, the composition of certain nonterminals leads to really bad code being emitted; that's why abstractions are generally demonized when it comes to embedded - there are not enough resources to compensate for it. Having less general languages helps - GHC switched to LLVM for a reason.

C++17 ver
// needs <memory>, <type_traits>, <utility>
template <typename Creator, typename Destructor, typename... Args>
auto make_resource(Creator c, Destructor d, Args &&...args)
{
    auto p = c(std::forward<Args>(args)...);
    return std::unique_ptr<std::remove_pointer_t<decltype(p)>, Destructor>(p, d);
}
And now every C resource you've ever used supports RAII at no cost. Why is this a bad thing again?
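
A hedged usage sketch of the snippet above (restated here so it compiles standalone), assuming the intent was to deduce the deleter from a plain C create/destroy pair such as fopen/fclose:

    #include <cstdio>
    #include <memory>
    #include <type_traits>
    #include <utility>

    template <typename Creator, typename Destructor, typename... Args>
    auto make_resource(Creator c, Destructor d, Args &&...args)
    {
        auto p = c(std::forward<Args>(args)...);
        return std::unique_ptr<std::remove_pointer_t<decltype(p)>, Destructor>(p, d);
    }

    int main()
    {
        // fopen is the creator, fclose the matching destructor.
        auto f = make_resource(std::fopen, std::fclose, "log.txt", "w");
        if (f)
            std::fputs("RAII over a plain C API\n", f.get());
        // fclose runs automatically when f goes out of scope.
    }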

>auto

Abstraction is lifting your reasoning to the problem's domain.
C is no different in this sense: it abstracts away the ISA and provides useful tools like functions, types and the call stack. The main problem with C programmers is they think they are at the machine level, which is completely wrong.

This auto is not what you think it is

C is a bloated piece of fucking shit for drooling brainlets trying to escape difficult thinking.
Use assembly or fuck off from computers.

> non relevant obsolete model up?
Now the Application and Presentation Layer boys think they're FULL STACK ..
WE WUZ KANGZ.. There are no layers.. we command them all
AHAHHAHAHAHHAHAHAHAHA
> Earlier you said it's not an abstraction when `terminal`?
Wrong fag. I give definitive answers when I post replies. Try and keep up. Anything non-native to hardware is an abstraction.

>c++
>pajeet

i think you mean java. the average poo in loo would see a auto&& and turn 360 degrees and walk away

an average Cnile would do the same, to be fair

> See OP and 80% of the posts here.
> See Brainlet comedy

>Anything non-native to hardware is an abstraction.
So anything non trivial is an abstraction?

^based poster absolutely blowing handlefag the fuck out.
> mfw you can tell someone has no clue how hardware functions.

Only reason to use C++ is if you want to throw sand into your eyes

Attached: 1494074448412[1].png (1620x774, 40K)

>Unfortunately during interpretation, composition of certain non-terminals leads to really bad code being emitted, that's why abstractions are generally demonized
Uh, it doesn't look like it's the fault of an abstraction, rather the fault of an implementation

>It's another episode of brainlets thinking abstractions must have runtime costs.
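
That post is easy to illustrate with a toy of my own (Meters is just a made-up name): an abstraction that adds type checking with no runtime cost. Whether a given abstraction is "zero-cost" obviously depends on the abstraction, which is the whole argument here.

    // A thin abstraction: a strongly typed wrapper around a plain int.
    struct Meters { int value; };

    constexpr Meters operator+(Meters a, Meters b) { return Meters{a.value + b.value}; }

    // The type checking exists only at compile time; with optimization on,
    // adding two Meters typically emits the same instruction as adding two ints.
    constexpr Meters total = Meters{2} + Meters{3};
    static_assert(total.value == 5, "evaluated entirely at compile time");

    int main() {}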

Because we don't need it. A void pointer will do the job much more easily; overcomplicating things makes you dumber, not smarter.

It's the fault of the language designers writing retarded rules for the abstraction. It is possible to have some abstractions generate very short terminal sequences, but that's 1) true only for a certain class of abstractions; 2) very hard to do.
When you add long terminal sequences to the inefficiencies in the interpreter, you get a pretty noticeable performance hit. Something you just can't afford in embedded for the most part.

> A void pointer will do the job much more easily
And less type-safe, more verbose, and less readable.
The only reason you'd say templates are complicated and suggest a void pointer is that you suffer from terminal baby-duck syndrome.

Not him, but yeah, anything that's a composite data type, i.e. "non-trivial", is an abstraction.
Hardware optimisation in almost all cases means that a program doesn't rely on an excess of computing resources to disentangle the composite spaghetti, instead using primitive types that the hardware in question can natively work with.

Cniles proving they don't know anything.
Do you even know what auto&& is or does?

A type-deduced rvalue reference?

>and less type-safe, more verbose and less-readable
And more importantly, slower.

Not necessarily.

Oh right, not every && is an rvalue reference. Oh well, the wonders of C++.
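
Since the thread never spells it out: auto&& is a forwarding reference, so "not necessarily" is correct; it deduces to an lvalue or rvalue reference depending on what you bind it to. A toy sketch of my own, not anyone's posted code:

    #include <type_traits>
    #include <vector>

    int main()
    {
        std::vector<int> v{1, 2, 3};

        auto&& a = v;                    // binds an lvalue -> std::vector<int>&
        auto&& b = std::vector<int>{4};  // binds an rvalue -> std::vector<int>&&

        static_assert(std::is_lvalue_reference_v<decltype(a)>, "a is an lvalue reference");
        static_assert(std::is_rvalue_reference_v<decltype(b)>, "b is an rvalue reference");

        // The same rule lets range-for bind safely to both containers and temporaries.
        for (auto&& x : std::vector<int>{5, 6, 7})
            (void)x;
    }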

>And more importantly, slower.
I'd love for you to explain why.

> Abstraction is lifting your reasoning to the problem's domain.
All problems are reducible to data and data operations, which is why so many domains can execute on computer hardware that does just this. It's important to remember this and not go off smelling your own farts in framework land. If all you are is a pleb cog in the wheel churning out new API features for an enterprise sweatshop full of low-rate contractors you don't want fucking everything up, then obviously languages have evolved for this PROBLEM DOMAIN. Don't, however, confuse the evolution of a language to save software development from idiots in large numbers with the actual applied problem being solved.
> C is no different in this sense: it abstracts away the ISA and provides useful tools like functions, types and the call stack.
Thus why it's powerful, pervasive, has lasted so long, and will continue to last: it's the tool that takes hardware sensibly into the software domain with no bloat or bullshit, letting an end user have the hardware at their fingertips.

> The main problem with C programmers is they think they are at the machine level, which is completely wrong.
There is no problem with C programmers. You hardly ever hear from them because they're too busy getting actual shit done. There's nothing we really talk about because the language is straightforward. It's more about what big brain you have to understand what to do w/ it. We talk about what we do w/ the language not carry on like insufferable pseudo-intellectuals about the idiosyncrasies about the dogshit features the language has to offer. Two different types of people it seems.. You busy yourselves talking about what your language can do while unironically not doing fuck all of significance with it... C fags have nothing to say about the language and talk about the insanely amazing things they do w/ it.

>Not every * is multiplication. Oh well, the wonders of
boomer Cnile having a mental breakdown

Using void* instead of concrete types means you'll have more indirection (causing cache misses) and more heap allocations.

I don't want to need a 2000-page manual with a thousand exceptions to every rule to program something.

>Not every && is logical and. Oh well, the wonders of
boomer Cnile having a mental breakdown

Don't forget that "generic" data structures can't be optimized for the specific type either, since the compiler doesn't know it.
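
To make the last two posts concrete, a small sketch (mine, not from the thread) of the type-erased void*-plus-function-pointer style next to the template style: std::qsort has to call the comparator through an opaque pointer, while std::sort sees the comparator's type and can inline it.

    #include <algorithm>
    #include <cstdlib>
    #include <vector>

    // C-style comparator: qsort only sees raw bytes behind void*.
    static int cmp_int(const void* a, const void* b)
    {
        const int x = *static_cast<const int*>(a);
        const int y = *static_cast<const int*>(b);
        return (x > y) - (x < y);
    }

    int main()
    {
        std::vector<int> v{3, 1, 2};

        // Type-erased: every comparison is an indirect call through cmp_int.
        std::qsort(v.data(), v.size(), sizeof(int), cmp_int);

        // Typed: the lambda's type is part of the instantiation, so it can be inlined.
        std::sort(v.begin(), v.end(), [](int a, int b) { return a < b; });
    }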

Ask me anything about C and I can most likely answer it without consulting a reference. I don't think there is anyone in the world who can do the same for C++.