Why does everyone suck c's dick?

why does everyone suck c's dick?

Attached: c.jpg (512x512, 66K)

Other urls found in this thread:

state-machine.com/doc/AN_OOP_in_C.pdf
youtube.com/watch?v=EtZYaESX5M4

what does that even mean?

Attached: 973.gif (500x286, 898K)

It's big and tasty.

just werks

It's the most precise programming language besides assembly: you can write your software to do exactly what you want, with heavy performance optimization, security, etc., without relying on the work of others, who often have low standards and cooperate poorly.
It means that OP's favorite language isn't the most used one.

user gets it

i don't have a problem with C or anything, just wondering why it's so popular. might learn it for myself.

You have it backwards OP
C is one of the few that doesn't have a Code of Conduct for you to suck
Rust: "Must accept homosex. They have tasty semen"
C#: "Be friendly to the Indian replacement we have you training"
Python: *Pulls you aside at conference*

It was the original python-tier baby lang when all the alternatives were FORTRAN and Lisp and machine languages of all flavors.

It's not everyone, just fizzbuzzers on Jow Forums that learned pointers yesterday and think C is the only language they'll ever need to know and use.

Ok then user list your top 10 reasons why C is shit.

I never said it's shit. C has its uses but it's not suitable for every task.

not an argument.
fuck off

What a child. Good luck with that attitude.

because object oriented programming is too hard for the 30 year old boomers of Jow Forums

Attached: 1507179535305.jpg (3264x2448, 711K)

because it's a very simple and easy language, so all brainlets can learn it.

Did your parents never teach you to be polite? It isn't difficult, user. What's your excuse?

Go suck a dick you contrarian imbecile.

LMAO

You can use OOP in C...
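
Minimal sketch of what that usually means — a struct of function pointers acting as a vtable, with "inheritance" by embedding the base struct first. Names (Shape, Circle) are made up for illustration:

#include <stdio.h>

struct Shape; /* forward declaration */

/* the "vtable": one function pointer per virtual method */
struct ShapeVtbl {
    double (*area)(const struct Shape *self);
};

/* the "base class": every object carries a pointer to its vtable */
struct Shape {
    const struct ShapeVtbl *vtbl;
};

/* a "derived class": base goes first so the pointers line up */
struct Circle {
    struct Shape base;
    double r;
};

static double circle_area(const struct Shape *self)
{
    const struct Circle *c = (const struct Circle *)self; /* downcast */
    return 3.14159265358979 * c->r * c->r;
}

static const struct ShapeVtbl circle_vtbl = { circle_area };

int main(void)
{
    struct Circle c = { { &circle_vtbl }, 2.0 };
    struct Shape *s = &c.base;        /* "upcast" to the interface */
    printf("%f\n", s->vtbl->area(s)); /* "virtual" dispatch */
    return 0;
}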

Attached: 1519403754373.jpg (238x192, 14K)

It doesn't matter, OOP is bad for caches and thus useless.

>the most precise programming language

>Hey what does this C program do?
>"undefined behaviour, lol"
>"depends on the compiler, kek"
>"lol this is not ANSI C, stupid"
>"just change statements arbitrarily, because it's all just offsets, huehuehue"
>this is easy, we can just cast a pointer to the function pointer of the void pointer, so the returned function type holds the address of the type for the pointer.. what did I just say?
>my language can't into aspect or meta programming.. JUST ABUSE ""TEMPLATES"", BRAH.. xD

I think you're conflating compilers and retarded programmers with the language itself. Also
>>my language can't into aspect or meta programming
you can do pretty much anything in C.

>muh turing completeness
>"you can have classes in C"

Attached: you.jpg (645x729, 81K)

>implying you are smarter/have the time to do better than the people who wrote an actual good implementation/spec of a programming language

It's not even hard to implement classes in C, just dumb.

Brian Kernighan's personal charisma

Attached: MG_8525.jpg (3000x2000, 284K)

The most important thing is that you can actually learn all features of the language and thus read all code ever written in the language as long as the coders are human beings capable of documentation and reason.

Conversely, you *can't* do a bunch of cool stuff in outsource-to-pajeets languages like C#.

Well said, if only the entire computer industry had higher standards and better cooperation, it wouldn't be the clusterfuck that it is today. But gotta outsource to pajeets and fill our offices with womynz.

Attached: 1499116153485.png (572x505, 92K)

But when you outsource to pajeets you need more safety than C allows for. Even straight C has an entire layer of preprocessor before letting you touch shit and that's """no safeties."""

>computer industry my son, yuo are leader now
>you must choose programmer standards
>similar to electricians'?
>or the lowest in the modern economy?

OOP has nothing to do with cache usage.

OOP is references, references break cache.
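
To make the locality claim concrete (just the cache part, not the "useless" part) — a hedged sketch with a made-up struct, no benchmark numbers implied:

#include <stdio.h>

struct Particle { double x, y, z; };

/* contiguous array: sequential strides, prefetcher-friendly */
double sum_contiguous(const struct Particle *p, size_t n)
{
    double s = 0.0;
    for (size_t i = 0; i < n; i++)
        s += p[i].x;
    return s;
}

/* "reference" layout: each element separately heap-allocated,
   so every iteration chases a pointer and can miss the cache */
double sum_indirect(const struct Particle *const *p, size_t n)
{
    double s = 0.0;
    for (size_t i = 0; i < n; i++)
        s += p[i]->x;
    return s;
}

int main(void)
{
    struct Particle a[3] = { {1,0,0}, {2,0,0}, {3,0,0} };
    const struct Particle *ptrs[3] = { &a[0], &a[1], &a[2] };
    printf("%f %f\n", sum_contiguous(a, 3), sum_indirect(ptrs, 3));
    return 0;
}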

>JUST ABUSE ""TEMPLATES"", BRAH..
that's c++ you utter buffoon

yeah in C you abuse void pointers instead
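
Which, to be fair, is exactly how the standard library itself does "generics" — qsort() erases the element type and hands your comparator void pointers to cast back:

#include <stdio.h>
#include <stdlib.h>

/* the caller must know the real element type; the compiler can't check */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    int v[] = { 3, 1, 2 };
    qsort(v, sizeof v / sizeof v[0], sizeof v[0], cmp_int);
    printf("%d %d %d\n", v[0], v[1], v[2]); /* 1 2 3 */
    return 0;
}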

read a book or something

Attached: 1529631839339.jpg (480x360, 19K)

>read a book
Ok, I'm going to read Twilight; how is that going to help me with C?
>or something
How the fuck is the labelling on a Dr. Pepper going to help me with C either?

Attached: 1528911729084.gif (288x377, 1.83M)

I think he meant to read a book on best practices.

you obviously have never used C, just copypasting Jow Forums autism

you just described 90% of C's cult following on Jow Forums

what?

we have literally multiple Jow Forums threads on how C is bad at all times

No one knows what it means, but it's provocative

1. void *
2. (casting)
3. array[decay] (demo below)
4. "cstrings"
5. integer overflow undefined behaviour (demo below)
6. MACROS(X)
7. lol no modules
8. lol no generics
9. lol no ownership semantics
10. Unhandled errors everywhere
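
Quick demos of items 3 and 5, as I understand the complaints:

#include <limits.h>
#include <stdio.h>

/* item 3: `int a[10]` as a parameter is really just `int *a` */
void takes_array(int *a)
{
    printf("in callee: %zu\n", sizeof a); /* size of a pointer */
    (void)a;
}

int main(void)
{
    int arr[10] = { 0 };
    printf("in caller: %zu\n", sizeof arr); /* 10 * sizeof(int) */
    takes_array(arr); /* arr "decays" to &arr[0] */

    int n = INT_MAX;
    /* item 5: evaluating n + 1 here would be signed overflow,
       i.e. undefined behaviour; the optimizer may assume it never
       happens and fold (n + 1 > n) to true */
    (void)n;
    return 0;
}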

what do you mean by unhandled errors everywhere?

C(had) is literally the best language ever; you wouldn't have any modern computing stuff without him, so stfu. Oh, and his creator is a smug grandpa with longass hair and a beard, so he has to be the best.

That C has no exceptions, so errors are just return codes that nothing forces you to check.

References are pointers. If you use pointers in C, you break the cache too.

C is the best language because of its minimalism.
Its creators had to solve an issue (create a minimal system) and they solved it by making a minimalist language.
It is a step up from assembly, with simple calculations and automatic register magic.
The thing is, it allowed the simple use of libraries.
It is fast and readable, while being powerful and ubiquitous: the perfect balance.

Don't project your fedora tipping muh minimalism fetish onto C

You can do pretty much everything in unsafe code in C# (and also easily call C/C++ anyway), but you can't have reflection in C without a retarded amount of work.
And besides a very few, very specific use cases, there's never any reason to use C instead of C++.

C gives you absolute freedom and that’s why brainlets don’t like it

Are void pointers not a "best practice" for simulating polymorphism in C?

C is small and easy to understand. THAT is why brainlets like it.

C makes you "work harder, not smarter." That's what brainlets love to do. Everyone can work hard, but not everyone can work smart.

A C programmer feels a small sense of accomplishment when he fixes a bug, even when that bug is fundamentally impossible in C++. Still, he won't switch to C++. Why? Because he wants to feel like he is smart. He wants to show how hard he is working and how smart he is for being able to understand his own convoluted code.

It's like a guy who cuts off his legs so that he can win the special olympics.

If you're not developing low-level stuff such as drivers or kernels, don't use C! It has no memory management, its string handling is non-existent/bullshit, and everyone builds autist code that's massively leet-0ne-liner-yo!!!11111 where instead of having one command a line so people can look down and go "ok, ok, ok that's fine", it's 7 commands smashed into each other using autistic if(pointer->de-ref->pointer->de-ref) = true && 0x64 then d=pointer->de-ref && 0x32

fuck c

Attached: dealwithit480.webm (854x480, 1.48M)

Exactly. That's why nobody writing performant C code does operations on references.

but you feel so smart writing shit like that tho

wow look at this child-hater

>reflection
who cares
cute

Nigga, I've been searching online "how to implement polymorphism in C."

Seems like 99% of the answers I come across are C brainlets that don't even know what polymorphism is and wrongly think their examples implement it. Are you in that camp?

The other 1% of the answers suggest to use void pointers.

no i don't really get c, not at all. like i don't understand homeopathy... but i can tell it's bullshit

Pure autism. This is the typical C programmer, everyone.

With this logic pointers are also "useless."

Uh... can't you just say:
"nobody writing performant C++ code uses OOP"

What percent of applications are time-critical, anyways? I run into this same shit with the C programmers at work. We produce embedded products for consumers. Processors are so cheap that we have virtually unlimited processing power relative to what we want to do (assuming we don't fuck up anything in big O).

Still - I constantly have C programmers hitting me with these retarded suggestions to reduce cache misses. They spend so much time worrying about this BS that they hardly write any code. So, instead of hitting the deadline and making a product that works, they aim towards an "efficient" product, but run out of time to complete all the features. "oh, it'll take a few more months to complete, we're doing it the 'hard way,' so that it works correctly :)"

It's called "premature optimization"

It's not about what percentage of code is time-critical, it's about the 0.1% of code doing 1000000% of the work of the rest. Mature optimization.

That being said yeah you can write non-OOP C++, too. Pointers and references are fine if they're kept out of the way of doing work.

>performant
Did you mean "performance?"
Couldn't figure out what you meant past that typo.

It was a direct quote from the person I replied to.

Seems like you just have a hard time understanding things in general.

Might actually be the case, but here's a recent guide by a real organization: state-machine.com/doc/AN_OOP_in_C.pdf

Yeah, that's kind of my point. He said OOP was useless because it is bad for cache. That is only relevant in the small percent of code that is time-critical. The rest of the code (i.e. the other ~97%) could take advantage of OOP features with negligible drawback.

Also consider the fact that a lot of time critical applications deal with large amounts of data that must be allocated on the heap, so they must use pointers anyway.

Is there anywhere I can read about C void pointer abuse, so I can, uh... avoid such things?

lol xD

well he has the biggest dick

You should try.

If you started off with a pajeet language it's a hard transition though.

well then how would you optimize it later? migrate to c?

Straightforward translation into object code. Makes it useful for avionics stuff. A lot of pitfalls can be avoided by using a decent code standard.

You can just write C code in C++ if you wanted, in the critical parts. That's not necessary for optimization, but it is an extreme option you could take.

I read the example. IMO, it seems that this design naturally evolves towards the usage of void pointers...

The only real reason to avoid void pointers in this case is to avoid incorrect implicit conversions. Obviously, I'm not too familiar with C, so I don't know if the implicit conversion still occurs with typedefs of void pointers (but I assume it does).

The main flaw with the C way is that it requires a downcast whether you are doing this from a void pointer, or another pointer type. It's also generally uglier. C++ does the exact same thing, just with way less code. The C examples you provided are very verbose when the equivalent C++ code would be much simpler.
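
On the typedef question above: a typedef is only an alias, so a typedef'd void pointer still converts implicitly — the assumption holds. Tiny sketch, made-up names:

typedef void *Handle;

struct Foo { int x; };
struct Bar { int y; };

int main(void)
{
    struct Foo f = { 42 };
    Handle h = &f;     /* implicit: struct Foo * -> void * */
    struct Bar *b = h; /* ALSO implicit: no cast, no warning --
                          exactly the wrong-type bug being worried
                          about here */
    (void)b;           /* (dereferencing b would be UB, so we don't) */
    return 0;
}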

Forgot to attach pic

Attached: C polymorphism.png (927x1014, 117K)

what is the python one supposed to mean

C is horrible and a stain upon computing history. Entire classes of bugs and vulnerabilities would never have made it to production code had a more sensible language been used

Yeah, like Rust

Who jizzed in your cheerios?

why would anyone do that?

>Processors are so cheap that we have virtually unlimited processing power relative to what we want to do
You should want to do more than that then. Mudgrubber.

what?

It's ok. The world lost when we used C over Forth.

Forth
>Takes 3 simple functions and a few minimal helpers that are also simple to implement in ASSEMBLY
>Limitless abstraction
>Fast
>No bullshit about the compiler changing your code into arbitrary assembler because it's threaded code

your mum

I think the image is pretty self-explanatory. What part aren't you understanding?

Do you agree that interfaces without data (i.e. pure virtual interfaces) occur frequently?

Once you remove the data from the interface, then there is nothing left besides the vtable. So, you have a struct with one member. What's the point of even keeping the struct at all?

So, if you then move to delete the struct, you are left with a void pointer. Here the ONLY disadvantage to using the void pointer is that casting is now implicit. So, you might want to avoid that, which is why you'd go back to the struct with just one element.

I am trying to demonstrate how refactoring the code will lead you to a void pointer implementation. The original code didn't have void pointers, but it did have downcasting. The only real difference with my version is that the downcasting is implicit (without a cast). Besides that, it's the same thing. Either way you are downcasting, which is the problem with void pointers anyways.

So, I'm still convinced void pointers are the best way to implement polymorphism in C.
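
For anyone following along, here's the comparison in code — a hedged sketch with invented names, contrasting the one-member interface struct with the raw void* version:

#include <stdio.h>

struct LoggerVtbl { void (*log)(void *self, const char *msg); };

/* design A: interface struct whose only member is the vtable */
struct Logger { const struct LoggerVtbl *vtbl; };

struct FileLogger {
    struct Logger base; /* first member, so the pointers coincide */
    FILE *out;
};

static void file_log(void *self, const char *msg)
{
    struct FileLogger *fl = self; /* implicit void* downcast */
    fprintf(fl->out, "%s\n", msg);
}

static const struct LoggerVtbl file_vtbl = { file_log };

/* A: explicit interface type at every call site */
void use_a(struct Logger *l, const char *msg) { l->vtbl->log(l, msg); }

/* B: the struct deleted -- same downcast, now fully implicit */
void use_b(void *obj, const struct LoggerVtbl *vtbl, const char *msg)
{
    vtbl->log(obj, msg);
}

int main(void)
{
    struct FileLogger fl = { { &file_vtbl }, stdout };
    use_a(&fl.base, "via interface struct");
    use_b(&fl, &file_vtbl, "via raw void*");
    return 0;
}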

no I meant why would anyone try to achieve this in C?

>C is one of the few that doesn't have a Code of Conduct for you to suck
Neither does Object Pascal, and it's a better language, basically a fast-compiling C+.

Hmm, so I should go to my boss and ask for more requirements on my project? That way it's more of a challenge?

Efficiency isn't everything in programming. Sometimes actually getting something to work is more important. The difficulty of my job is not figuring out how to do things efficiently, but how to do them at all.

You are doing exactly what I said the C programmers do. Instead of making a product with n features that works, you are making a product that does just a few features "efficiently." My boss doesn't care if the code runs in 5ms instead of 10ms when we have ~90ms to spare in both cases. All he cares about is that I implemented the n features he asked for.

The difference between a plain function pointer and polymorphism is a fundamental one. Luckily, it's also easy to explain (I'll assume you already understand function pointers).

The advantage to polymorphism is that you can bind data to the function pointers (or a group of function pointers). The bound data is hidden to clients, which allows you to write very flexible code that only relies on the interface.

Plain function pointers don't have this feature. There is no way to bind data to function pointers, besides the roundabout way of emulating virtual functions that we are discussing.

Binding data to functions is just a common thing in programming, whether in C, C++, Java, etc. It's just a bit harder to do in C.
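
The usual C workaround, sketched — since a bare function pointer can't capture anything, the caller threads a context pointer through by hand (the same pattern as pthread_create's void *arg). Names invented:

#include <stdio.h>

/* a callback always travels with its "bound" data */
typedef void (*callback_fn)(void *ctx, int event);

struct Counter { int count; };

static void count_events(void *ctx, int event)
{
    struct Counter *c = ctx; /* recover the bound data */
    c->count += event;
}

/* "library" code sees only the interface, never struct Counter */
static void fire_events(callback_fn cb, void *ctx)
{
    for (int e = 1; e <= 3; e++)
        cb(ctx, e);
}

int main(void)
{
    struct Counter c = { 0 };
    fire_events(count_events, &c); /* the "binding" is manual */
    printf("%d\n", c.count);       /* 6 */
    return 0;
}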

Because it doesn't lie or hide shit from you

>if(pointer->de-ref->pointer->de-ref) = true && 0x64 then d=pointer->de-ref && 0x32
you obviously don't know C, and are possibly a pajeet because of the "is true" in the if

>where instead of having one command
>7 commands
commands? C is not composed of commands but of statements and expressions; that distinction has quite significant consequences

why diss C if you don't know it?

you can only explicitly cast pointers.

>hello sir, this is Mahorindahaganesh inc, we in reference to you're question of code want to say that we made #DEFINE is for the copy and pastings of other code.
Thanks you sirs and having a good evenings

tfw no C/C++ gf
youtube.com/watch?v=EtZYaESX5M4

Attached: New Gaymen C++.jpg (1179x810, 164K)

Yeah it's been holding computers back for 40 years. C is the worst thing to ever happen to computers.

>roads are the worst thing to ever happen to cars

It's the granddaddy of most modern programming languages, it doesn't do any handholding, and it doesn't come with piles of libraries. Oh, and it's fast as fuck.

It's essentially the anti-js

Attached: 1446405342634.jpg (221x246, 9K)

Just google python and dongle jokes

But why would you use C instead of a proper language if you're not writing a fizzbuzz?