The problems with C (and good parts)

Using only undebatable facts.
>Pointers
They can be NULL.
They can point to uninitialized memory.
They can live beyond the object they refer to.
Trying to make "smart pointers" in pure C via a safe wrapper leads to boilerplate hell and/or macro hell.
They can also alias, hindering these optimizations (a quick sketch follows the list):
- Code re-ordering to improve instruction scheduling
- Loop optimization
- Constant propagation
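To illustrate the aliasing point, a minimal sketch (not the OP's code, names made up): without restrict, the compiler must assume dst and src might overlap, so it cannot freely vectorize or reorder the loop.

void scale(float *restrict dst, const float *restrict src, int n, float k)
{
    for (int i = 0; i < n; i++)
        dst[i] = src[i] * k;  /* restrict promises stores to dst never alias src, so the loop can be vectorized */
}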
>Types
C is weakly typed.
You can cast freely.
There is no type safety as a result.
>Undefined behavior
You can divide by zero.
You can accidentally fail to return a value from a function that should return one.
In C89 alone, there are over 100 other instances of UB.
The optimizer may miscompile your code if you violate an assumption it is entitled to make, such as the strict aliasing rule.
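A rough illustration of the strict-aliasing point (a sketch, hypothetical function name): the compiler is allowed to assume an int* and a float* never refer to the same object.

int set_and_read(int *i, float *f)
{
    *i = 1;
    *f = 2.0f;   /* if f actually aliases i, this violates strict aliasing (UB) */
    return *i;   /* so the compiler may legally fold this to "return 1" */
}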

The good parts about C:
It is portable, and generates small binaries.
Like assembly, it is fast if written well. Keep in mind that well-written code is more a product of developer time and pain than of developer skill.
This makes it good for embedded and OS dev.
It also lends itself to use in hyper-optimized code such as cryptography impls or interpreter main loops.
Highly optimized C outperforms highly optimized code in virtually every other systems language. Reasonably optimized C, however, falls behind well-optimized (and sometimes even poorly optimized) code in other languages.

>but muh Rust is a onions language for trannies
invalid argument. brainlet.
>you just need to get gud at C, elite programmers don't need the language to handhold them
No, you are a human. You will inevitably introduce errors. Sometimes they will be vulnerabilities. It is irrational to forgo basic safety measures when the use case does not substantially benefit from doing so.
>C has the largest set of analysis, verification, and other security toolkits of any language.
Other languages have a type system and basic safety measures that eliminate the need for most of them, and their configurations take much less time to maintain.

Attached: 1280px-The_C_Programming_Language_logo.svg.png (1280x1361, 81K)


I'd like to add that C is also very small, meaning that someone could feasibly know all of C, which is not the case for something like C++. How does this affect development? Is it always a good thing?

C also lacks exceptions. My boomer professor thinks this is the only valid criticism of C.

It's true, there are cases where it is very clear what the correct/best way to design and implement something is.
Heavily multi-paradigm languages such as C++, Python, or JS may lack some design philosophy. This may lead to consequences in developer productivity and sometimes design consistency.
While it is easy to know all of C on the surface and write the same C that other C devs write, there is often real cognitive load in understanding what the compiler does in order to generate better binaries.
My personal opinion is that the time lost to menial security checks and to wrangling various analysis/security tools in C detracts from the time I would be spending on higher-level design, which has a vastly greater impact on performance and safety, if I were writing in something like Rust, OCaml, or even fucking golang.

I thought C would be a good language to learn first. What do you recommend instead, OP?

Exceptions can be implemented, although they are terribly clunky due to the boilerplate and the precarious safety of any handwritten code that does abnormal transfer of control. I think he is really referring to the semipredicate problem: you often cannot distinguish valid returns from error returns without a big-ass wrapper.
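A minimal sketch of the usual workaround (hypothetical function, not anyone's real API): return a status code and hand the real result back through an out-parameter, so an error can never be mistaken for a valid value.

#include <errno.h>
#include <stdlib.h>

/* returns 0 on success, -1 on error; the parsed value goes in *out */
int parse_port(const char *s, int *out)
{
    char *end;
    errno = 0;
    long v = strtol(s, &end, 10);
    if (errno != 0 || end == s || *end != '\0' || v < 1 || v > 65535)
        return -1;
    *out = (int)v;
    return 0;
}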

Python or JavaScript if you want to learn quickly and make money. Maybe even Java because if it's your first language you might not feel oppressed by its verbosity.

>Safety
A correct application should check for all errors, right? This drove me crazy when I tried it. My solution in my hobby projects is to do absolutely zero error checking, and at my job everyone just adds checks and debug statements in the places they think the code is most likely to fail...
For example, take the OpenGL API... Errors are retrieved with glGetError(). But the spec says that any function may fail with GL_OUT_OF_MEMORY, and after that OpenGL is in an undefined state. So glGetError() may fail with GL_OUT_OF_MEMORY too. So after checking whether a function call failed with GL_OUT_OF_MEMORY, you have to check whether glGetError() itself failed with GL_OUT_OF_MEMORY, and so on it goes. So it's impossible to write a "correct" OpenGL application.
Also, doubles and other floating point numbers. Let's say you want to get the time, and the function returns a double. Do you make sure that it is not denormalized, not negative/positive infinity, not a quiet/signaling NaN, within valid range, and not less than the previous time? (A sketch of such checks follows.)
For hobby applications, I think that zero error checking is a valid route to take.
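For what it's worth, the paranoid float checks above look something like this (a sketch only, using the <math.h> classification macros; the function name is made up):

#include <math.h>
#include <stdbool.h>

bool time_is_sane(double t, double prev)
{
    if (!isfinite(t))          /* rejects +/- infinity and both kinds of NaN */
        return false;
    if (t < 0.0 || t < prev)   /* negative, or going backwards */
        return false;
    return true;
}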

Python and JS will make vidya creation inefficient though, won't they? And the only game I can think of that's gotten anywhere made in Java is Minecraft. I do not consider Minecraft a good game.

Am I wrong to think these things? What about C++? (I intended to learn this after C, but if C won't teach me anything worthwhile...)

But why even use C for hobby applications unless it's embedded?

No, JavaScript games are easily 60fps unless you're doing 3d and even then people have made smooth 3d games in Python. Also Unity uses JavaScript so I guess you can make fast 3d games with it.

It is certainly an important language to know, for a few reasons:
- Its syntax and language constructs are ubiquitous. If you know C, you can much more easily pick up another language.
- It is a great example of many bad mistakes in language design. If you understand those and why they're a problem, you better understand why more modern languages are the way they are.
- Working extremely low-level gives you a better understanding of the underlying layers of abstraction, in turn enabling you to write better code on all layers of abstraction.

If you want to do systems stuff, I'd learn C first. You'll appreciate Rust and parts of C++ much more afterwards, and understand the implementation of the features they employ.
It also gets you into a mindset which reasons about code generation.

If you want to play with ADTs, HOFs, functors, and type/category theory, learn something like ocaml and join the nerd herd.

If you want webdev or easy money look on hacker news for the js framework of the month.

I'm sorry but making a quality game in Python or JS just sounds like trying to cut a pizza with a sharpened dildo. And UnityScript isn't 1:1 JavaScript as far as I know.

This is exactly what I was thinking. Thanks. Back to work.

>I'm sorry but making a quality game in Python or JS just sounds like trying to cut a pizza with a sharpened dildo.
That's because you're retarded.
t. game developer

Many languages have built-in facilities for better error handling. If you wanna prototype something and don't need the foot-shooting freedom of C, then don't use C. If you're using OpenGL, C++ has a native OpenGL interface. Rust and others have bindings.

>Simplicity
I know the language very well. I like to know my language very well. I've been programming exclusively in it for over five years, and recently started doing so professionally. I could say I know 95% of C. Obscure shit like 5[array] or int foo(int arr[static 1]), pitfalls like signed integer overflow, different-type pointer aliasing and so on (a short demo follows)...
If I used C++ after a decade of intensive programming I would know something like 80%.
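Quick demo of two of those "obscure" bits, as a sketch:

#include <stdio.h>

/* arr[static 1] tells the compiler the caller must pass at least one element
   (in particular, not NULL); some compilers warn on violations */
int first(int arr[static 1]) { return arr[0]; }

int main(void)
{
    int array[] = {10, 20, 30, 40, 50, 60};
    printf("%d %d\n", array[5], 5[array]);  /* a[i] is *(a + i), so both print 60 */
    printf("%d\n", first(array));           /* prints 10 */
    return 0;
}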
>Performance
This kills Lua, JS and other minimalist scripting languages. I still use Lua where performance doesn't matter.
>Bindings
Virtually all the APIs are in C. OpenGL too.
This kills golang.
> Field
My job is embedded. My hobby is game engine-dev. The hobby is 100% dominated by C++, but as I said I hate it.

Really, I just like the language, that's the main reason, and I don't feel like I should be defending it.
I like assembly too, I just don't find a realistic niche to use it.

>In c89, ...

Attached: Time-for-an-.jpg (658x1000, 52K)

That's not obscure. I knew everything you mentioned before I wrote hello world in C, just by lurking and reading. A 2d game can easily be made entirely in LuaJIT with top tier performance btw. I don't like the creepy religious adherence to the shitlang known as C. It's like a masochistic cult.

This doesn't solve the API problem.
As I said, in any language it's impossible to write a safe/correct OpenGL program.
Every function can fail with GL_OUT_OF_MEMORY.
The function used to check for errors can fail with GL_OUT_OF_MEMORY too.
Built-in error handling doesn't address that.
Also, I'm not aware of any built-in language facilities that help with dealing with floating point values... Maybe Ada?

If you don't want to defend it, then don't. You just walked in here and defended it.
If you like it and are ok with the relatively large time cost of maintaining performant and safe code, then nobody wants to stop you.

In fact, since you work professionally in embedded, C is basically the gold standard. There is no argument there until there is a competitor which is safer and generates even better binaries.

You can still write better error handling facilities in a wrapper API around it.
If you can't write a correct program against OpenGL, that is an OpenGL problem. Frankly, unless you're writing something like parallelized crypto, it is likely that you don't need a "correct" system, especially for a hobby.
Making it error tolerant rather than error-free is likely a better choice.
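Such a wrapper could look something like this (GL_CHECK is a hypothetical name, just a sketch): instead of trying to prove every call infallible, drain the GL error queue after each one and log whatever comes out.

#include <stdio.h>
#include <GL/gl.h>

#define GL_CHECK(call)                                                \
    do {                                                              \
        call;                                                         \
        for (GLenum e; (e = glGetError()) != GL_NO_ERROR; )           \
            fprintf(stderr, "GL error 0x%x after %s (%s:%d)\n",       \
                    e, #call, __FILE__, __LINE__);                    \
    } while (0)

/* usage: GL_CHECK(glBindTexture(GL_TEXTURE_2D, tex)); */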

Not sure about base language features for dealing with floats, but most languages expose at least rudimentary interfaces for floating point.

>shitlang
When you insult other's preferences, you will generally receive vicious responses.
>LuaJIT
You must write bindings for any library that doesn't already have them. And they all have C APIs. GLFW? libsoundio? nuklear? LZ4? And then you will probably have problems with WASM when porting to HTML5.
Bindings suck.
Also, what if you want your library/framework to be usable from other languages? C is the only viable option.
One of the best things, if not the best one, is that C is fucking *old*. So much stuff is written in C, it continues to be written out of inertia, and many languages will die off before C does. Ultimate job security.

Basically, I like C a lot, I don't understand why it triggers you so much. I do my hobby projects in C, I have a job where I work with C, and in my lifetime I will always be able to find a job to work in C.

It triggers me because it's a verbose pain in the ass without any useful features. Not to mention ctards always write horrible code in other languages too. Every top programmer in my school hated C. And the only professors that liked it had lots of 3d graphics or embedded experience.

Porting anything to Wasm is always gonna be a PITA, even with C and WASI.
Bindings to other languages don't have to suck 100%, maybe only 80%. Just don't use Lua, golang, or Python. Or Haskell. Or anything not running natively.
Inertia is nice for job security, not really for innovation. No opinion there.
Why do you care about someone else using a language you don't like?
What does top programmer mean?

>Why do you care about someone else using a language you don't like?
Idk, tell that to all the people shitting on Java and other useful languages.

>not obscure
What is obscure?
~a + 1 == -a (two's complement negation)?
named struct initializers? {.myvar = 3}
indexed array initializers? {[4] = 7}
unnamed structs/unions? struct {int a;}b;
unnamed struct/union fields?
struct Vec3 {
    union { float x, r; };
    union { float y, g; };
    union { float z, b; };
};
// Now v.x, v.y, v.z are the same as v.r, v.g, v.b, just like in GLSL

generic macros?
c11 threads and atomics and alignment?
_Noreturn?
restricted pointers?

Ancient ANSI function declarations?
Undeclared variable type defaulting to int?
The nightmare with variadic functions?
stackoverflow.com/questions/48008350/portable-way-to-retrieve-a-int32-t-passed-to-variadic-function
asked by me

VLAs?
Counting how many arguments in a variadic macro?
stackoverflow.com/questions/2124339/c-preprocessor-va-args-number-of-arguments
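The usual preprocessor trick discussed there, sketched for up to 8 arguments (it breaks for zero arguments, which is part of why it's a nightmare):

#define COUNT_ARGS(...)  COUNT_ARGS_(__VA_ARGS__, 8, 7, 6, 5, 4, 3, 2, 1)
#define COUNT_ARGS_(_1, _2, _3, _4, _5, _6, _7, _8, N, ...)  N

/* COUNT_ARGS(a, b, c) expands to 3 */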

This is quite a good book for learning C

Attached: 2019-08-31-074750_521x719_scrot.png (521x719, 119K)

c is as good as the programmer using it
brainlets like op can't handle not being hand-held by shit like garbage collectors

>You can divide by zero.
Dividing by zero is NOT undefined behavior.
It is PRECISELY defined behavior.

You literally have no idea what you are talking about, stop making things up.

that's some heavy duty same fagging there

>correct application should check for all errors, right?
No. Depends on what you define as correct and on your spec.
And an application can be free of any checks and still be correct if it works on static input and you prove that the input will never violate your spec.
/*@
  requires valid: \valid_read(a + (0..n-1));
  requires valid: \valid(b + (0..n-1));
  requires sep:   \separated(a + (0..n-1), b + (0..n-1));
  assigns  b[0..n-1];
  ensures  result:  \result == n;
  ensures  replace: Replace{Old,Here}(a, n, b, v, w);
*/
size_type
replace_copy(const value_type* a, size_type n, value_type* b,
             value_type v, value_type w)
{
  /*@
    loop invariant bounds: 0 <= i <= n;
    loop assigns i, b[0..n-1];
    loop variant n - i;
  */
  for (size_type i = 0; i < n; i++)
    b[i] = (a[i] == v) ? w : a[i];
  return n;
}

>Pointers
>They can be NULL.
Why shouldn't they be?
>They can point to uninitialized memory.
A general problem of variables. Initializing everything is costly.
>They can live beyond the object they refer to.
That is the nature of the pointer. What's your problem with it?
>Trying to make "smart pointers" in pure C via a safe wrapper leads to boilerplate hell and/or macro hell.
Don't do it then.
>They can also alias, hindering these optimizations:
This is a compiler problem, not a language problem.

>Types
>C is weakly typed.
In a sense, but not really.
>You can cast freely.
That's the point of casting.
>There is no type safety as a result.
Programmer stupidity isn't the fault of the language.

>Undefined behavior
>You can divide by zero.
Do you want to check every single division in case the denominator is zero? Be my guest, but don't complain afterwards for the slowdown you would cause.
>You can accidentally not return a value in a function which should.
Indeed, that is a legit problem, it shouldn't allow it imho.
>In c89, there are over 100 other instances of UB.
UB is there to make the implementation for any system more flexible. If the standard was to define everything under the sun, almost no system would be standard-compliant.

I agree with the good parts.

C99 6.5.5p5 - The result of the / operator is the quotient from the division of the first operand by the second; the result of the % operator is the remainder. In both operations, if the value of the second operand is zero, the behavior is undefined.

How would a divide by 0 not be undefined behavior without introducing overhead to any divide by a runtime variable?
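For context, a fully defined integer division would cost something like this hypothetical helper, i.e. an extra branch whenever the compiler can't prove the denominator is non-zero; that is the overhead being objected to above.

#include <stdbool.h>

/* returns false instead of invoking UB when den == 0 */
static bool checked_div(int num, int den, int *out)
{
    if (den == 0)
        return false;          /* caller decides what dividing by zero means */
    /* note: INT_MIN / -1 also overflows (UB); ignored here for brevity */
    *out = num / den;
    return true;
}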

Because it's not, division by zero sets appropriate flags on most architectures.

Leaving it up to the architecture is undefined behavior.

Holy fuck the way you frame this is retarded.

The problem with cars:
Using only undebatable facts.
>You can drive them into buildings
>You can drive them into other cars
>They require gasoline to run

Yes, if you tell C to do retarded things, it will do them, because that's exactly how it's supposed to work.

C is the fucking rocket car of transportation and you act like it should also be just as safe and easy to drive as a bumper car.

Attached: BloodhoundSSC[1].jpg (2047x1152, 200K)

Not if you specify it in the standard of your language.

I just started learning C. What I'm wondering is: how are the OOP parts of a language handled in a functional language? Is it just a more tiresome copy-paste each time I want to make an object?

Why would you though? If the behavior is inherently illogical and results in a crash in pretty much every circumstance, what does the language gain by defining the behavior as architecture or OS defined?

What do you mean? C isn't OOP or functional.

yes. hence why i asked how to do the oop parts of a language in c

Makes it impossible for brainlets to rely on incorrect optimizations.

Can you give an example?

What do you mean? What OOP parts would you want to do in C? You can't do encapsulation in C because there are no private/public concepts and you can't do inheritance.

Sure, here's couple examples blog.llvm.org/2011/05/what-every-c-programmer-should-know.html?m=1

None of those had anything to do with dividing by 0. And they're all good optimizations and a testament to why UB is good.

yeah that sort of stuff. whats the approach to it? i get that a structure = class, but what about all the other stuff? The approach is doing my head in

>Pointers
No smart pointers.
`int64_t y = *(int64_t *) &x;` for retyping a double into an int is UB; you have to use memcpy (sketch below).
`sizeof(t) / sizeof(t[0])` for sizing an array, something nobody should do but everybody does.
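The memcpy version, for reference (a sketch; assumes double and int64_t are the same size, which holds on every common platform):

#include <stdint.h>
#include <string.h>

int64_t bits_of(double x)
{
    int64_t y;
    memcpy(&y, &x, sizeof y);  /* well-defined, unlike *(int64_t *)&x; compilers optimize the copy away */
    return y;
}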

>Types
Weakly-typed unions/enum.
No inheritance.
No iterators.
No generics.
1970s-tier error handling.
Array indexing not restrained to size_t.
argc is an int... signed.
No RAII.

>Undefined behavior
Implementation-defined sizes for primitive types. The average program likely contains thousands of unchecked and nonportable casts between ints of different sizes.
VLAs are standard, then optional, then... not even the standard knows (mandatory in C99, optional since C11).
So many functions nobody should ever use are still supported.
printf, when any sane language nowadays checks formatting at compile time or makes it zero-cost.
You can veil tons of UB behind a simple function call and not get a warning, even if the function body is in the same compilation unit.
abort instead of panic.

The approach is to not do it.

oh ok, but what's the way of doing what it would do? or is it an inherent flaw of functional programming

>Pointers
>They can be NULL.
>They can point to uninitialized memory.
>They can live beyond the object they refer to.
>Trying to make "smart pointers" in pure C via a safe wrapper leads to boilerplate hell and/or macro hell.
All of these are good things.
"Smart pointers" shouldn't exist in the first place, sine they are only desired by people who don't understand manual memory management.
Pointers having the ability to be null, prevents mistakes in the long run, since programs can then just segfault instead of continuing to run with non-inited memory (which can be a huge problem to debug).
Segfaults are comparatively easy to find and easy to fix.

Aliasing due to pointers is a disadvantage for optimization, but you can't design a memory-reference solution as powerful as C pointers without having aliasing issues.
There are ways of programming C that help overcome aliasing issues though (restrict, for example), so it's not the biggest problem in practice; as always, you just need to know what to do.

>Types
>C is weakly typed.
>You can cast freely.
>There is no type safety as a result.
Again, all of these can be good things. Most of the time when programming, C's type system is not super weak.
It has some implicit conversions which can be annoying (or helpful, depending on your outlook).
Typecasting and void pointers are super helpful and make C quite a powerful language.
Not many disadvantages there. More safety would mostly mean either a shittier, more restricted system, or less runtime performance.
I take the choice of less safety and more performance.

Too much UB is stupid in 2019, and basically one of the main disadvantages.
But C is a language with quite a lot of history, so it's natural that it's pretty unclean in some ways.


> Keep in mind that well-written code is more a product of developer time and pain than of developer skill.
Lol.

*Can* (simple) JS games hit 60 fps? Sure.
Can you ensure they keep a steady 60, or perform well if they do anything remotely interesting? Ha ha ha no.
UnityScript is dead

C isn't a functional language. And it depends on the use case. If you wanted a complex simulation that you're used to doing with inheritance, an ECS would be a good C solution.
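A bare-bones ECS-flavored sketch (hypothetical names, fixed capacity for simplicity): entities are just indices, components live in plain arrays, and a "system" is an ordinary function that loops over them.

#include <stdbool.h>

#define MAX_ENTITIES 1024

struct position { float x, y; };
struct velocity { float dx, dy; };

static struct position positions[MAX_ENTITIES];
static struct velocity velocities[MAX_ENTITIES];
static bool            has_velocity[MAX_ENTITIES];

static void movement_system(float dt)
{
    for (int e = 0; e < MAX_ENTITIES; e++) {
        if (!has_velocity[e])
            continue;
        positions[e].x += velocities[e].dx * dt;
        positions[e].y += velocities[e].dy * dt;
    }
}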

thanks

C is like a hammer. Yes you can hit your fingers if you're not careful but you can do a lot of things with one.

>Lua where performance doesn't matter
LuaJIT is faster than C in many cases

>many cases
Name five, with benchmarks

scheme
trollope.org/scheme.html

C++ and Rust don't use garbage collectors, cnile.

itt: brainlets that don't understand C and blame it for their own mistakes

>I thought C would be a good language to learn first.
C is absolutely the WORST language to learn first. Literally anything else is a better start.

Don't listen to the faggots, C is great to start with since it's very basic and makes you understand what is actually happening.

why? im learning it as my first and im enjoying it so far

Again, that's what I was thinking. There's no need for me to go balls deep and try to start learning with Assembly, it's not 1987. But the lower level the language I start with, the better, right? The syntax doesn't seem too complicated so far and when I get the hang of it I should be easily able to pick up any other language.

>Weakly-typed unions/enum.
So?
>No inheritance.
Inheritance is not needed unless you want to specifically structure memory that way which is very rare.
>No iterators.
iterators would be nice, but it's not a big deal for me.
>1970s-tier error handling.
error codes > exceptions
>Array indexing not restrained to size_t.
Why should it? It's a retarded idea to force type casting to size_t when indexing.
>No RAII.
RAII hides control flow and makes the syntax ugly. Managing memory is easy when you don't do OOP.

That's one way of looking at it, but lower levels are only useful for so much. In most (read: pajeet) development environments, what matters is how fast you're pumping out code. Bug fixes and efficiency are an afterthought. That's generally why Java caught on so well. But of course this changes when you are anything but sub-80 IQ. Although that said, even on the lower-level stuff you get hackjobs. Ask yourself your end goal. What is it?

>But the lower level the language I start with, the better, right?
At least C, since it's, as some other user said, very influential and still actually in use.

Attached: Copy-of-top-programming-languages-1.png (1000x1000, 58K)

I started with C as my first language, after hitting my head into C++. It was a breeze!

>muh safety

Attached: the-man-who-would-choose-security-over-freedom-deserves-neither-quote-1.jpg (620x800, 64K)

C is a garbage language that only trannies like because they associate it hacking. Everyone who's a real person hates C unless they do embedded or something like that.

>Lower is better
Lol. Literally the opposite; look at any top 100 university around the world, they start with Java, Python, Haskell.

t. pajeet

>See tranny in mirror.
>Call everybody else trannies.

Simpler languages = better class retention rates
Popular languages = higher employment

Both of those make the university look better. The system is designed to funnel you into a comfy 9-5, not necessarily make you adept with programming

Stop projecting, hacker larper. Everyone saw all you gay faggots at black hat.

>trusting MIT niggers

Attached: terry cool hacking sunglasses.png (495x365, 175K)

Can you have the good parts of C if you fix all the bad parts?

Basically sepples if you just use templates, vectors, classes and avoid memory management. It's alarming no one has made a language that's just that, an unbloated sepples.

Well yes, but 6.001 was much more brutal than 6.0001 will ever be. A few people each year generally do 6.001 as a supplement. The new course is just easier, not better, and unis that follow Java courses are generally weaker with their CS degrees. Who uses Haskell? Brown?

Some people have a better time starting by learning (roughly) how computers actually work (C providing a decent balance of practicality and directness here). Other people are better served by starting with high-level languages that require less work and knowledge to solve more complex problems in ways that are more abstract to the hardware but map better to the human concepts of what you're trying to do.

Understanding how things work at lower levels can be useful even if you're not gonna work there. Or vice versa, really.

well yes which is my point. it takes time and a brain to understand properly, which is why higher level coders are a dime a dozen

>you never know, I might someday want to store a 5 in a 3-element enum
>boy do I love writing if (error != 0) everywhere!!!
>you never know, I might someday want to index an array with a negative number
>yuck! I don't like smaller, more legible code it looks ugly! I love good old-fashion manual frees everywhere! It's not very hard therefore it's the most interesting part of being a programmer!!!!!
The coping is radiating through my screen.

>pointers can be null
>this is a "problem"
My dude. Pointers are integers. Integers can be zero, as well as lots of other bit patterns that are invalid pointers on whatever architecture. There is no way to have pointers in a language without needing to deal with the possibility that they're null. This is computer science 101.

>you never know, I might someday want to store a 5 in a 3-element enum
It's always easy to crash if you want to. Just write while(1);, but how many hard bugs will a strong enum actually solve in practice?
>boy do I love writing if (error != 0) everywhere!!!
Still better than exceptions.
>you never know, I might someday want to index an array with a negative number
You are indexing outside the array bounds, you can do the same with unsigned int.
>yuck! I don't like smaller, more legible code it looks ugly! I love good old-fashion manual frees everywhere! It's not very hard therefore it's the most interesting part of being a programmer!!!!!
If that is your reason to not use C then you are just obsessing over tiny problems while ignoring real problems.

>The result of the / operator is the quotient from the division of the first operand by the second
So no single C compiler implements the standard?

>In both operations, if the value of the second operand is zero, the behavior is undefined.
So it doesn't even mandate a specific floating point implementation? Except that of course no floating point implementation can be used since / almost never results in the actual quotient?

>How would a divide by 0 not be undefined behavior
Because it is precisely defined by the floating point implementation.
In the most common standard (IEEE 754), dividing any non-zero number by positive or negative zero results in positive or negative infinity.
Otherwise NaN, there is absolutely no ambiguity.

Most of the bad points are fixed by enabling compiler warnings (and turning them into errors) and using external tools. Also, since you brought up Rust: one bad part about Rust is that despite years of tool development, there are no good external tools (such as an IDE).

Attached: rust-2019-08-31_16.51.02.webm (818x456, 2.65M)

Alright friends, I have a question about C. I'm a statsfag, so there's a lot of programming I don't know anything about. K&R says that matrices are handled as arrays of pointers, and I am wondering if it's smart to extend this line of thinking to more complex data structures. What if, instead of an array of pointers all of the same primitive type, you create an array of void*'s with the base address of anything, and then you have two separate arrays of type int (with the same number of elements) that store the length and type information of your void* array. You can bring in all three arrays as arguments to a function and do pointer arithmetic to successfully dereference each element of the void* array. Could an 'object' be a struct that holds the base addresses of all three arrays? I don't know how else you would handle mixed types.

>matrices are handled as arrays of pointers
You mean multi-dimensional arrays? They're actually implemented as regular linear arrays with some added math to calculate the correct item from the given indexes.
>array of void* with two other arrays that store size and type info
Here's another idea, what if you create a struct that stores a void* to something, an (unsigned) int for size, and extra values for type info? Then you could just create an array of those structs, and it lets you talk about each item individually without having to access 3 different arrays. This is a simplified version of how actual dynamically typed languages handle type information in their data.
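Something along these lines (hypothetical names, just a sketch of the idea):

#include <stddef.h>

enum value_type { VT_INT, VT_DOUBLE, VT_STRING };

struct value {
    void            *data;  /* base address of the element */
    size_t           len;   /* number of elements (or bytes, by convention) */
    enum value_type  type;  /* tag telling the reader how to interpret data */
};

/* an "object" can then just be an array of struct value */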

That's a better approach, thank you for answering. For some reason I thought there was a sort of performance penalty in using structs compared to arrays. I imagine little inefficiencies wouldn't matter for normal C programs but an interpreted language would need to have a good way to handle complex data structures like you say.

>For some reason I thought there was a sort of performance penalty in using structs compared to arrays.
Actually there would be a bit of a performance penalty for using multiple arrays, since if you have all the data stored in one place, accessing all of the data in an item at once can be done via a single dereference, while using 3 separate arrays means you would need to dereference 3 separate pointers. Even in the case where you only want one part of that single item, the compiler can just add the appropriate offset to the address and get that specific part.
It's generally better to group together data that will often be used together. That said, if you really only need, say, all the pointers of the items without any of the extra info, then using separate arrays can help performance due to cache locality: all the data that's actually used sits together in one place, so the CPU has to do fewer fetches from RAM, whereas having it all be part of larger structs would spread that data out, so more cache lines would need to be loaded to get the parts it actually wants.
In your case though, I would think that you would want to always have those 3 things together, so using an array of structs would be better.
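Roughly, the two layouts being compared (field names are hypothetical):

#include <stddef.h>

/* array of structs: one dereference fetches everything about item i */
struct item { void *data; size_t len; int type; };
struct item items[1024];

/* struct of arrays: a loop that only touches len stays dense in cache */
struct item_arrays {
    void   *data[1024];
    size_t  len[1024];
    int     type[1024];
};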

Noted. Your answer clarified a sort of mental model of computational efficiency that I hadn't really developed from what I've read up until now. Good to hear that the sane choice of implementation for humans is more often than not what your cpu wants as well.

Really, all it boils down to is that choosing the right data structure can make all the difference when it comes to performance, and different data structures are optimised for different operations. Finding the structures that strike the best balance between fast and slow operations is one of the main things that separates a good program from a shit one, even down to things like "should I group these sets of data by what objects they represent, or by what each thing does?" You can look into common data structures and their different operations if you want to delve into this further.
Best of luck, user.

Is C# the redheaded stepchild? I always see people talking about C and C++ but never C#. Is it really shitty or just completely unrelated to the other two even though it shares the C name

it's closer to java
also a pajeet language with java

It's Java by Microsoft, pretty much completely unrelated.
It happens to be really shitty too, but not because of anything related to C/C++.

Technically Java is considered C family so Cshart is too. The C family is a bunch of niggers.

c# is utterly faulty