C is my favorite language

#include <stdio.h>

int main(void)
{
    unsigned int x = 176;
    int y = -120;

    if (x > y)
        printf("%u is bigger than %d\n", x, y);
    else
        printf("%u is smaller than %d\n", x, y);

    return 0;
}

Returns: 176 is smaller than -120
So C is as bad as JS?

Attached: 1200px-The_C_Programming_Language_logo.svg.png (1200x1276, 77K)

Other urls found in this thread:

stackoverflow.com/questions/21627665/implicit-conversion-confusion-between-signed-and-unsigned-when-reading-kr-book
harmful.cat-v.org/software/c++/
nayuki.io/page/some-bit-twiddling-functions-explained
graphics.stanford.edu/~seander/bithacks.html
cs.princeton.edu/courses/archive/spr09/cos217/reading/ia32vol2.pdf

@70056074
nice bait

What did you want to accomplish by posting this bait shit?

Everyone knows C is a dangerously broken dead language that no one should be using. This is one of the reasons why Rust should replace C.

t. tranny

t. dinosaur still using C

Have you ever heard of the recency illusion?

I FUCKING HATE RUST THEIR TRANNY KEK RETARD SJW KEKITY KEK KEK TRANNY FAGGOT KEK STOP BULLYING C IT'S A PERFECTLY SAFE LANGUAGE

Attached: segfaulting cnile.png (627x722, 114K)

haha so funny.

rustfags in a nutshell

Rust is unironically better than C in every way. Too bad most tech companies haven't noticed that yet.

>knowingly compares unsigned numbers to signed numbers
>complains when the output isn't expected
>replying to a bait thread

gr8 b8 m8 i r8 0/8

If you work down the layers of whatever stack you're working with, you will almost inevitably reach C. You need to be able to use it while avoiding its pitfalls if you want to call yourself a serious software engineer.
The issues it has are therefore fairly irrelevant; feel free to bitch and moan about them, but if you want to get anything done past PHP code monkeying, you can't just ignore it and use Rust instead.

As a separate matter: Many people find that, as a result of extensive study and use of C during their career, they end up preferring it over many modern languages due to its simplicity and the ease of following the actual path of execution (very difficult in languages with exceptions and lambdas and whathaveyou). I'm one of those people, and C is my default language for any project that doesn't clearly call for something else.

Any comparison between signed and unsigned integers immediately rings alarm bells in my head; you simply shouldn't do that unless you're certain you know what you're doing. No developer would deliberately write what you did. If they wrote it accidentally, static analysis tools (and even simple compiler warnings) would point it out. It isn't a problem in practice.
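
If you genuinely need a mixed comparison, here's a minimal sketch of the usual pattern (the helper name is mine): handle the negative case explicitly instead of letting the implicit conversion decide for you.

#include <stdio.h>

/* true iff the unsigned value u is greater than the signed value s */
static int ugt(unsigned int u, int s)
{
    return s < 0 || u > (unsigned int)s;
}

int main(void)
{
    unsigned int x = 176;
    int y = -120;

    if (ugt(x, y))
        printf("%u is bigger than %d\n", x, y);
    else
        printf("%u is smaller than %d\n", x, y);
    return 0;
}

This prints "176 is bigger than -120", i.e. the answer OP expected.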

> "Conversion rules are more complicated when unsigned operands are involved. The problem is that comparisons between signed and unsigned values are machine-dependent, because they depend on the sizes of the various integer types. For example, suppose that int is 16 bits and long is 32 bits. Then -1L < 1U, because 1U, which is an unsigned int, is promoted to a signed long. But -1L > 1UL because -1L is promoted to unsigned long and thus appears to be a large positive number." - K&R
> Source: stackoverflow.com/questions/21627665/implicit-conversion-confusion-between-signed-and-unsigned-when-reading-kr-book
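
To see that paragraph in action on a typical 64-bit Linux box (LP64: int is 32 bits, long is 64 bits), a quick sketch:

#include <stdio.h>

int main(void)
{
    /* 1U is unsigned int; long can hold every unsigned int value,
       so 1U converts to signed long and the comparison is signed */
    printf("-1L < 1U  -> %d\n", -1L < 1U);    /* prints 1 */

    /* same rank: -1L converts to unsigned long, i.e. ULONG_MAX */
    printf("-1L > 1UL -> %d\n", -1L > 1UL);   /* prints 1 */

    return 0;
}

On a platform where int and long are both 32 bits, the first comparison flips, which is exactly the machine-dependence K&R is warning about.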

>If you work down the layers of whatever stack you're working with, you will almost inevitably reach C. You need to be able to use it while avoiding its pitfalls if you want to call yourself a serious software engineer.
Rust has inline assembly and pointers in unsafe environments. Discarded.

>The issues it has are therefore
very real.

>As a separate matter: Many people find that, as a result of extensive study and use of C during their career, they end up preferring it over many modern languages due to its simplicity and the ease of following the actual path of execution (very difficult in languages with exceptions and lambdas and whathaveyou).
It's called availability bias.

>Any comparison between signed and unsigned integers immediately rings alarm bells in my head; you simply shouldn't do that unless you're certain you know what you're doing.
Except the average C programmer factually doesn't know what they're doing.

>No developer would deliberately write what you did. If they wrote it accidentally, static analysis tools (and even simple compiler warnings) would point it out. It isn't a problem in practice.
Compiler warnings disappear as soon as you (even poorly) hide unsafe shit behind function calls. Watch this.
#include <stdio.h>
#include <stdlib.h>

static int
incr(int *const x)
{
return ++*x;
}

static int
decr(int *const x)
{
return --*x;
}

int
main(void)
{
int x = 0;
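/* the order in which incr() and decr() are called below is unspecified,
   so this prints either "1 0" or "0 -1" depending on the compiler */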
printf("%d %d\n", incr(&x), decr(&x));
return EXIT_SUCCESS;
}

I've been loving C for some time now. Should I move to Rust then? What does one recommend?

C++ if you don't want to miss the C syntax. Rust if you want something modern.

But I heard C++ is just bloated C. I guess I should try Rust then?

People who say C++ is bloated think RAII causes overengineering.

harmful.cat-v.org/software/c++/

#include <stdio.h>


typedef enum State_ {
happy = 0,
sad = 1,
angry = 2,
anxious = 3
} State;


typedef struct Emotion_ {
State current_state;
int intensity;
char *duration;
} Emotion;


int main() {
Emotion ONLYSADNESS = {.current_state = 1, .intensity = 10, .duration = "always"};
printf("%d\n%d\n%s\n", ONLYSADNESS.current_state, ONLYSADNESS.intensity, ONLYSADNESS.duration);
return 0;
}

I'm a college student and my uni's comp sci I and II courses teach C and then Java. We're now at the point of learning about loops after learning about pointers.

If these languages are so dated and controversial, why would the college still teach them?

Attached: 1534730567015.png (111x128, 15K)

you are a nigger

>hurr why I use unsigned and signed ints incorrectly WHY COME NO WORK?!?

C's been around since the 70s, but it's still heavily used in operating systems, drivers, embedded, and a shit ton of software libraries. It is so ingrained in the infrastructure of pretty much everything that knowing it is more or less essential for anything other than pure web development. It somewhat helps that many other languages have derived their syntax from C, and that C is one of the most dead simple languages ever. It also kind of helps that very few languages can do what C does in its domains.

Java is taught because there's a fuckload of Java jobs out there despite the fact that it's such a shit language.

#include <stdio.h>

int main(void)
{
const char *str = "Hello world!";
char ch;

while ((ch = *str) != '\0') {
printf("%c\n", ch);

++str;
}

return 0;
}


So comfy

Excuse me gentlemen. There's something I would like to say.
*ahem*
fn main() {
unsafe { *std::ptr::null_mut::<i32>() = 0; }
}

Attached: 1539615973234.jpg (511x671, 41K)

I'm new and I don't really understand the point of bitwise operations. Could somebody enlighten me?

#include <stdio.h>

int main() {
char *str = "Hello world!";
while (*str != '\0')
printf("%c\n",*(str++));
return 0;
}


Yes it is.

No you're just retarded.

Why use Rust when you can use pic rel?
You get the speed of C, the safety of Rust, the concurrency of Erlang, the elegance of Python/Ruby, and the homoiconicity of Lisp
ALL IN ONE PACKAGE

Attached: Julia_prog_language.svg.png (1200x811, 43K)

>You need to able to use it while avoiding its pitfalls
such as?

It literally does everything so much faster and far more efficiently.
For example:
#include <stdio.h>
#include <math.h>
#include <time.h>

int main(void)
{
    clock_t start, end, res;

    start = clock();
    res = pow(2, 4);
    end = clock();
    printf("pow()\t got %ld took %ldms\n", res, end - start);

    start = clock();
    res = 2 << 3;
    end = clock();
    printf("bitwise\t got %ld took %ldms\n", res, end - start);

    return 0;
}

thanks

pow() got 16 took 2ms
bitwise got 16 took 1ms

using musl-gcc -static -O2

Try without -O2. Also, it's merely an example; there are plenty of cases where bitwise operators are genuinely helpful. Some better examples:
nayuki.io/page/some-bit-twiddling-functions-explained
graphics.stanford.edu/~seander/bithacks.html
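
A taste of what's on those pages, sketched out (the helper name is mine): x & (x - 1) clears the lowest set bit, which gives you a cheap power-of-two test.

#include <stdio.h>

static int is_power_of_two(unsigned int x)
{
    /* clearing the lowest set bit leaves 0 exactly when x had one bit set */
    return x != 0 && (x & (x - 1)) == 0;
}

int main(void)
{
    printf("%d %d %d\n", is_power_of_two(64), is_power_of_two(96), is_power_of_two(1));  /* 1 0 1 */
    return 0;
}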

am I doing it right?

#include <stdio.h>

int main(void)
{
    int num = 2;
    int shift = 1;
    int iter = sizeof(num);

    for (int i = 0; i < iter; i++) {
        num = num << shift;
        printf("%d\n", num);
    }
    return 0;
}

Too confusing. Make it more readable. Like this:
//\/\/\/
//\/\/\/
#include <stdio.h>
int n ,i, g;int main(){ for (n=2,(0),(0),
g= sizeof(n) , i = 0 ; i

Too confusing. Make it more readable. Like this:
//#####
//######
//#####
#include <stdio.h>
int n ,i, g;int main(){ for (n=2,(0),(0),
g= sizeof(n) , i = 0 ; i

I read K&R's section on bitwise and found myself too brainlet to do any of the exercises right
Any recommendations?

>void *

This

Edit: Thanks for the gold!

Attached: 63AC1404-E514-4650-9540-E39F8B016DBD.png (602x301, 94K)

a.c: In function ‘main’:
a.c:7:14: warning: comparison of integer expressions of different signedness: ‘unsigned int’ and ‘int’ [-Wsign-compare]
if(x > y)
^

I'm so fucking tired of these threads

this is hiro's way of training his shitposting ai
soon any thread he doesn't like will be derailed with void * and rust shilling

Based

Copy from stackoverflow

>C++ has its place in the history of programming languages. Just as Caligula has his place in the history of the Roman Empire. – Robert Firth
>actually believing this "CALIGULA BAD" propaganda
why should I even read the rest?

Because suckless philosophy is the only hope to revive good programming practices.

but why do people keep using it if they hate it?
How did we get to this point where everyone keeps complaining about the language being shit but uses it whenever they get the chance? Are they really that desperate for "C but with a ton of shit added on top" that they'd rather fight all that garbage and all those pitfalls than implement what they need themselves in C?

My interpretation of the cat-v & suckless philosophy is that C++ should not be used, as every use of C++ can and should be replaced with C.

Weak typing is harmful.

this snippet is so fucking retarded

#include <stdio.h>

int main()
{
char *str = "The C Programming Language\n";
while(*str) {
putchar(*str);
str++;
}
return 0;
}

Jokes on you, tranny, I use cpp

I'd argue that one is more retarded.

You can literally do:
#include <stdio.h>

int main()
{
char str[] = "The C Programming Language";
puts(str);
return 0;
}

That won't even compile if you treat warnings as errors, and the diagnostic would point you straight at the comparison, telling you that the two types don't even match. Your point is invalid.

You blame C. But the underlying reason is actually the x86 and x64 architecture: it is an engineering design decision to promote lower-bit types to the CPU's natural word size.

Low-level languages like C should be mandatorily accompanied by knowledge of CPU architecture and OS internals. Unlike Python/Java/etc., C interacts directly with the OS memory manager, and the C compiler emits ASM instructions that follow the engineering decisions stated above, which will produce unintended results if the developer, like OP, is unaware of them.

Attached: JzO04nn.png (360x203, 21K)

#include <stdio.h>
int main()
{
const char* str = "Hello world!";
while (*str)
printf("%c\n", *str++);
return *str ? 1 : 0;
}


C'mon lads, no need to compare against NUL explicitly. And return a useful error ;)

Attached: 1497810496534.png (2560x1440, 817K)

Shitty bait or you have no idea what unsigned means and what two's complement is

Not an architecture issue; it's an issue of weak typing, particularly type promotion. The int gets promoted to unsigned int because unsigned int can represent bigger values, and as a result the C compiler emits an unsigned comparison instead of a signed one. There are no types in a CPU, other than the floating-point registers if you want to split hairs.

And it's solved by a manual cast, obviously. Comparing two different types should be a compilation error, but we're talking about the clusterfuck that is C, so why would you expect anything sane from it?
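
For reference, a minimal sketch of the two explicit versions of OP's comparison; the second cast is the conversion C performs implicitly:

#include <stdio.h>

int main(void)
{
    unsigned int x = 176;
    int y = -120;

    printf("%d\n", (int)x > y);           /* 1: compare as signed, what OP expected */
    printf("%d\n", x > (unsigned int)y);  /* 0: compare as unsigned, what C does implicitly */
    return 0;
}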

False, but you do get all of that with Crystal.

Yes, it is due to the architecture. C compilers merely follow the Intel manual and attempt to generate DEFINED behaviour, even if that produces seemingly illogical behaviour like what OP writes. And it's not just about representing bigger values; it's about being more CPU-register efficient. On an x86 CPU, that means a 2-byte value is more efficiently represented in a natively sized 4-byte register. Hence the origin of the type-promotion bugs which plague C developers, e.g. short-to-int comparisons.

That is by Intel's design. There's a reason the Intel manual is many thousands of pages long, with specifically described engineering decisions.

The CPU understands signed vs. unsigned comparisons; it's merely a bit set at the MSB, but it affects how the ALU does the calculation for maximum comparison performance.

Have a read of: cs.princeton.edu/courses/archive/spr09/cos217/reading/ia32vol2.pdf

Ignore all the autists, bitwise operations are good for easily passing options into functions.

C predates x86 entirely, moron. Integer promotion is there by language design.

Flags: represent options as individual bits; this way you can easily combine and check them by treating them as powers of two (see the sketch below). Some operations need them for the math itself, like computing a CRC. Other than that, bro, not much use.
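
Something like this, as a minimal sketch (the option names are made up):

#include <stdio.h>

/* each option gets its own bit, i.e. its own power of two */
enum {
    OPT_VERBOSE = 1 << 0,
    OPT_FORCE   = 1 << 1,
    OPT_DRY_RUN = 1 << 2
};

static void run(unsigned int opts)
{
    if (opts & OPT_VERBOSE)   /* test a flag with & */
        puts("verbose on");
    if (opts & OPT_DRY_RUN)
        puts("dry run, not touching anything");
}

int main(void)
{
    run(OPT_VERBOSE | OPT_DRY_RUN);   /* combine flags with | */
    return 0;
}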

I always argued that if a coder is not able to make something work, it's not the language's fault but the coder's fault. I don't know why, but I always felt that way about people who talk shit about PHP but then code with Ruby on Rails. Because why the fuck would anyone code in fucking Ruby, or even Node.js?

In OP's case there's no promotion of the kind you're talking about (both operands are already word-sized), but there is promotion by the rules of C. Signed and unsigned conditional jumps are different assembler instructions.
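
On that last point, a small sketch; the comments describe what a typical x86-64 compiler emits for each, which is an assumption about the toolchain, not something the standard requires:

/* signed '>' : cmp followed by setg / jg (signed "greater") */
int gt_signed(int a, int b)
{
    return a > b;
}

/* unsigned '>' : cmp followed by seta / ja (unsigned "above") */
int gt_unsigned(unsigned int a, unsigned int b)
{
    return a > b;
}

int main(void)
{
    return gt_signed(-1, 1) + gt_unsigned(1u, 2u);  /* 0 + 0 */
}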

David, nein!

"Hence the origin of the type promotion bugs"
Fucking hell. OP here. Yes this is bait. Yes this is by design. I was reading K&R and it literally has a section talking about this:

> -1L > 1UL because -1L is promoted to unsigned long and thus appears to be a large positive number. (Page 44)

Now why does this happen, well according to C99:
> if the operand that has unsigned integer type [...] then the operand with signed integer type is converted to the type of the operand with unsigned integer type.

It's a fucking standard; it's by design. I was just hoping to get a few "haha"s or get corrected. Instead I get some idiots actually thinking it's a bug in C or that it's architecture-dependent. Do you guys do any research? Thanks Jow Forums!

Sorry I don't listen to europop.

>Year of the Lord 2019
>Not programming in assembly language

>hurr dynamic dispatch is bad

1. It's worse than just bad.
2. It's unrelated to the question.