Tired of being stuck at novice level

Hi Jow Forums, today's the day I start working towards becoming the best programmer/computer scientist I can be. Tbh, it's very lonely. So I am going to be posting my progress here, as I work through K&R.

Attached: 41gHB8KelXL._SX377_BO1,204,203,200_.jpg (379x499, 27K)

Start by not being a tripfag

>Tbh, it's very lonely
nobody cares fag what have you built

Done with the first 15 exercises. Some things I learnt: EOF can only be held by an int, null statements need to be used in a for loop when all the work is done in the first line itself, and character constants are just small integer values.
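For example (assuming plain ASCII; just my own quick test, not from the book):

#include <stdio.h>

int main(void)
{
    /* character constants are just small integer values */
    printf("%d %d\n", 'A', '0');   /* 65 48 on an ASCII system */
    printf("%d\n", '7' - '0');     /* digit-to-value is just subtraction: prints 7 */
    return 0;
}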

Nothing substantial :( I tried looking at an open source project and the vastness of it scared me away.

>So I am going to be posting my progress here, as I work through K&R.
Why?

not a single person cares

You should be learning C++

>null statements need to be used in a for loop when all the work is done in the first line itself
you mean
for(int i=0; i<n; i++)
    ;
you can(should) just write
for(int i=0; i<n; i++) {}

i wanna git gud too. are you just working through K&R?

>You should be learning C++
I will, but I want to learn C first (I know I can learn C++ directly if I want to)
>you can(should) just write
Is it because using braces improves maintainability? Why is it preferred?

Yeah. I want to finish it ASAP and then want to (try and) use C to do something realworldish. Would you like to join me?

i wanna get good enough to patch/commit to my favorite driver, mt76

for(int i=0; i<n; i++) {}

>Is it because using braces improves maintainability? Why is it preferred?

Because for(....); { ... } is a common error and most good compilers will yell at you for doing it (and good programmers run warnings as errors).
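A minimal sketch of the mistake, for anyone following along (compiler behaviour varies; clang, for example, flags it with -Wempty-body):

#include <stdio.h>

int main(void)
{
    int i;
    for (i = 0; i < 3; i++);       /* stray semicolon: the loop body is empty */
    {
        printf("i = %d\n", i);     /* this block runs once, after the loop: prints i = 3 */
    }
    return 0;
}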

>EOF can only be held by an int
Are you even paying attention to what you're reading?
EOF is a value that doesn't represent any valid character - usually this value is -1. The reason they tell you to use int is that plain char may be unsigned (its signedness is implementation-defined), and writing a negative value to an unsigned variable causes problems.
Signed char can hold EOF just fine, but then a real 0xFF byte is indistinguishable from EOF, so it effectively limits you to a 128 character encoding.

Attached: eof-or8.png (679x483, 3K)
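
A rough illustration of why the type matters (assuming the usual 8-bit char and EOF == -1):

#include <stdio.h>

int main(void)
{
    int c;                           /* int can hold all 256 byte values plus EOF */
    while ((c = getchar()) != EOF)   /* the canonical K&R copy loop */
        putchar(c);

    /* with "char c" instead, one of two things goes wrong:
     *  - unsigned char: EOF (-1) becomes 255, the != EOF test never fails, infinite loop
     *  - signed char:   a real 0xFF byte compares equal to EOF, input gets cut short
     */
    return 0;
}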

fgetc is pretty irrelevant and its error handling pattern is an example of badness.
If anything, it should take a char pointer and return a signalling integer.
You will find many weird decisions in POSIX libc, just don't get too confused. E.g. the different order of arguments in write and fwrite, yet another juggling of error values with ssize_t, string and mem functions returning a pointer to the beginning and not the end, etc.
You really just have to take it as legacy that people are keen to preserve and don't fight with it.
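Something like this hypothetical wrapper, i.e. the status in the return value and the character through an out-parameter (names made up, not a real libc function):

#include <stdio.h>

int getchar_checked(FILE *stream, char *out)
{
    int c = fgetc(stream);
    if (c == EOF)
        return ferror(stream) ? -1 : 0;   /* -1 = read error, 0 = end of file */
    *out = (char)c;
    return 1;                             /* 1 = got a character, stored in *out */
}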

K&R is not going to make you a good computer scientist or programmer. Instead it'll make you an insufferable elitist that can't program shit that isn't swiss cheese.

Unironically pick up a good Java book that teaches computer science concepts. Why Java? It's what colleges teach, so you'll probably be able to find an accessible, but decent textbook. The point here isn't Java -- it's computer science concepts. It doesn't really fucking matter what language it is.

SICP is a way better book to work through than K&R. SICP is going to teach you "big ideas" that apply to every part of a programmer's life; K&R is just going to teach you the mechanical aspects of C. What is it you want to be -- a computing wizard or a dude that just writes code? And if what gets you off is the "low level" nature of C, just do nand2tetris instead. You'll learn a helluva lot and develop an understanding way beyond 90% of C programmers out there (who actually think C is "portable assembly" lol).

Then get a design patterns book or some shit. This will most likely be Java but doesn't really matter again. Design Patterns are going to teach you some outdated shit and some stupid shit, but it'll ease you into software architecture and how to design programs. But perhaps more importantly, it'll prepare you to move *beyond* design patterns.

cont.

If you want to be a good computer scientist / programmer, C is probably the worst fucking language to use. It's a good language to use _after_ you understand all the shit, and _need_ the advantages C affords you. Computer science is unironically better done in a higher level language (where you'll focus on the ideas, not the code), and programming is better done in a language that ensures safety and correctness of your programs. C is better when you absolutely need precise control of the Von Neumann abstraction, or when you're in a constrained environment, or when you need maximum interoperability, etc. If you choose to use C outside of those situations, you're wasting a helluva lot of time.

Unironically learn _Modern_ C++ or Java or C# before C, or a high level language like Haskell. Or fuck, Python/Ruby/Javascript would be great, too. The point is to actually program productively and to actually develop understanding of big ideas. There's a reason Python took off -- it freed C programmers from having to constantly babysit their code. (Ruby is way cooler than Python, but Ruby doesn't jive with assburgers culture of programming, so Python is way more popular). Do you want to learn and be productive or do you want to babysit code?

That sounds like it'd be hard. Maybe some day my abilities will reach that level. For now, I am just doing K&R.

Thank you :)

Understood. Thank you :)

Thank you for teaching me that. :) I was mistaken because K&R said "We must declare c to be a type big enough to hold any value that getchar returns. We can't use char since c must be big enough to hold EOF in addition to any possible char. Therefore we use int."

>return EOF;
>255

Why is that happening?

255 as an unsigned 8-bit integer is represented as 11111111 in binary.
-1 as a signed 8-bit integer is represented as 11111111 in binary (assuming a two's complement representation).
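You can watch the wraparound happen (minimal example, assuming 8-bit chars and two's complement):

#include <stdio.h>

int main(void)
{
    unsigned char u = -1;             /* -1 wraps around to 255 (all bits set) */
    printf("%d\n", u);                /* prints 255 */
    printf("%d\n", (signed char)u);   /* same bit pattern read as signed: -1 */
    return 0;
}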

This is a reason to not bother with C until you know what you're doing.

I thought the int in int main() was signed? So shouldn't return EOF at the end of main return -1? Sorry if I am asking stupid questions.

this this this
Pick up some OS concepts book as well, you should know the platform you are programming for. Some knowledge of computer architecture (at least the basics of ISAs and assembly, and knowing what the memory hierarchy is) is also important.
I've had a lot of fun with language design and compiler books.
Most of it is heavily language-agnostic. The language mostly doesn't matter until you start writing a real project and even then it's just a tool.

It's been years since I've programmed C. Honestly I forget.

Currently stuck at exercise 22. I've finished (sort of) all the other ones. To all the people telling me I shouldn't start with this book, I am NOT "starting" with this book.

Still not able to understand why echo $? is 255.

The signature of the main function has changed a lot historically and the standard's rules are pretty vague.
There are routines that execute before the program enters C's main and after it exits it. What happens on the stack is well specified.
The linker will pull in the crt0 routine, which prepares argc, argv and env regardless of whether you have them in your main() signature or not, and jumps to the address of the main() function regardless of whether it returns int or void.
All of those are equally valid main signatures
int main();
int main(void);
int main(int argc, char **argv);
int main(int argc, char *argv[]);
int main(int argc, char **argv, char **env);
void main();
void main(void);
void main(int argc, char **argv);
void main(int argc, char *argv[]);
void main(int argc, char **argv, char **env);

or even without a return type at all.
It's again up to the crt0 routine to make meaning out of main's return value. This part of C is meant to be OS-agnostic, and the C side is, but crt0 is provided by the OS and can thus do OS-specific things. On most OSs the return value ends up as an unsigned 8-bit integer.
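Rough demo of the truncation (file name is arbitrary, shell lines shown as comments):

int main(void)
{
    return -1;   /* same as "return EOF;" when EOF is -1 */
}

/* $ cc ret.c -o ret
 * $ ./ret; echo $?
 * 255              <- the exit status is taken modulo 256
 */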

fpbp

Makes sense. Thank you for taking the time to search and explain this to me. :)

Also, done with exercise 22, and with this I am done with chapter 1. Onto chapter 2.

Why is this redditard blogging with a trip

Ruby sucks even harder than python in terms of speed, doesn't have the same amount of libraries, and has an even more cancerous userbase. If you need a python replacement, use lua. It's better than python in every way except for community (which is honestly a dealbreaker)

> in terms of speed
Ruby is "fast enough" for anything I'd want to use it for.
> doesn't have the same amount of libraries
It has practically all of the essentials. And when it doesn't, FFI -- the latter being how Python gets all of its libraries and most of its speed, anyhow.
> and has an even more cancerous userbase
Python's userbase is damned fucking autistic.
The "cancerous userbase" perception largely comes from autistic C and Python users hating on it because, at the height of Rails, it was the cool-to-hate-on language. Hating on rails devs made them feel superior. And you call Ruby's community toxic, lol.
> If you need a python replacement
lol, implying I use Ruby as a "Python replacement". Ruby doesn't replace shit for me -- I use it because I believe it to be a damned good language for times C++ isn't a good fit.
>use lua
lol no. It's a great language for embedding, and I might use it for that purpose, but that shit is just too minimalist as a general purpose language. Ruby has 80% of all of the expressiveness I could ever ask for (THE reason I use Ruby), where I'd be spending hella time in Lua just to hand roll 20% of the expressive power.

no we have a thread for this

HOW THE FUCK DO YOU LEARN TO DESIGN PROGRAMS?
IM USING PYTHON

Youtube-dl takes its sweet time to load, and ruby shit like homebrew or certain websites I tried to host for fun take even longer.
Lua has all the essentials too
Python's userbase is shit, but ruby's has that reputation for a reason.
>for times cpp isn't a good fit
Yeah. Python replacement. Arguing any more about this is just semantics
>implying minimalistic is a bad thing
I can see why you hate C and love C++ then, but if you're the type to argue function over form, then python is still way better except for certain edge cases. Expressiveness seems like a subjective thing, and you might be able to find python libraries for whatever you need anyway; sounds like you're just used to ruby.
And nothing wrong with that, use whatever lang you prefer, but don't act all high and mighty about how your language is better and shit. OP wanted to learn C, and your original post would lose a lot of credibility if you revealed yourself to be another C hater

nice blog, tripfag. please kys. thanks.

> anything that you'd use instead of C++ is a python replacement
>thinking lua is too minimalistic is why you'd hate C
You're fucking autistic.

read the manual

No, the two aren't related. It's just a separate observation.
Stop calling everything you disagree with autistic, loser.

I mean, this reply just shows how fucking autistic you are, as I was obviously providing two separate examples rather than trying to link the two ideas.
In this, you've blatantly demonstrated that you can't interpret context appropriately. This is a huge, huge red flag of autism.

>Nothing substantial
OP, I think this will help you get an idea for how the work of open source is done from a kernel dev mindset. Even if you don't use emacs (you should), I think it's worth learning from this.


david.rothlis.net/emacs/basic_c.html

This is the start of the C section. If you're curious about emacs start from the very beginning. This doc is about 4 pages in.

>you can't interpret context appropriately
What the fuck, I was about to say the same about you
Also, it just shows how much of a hipster faggot you are that you think calling people autistic means anything on this site. I take back all the good things I said about ruby and you, you're actually just a hipster offended that someone may dislike your precious ceeplusplus and ruby.
You got it all wrong in your post by the way, hipster autist.

Thoughts on vim binds or normal binds for emacs?

>If you want to be a good computer scientist / programmer, C is probably the worst fucking language to use.
So C is a worse language to use than PASCAL, BASIC, or BRAINFUCK?
>It's a good language to use _after_ you understand all the shit, and _need_ the advantages C affords you.
No. It's a perfect language for learning how programming works because it maps almost identically to what is actually happening in the assembly, but with an understandable and portable syntax. I can't count the number of "programmers" who are literally unable to grasp what is actually happening in even the most fundamental data structures these "OOP" languages provide, because they can't understand something as fundamental as how memory is accessed or how string or bit manipulation works. They are just dependent on all of this library code, which they string together in ways that they don't, and no one can, fully understand, which creates awful bugs that I have to go back and fix. You CAN'T understand OOP without understanding imperative programming and what is happening under the hood. You just fucking can't.
>a language that ensures safety and correctness of your programs
A LANGUAGE DOESN"T ENSURE SAFETY AND CORRECTNESS OF YOUR PROGRAMS. Holy fucking shit. If you use C11 and know modern secure coding standards it's no less "safe" than you bloated grandpa Python crudware.

Make sure you use the -fsanitize=address,undefined compiler flags (and -lasan and -lubsan to actually link the sanitizers), especially once you start doing dynamic memory management. It's pretty easy to write code that works most of the time but is actually incorrect. Look into making basic makefiles too, so you don't have to re-type a gorillion flags every time you want to compile something.
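A minimal makefile sketch along those lines (names and flags are just one reasonable setup, not gospel; with a recent gcc or clang, passing -fsanitize at the link step also pulls in the sanitizer runtimes):

# recipe lines must start with a real tab
CC      = gcc
CFLAGS  = -std=c11 -Wall -Wextra -Werror -g -fsanitize=address,undefined
LDFLAGS = -fsanitize=address,undefined

prog: prog.c
	$(CC) $(CFLAGS) prog.c -o prog $(LDFLAGS)

clean:
	rm -f prog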

There was an anon on here not too long ago that stated that K&R is the bible because everyone has it but no one actually uses it. I would have to agree. Currently using pic related for an introductory C course and I like it more than the language itself.

Attached: antsyCee.jpg (397x499, 34K)

Loser lol

OP, if you want to learn C, that's fine. The only problem is that book isn't going to teach you too much about computer science. You'll learn the syntax of C.

Go to this link and it will have resources for independently studying computer science.
github.com/ossu/computer-science/blob/dev/README.md

I would like to join you. We should make a Discord.

>tfw no pdfs of this book online

Attached: 1548032590104.gif (256x272, 4K)

nand2tetris.org/course
this is better than my uni course. at uni we used Digital Design as our textbook, it was dense and sucked

I tried it but it was way too shallow. Very disappointing given how much it's shilled.

>I tried it but it was way too shallow. Very disappointing given how much it's shilled.

how far did you get? the first couple chapters are a refresher for me, but later i can see it all coming together, i feel like a god

instead of wasting your time on a meme book and shitting up the board, use a static site generator to make a blog, pick a book for a language that might actually get you employed and post your learning progress in the blog