So, what makes this special? I'm still on the first chapter, but so far it seems like a more arcane way of learning to program.
>special
lol
lmao u just got memed ya fuckin nerd
Bump for interest because I'm fucking completing it
the pointy hats
please i want to believe
SICP is a basic intro to programming book for freshmen. If you've had an intro to CS class, it should hold little of value. For more advanced topics that a discerning fellow like yourself might be interested in, see a book like CLRS.
A good, casual read.
pic was posted on Jow Forums frequently
is it really that good, or is it just popular because Jow Forums heard of its name?
lots of technical interviewers will use examples from it, or variations on them.
first section already has 28 footnotes.
It's not necessarily a more arcane way. It's elegant, and different from the imperative, stateful mess most people write today.
You probably won't be able to use Scheme in your workplace, but you'll find the lessons from this book popping up in all sorts of places.
There are also functional languages you could use in production, and then you're definitely going to have fun: Erlang, Haskell, Clojure, Scala.
Some people think functional programming is the future because Moore's law is pretty dead and we're only going to be getting more cores, not higher frequencies, and functional programming really lends itself to concurrent problems (immutability, persistence, structural sharing)
>he fell for the sicp meme
>Some people think functional programming is the future because Moore's law is pretty dead and we're only going to be getting more cores, not higher frequencies, and functional programming really lends itself to concurrent problems (immutability, persistence, structural sharing)
People have been saying this forever. The reality is we're going to get more and more ASICs, and the concurrency potential likely won't be much greater than today's.
oh yeah its a real page-turner
Isn't OOP going to be more prevalent, where the program is already separated into isolated instances?
But user, Moore's law is dead. When was the last time Intel or AMD significantly increased frequencies? It's only more cores now.
Don't think so. We're at the point where lectures and tech talks are raising flags about OOP not being the way, so expect the industry to catch on in 10-20 years, and academia will reinvent it in 30 years and think they made something new
it's basically _the_ algorithm book
It's just an intro programming book.
>But user, Moore's law is dead.
so what if it's dead? it's not a big deal.
>when was the last time Intel or AMD significantly increased frequencies
you are just going to move the goalposts; they increase every year
>It's only more cores now
instructions per cycle continues to increase.
not to mention there are dozens of existing methods for increasing transistor density, it's just a matter of what is going to be practical for the money spent.
could you stay on topic?
The book is available for free online. Feel free to ask an actual question about the content instead of demanding to be spoonfed.
I know I'm an asshole for writing that, I just needed to know that my self-learning endeavor will be worth it. I'll dive right into it. Thank you user.
There's nothing in SICP that cannot be found elsewhere. However, most if not all of it is worth knowing and this book is probably the easiest way to find this knowledge.
basically this
people.eecs.berkeley.edu
>For more advanced topics see CLRS
fuck no there are no advanced topics
it was a pretty well established meme for a while
I remember when I was 12 (back in 2011) I bought it, as well as a terrorist watch + a thinkpad x220
It's literally just to feel LEET
Stop me from learning Ocaml
>move the goal posts
You're one to talk, going from Moore's law to accelerators and ASICs, to marginal increases in frequencies, while the increase in core counts for PC CPUs is an order of magnitude greater percentage-wise. Both Intel and AMD have SoCs in the pipe with more cores, for servers as well. You're so off base it's pathetic, especially with
>instructions per cycle continues to increase.
which is patently false: since SKL released there has been no change in the core architecture. Stop pretending to know what you're talking about and go back to /bst/
Actually _THE_ algorithm book is The Art of Computer Programming by Donald Knuth
Completing those books will prepare you for a position almost anywhere you want to work.
(define sum
(lambda (a b)
(+ a b)))
or
(define (sum a b)
(+ a b))
?
also cond or if?
I personally prefer jeffe.cs.illinois.edu
but MIT's advanced algo course is really good desu
How is this any different than
int sum(int a, int b) {
return a + b;
}
I don't really remember lisp details from the little time I spent reading SICP for the meme value. That being said, the first form ought to be superior if it means that sum can be a first-class value, in comparison to the second; otherwise I'd guess they're the same.
The second form is syntactic sugar for the first. Also, you can just write (define sum +), but then of course it will accept any number of arguments.
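A quick sketch of that equivalence (any standard Scheme; the names here are my own):

```scheme
; lambda form
(define add-two
  (lambda (a b) (+ a b)))

; syntactic sugar for the above
(define (add-two* a b)
  (+ a b))

; binding + directly gives a variadic procedure
(define my-sum +)

(add-two 1 2)     ; 3
(add-two* 1 2)    ; 3
(my-sum 1 2 3 4)  ; 10
```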
Use cond to avoid nesting if-statements, use if otherwise. Basically (well, almost) the same as if vs switch in C-like languages, except that cond is more flexible.
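A minimal example of the difference, using a sign function of my own invention:

```scheme
; nested ifs get ugly fast
(define (sign-if n)
  (if (< n 0)
      -1
      (if (> n 0) 1 0)))

; cond keeps the branches flat
(define (sign-cond n)
  (cond ((< n 0) -1)
        ((> n 0) 1)
        (else 0)))
```

Unlike switch in C, each cond clause can test an arbitrary expression, not just compare against a constant.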
Don't make a habit of doing things half-assed.
What's the "less arcane" way?
People like to joke about GRRM dying before Winds of Winter, but I'm legitimately afraid of Knuth dying before TAOCP is finished.
You guys expect me to read 12+ fucking volumes taking a year or more when I can just read CLRS?
I don't expect you to read TAOCP front-to-back.
The expectation is you read sections of it.
It's basically The Bible of Computer Science.
Yeah. We expect you to learn before you shit out programs.
In fact almost all your programming should be done with a pencil and paper then inputted using ed.
>In fact almost all your programming should be done with a pencil and paper
Nobody does this, fuck off, no one ever did
That's where you're wrong. Before the advent of modern text editors it was more practical to use a pencil because you could easily erase mistakes.
Computers were originally giant behemoths and all input was manually entered. It took days of labor to run a program, and if there was a single error, all of that effort was wasted.
You couldn't just compile and run like you can now, and typewriters weren't capable of deleting words from paper. Not only that, but typewriters were also heavy, expensive, and took up a lot of space, so carrying one around wasn't really possible, meaning notebooks were the best place to store your programs.
Not only did the technical limitations at the time force program developers to hand write programs using pencil and paper, but it's been shown that writing-by-hand allows you to gain a better conceptual understanding.
Historically bugs weren't in the programs themselves. Literal bugs would interrupt execution.