Where can I learn more about the fundamentals of how computers work? Electricity, binary, logic gates, transistors, etc.

I want a book or video series that doesn't use shitty metaphors or just say generic shit like "the CPU processes the data". I don't want to hear shit like how electricity is like water flowing, how the computer "thinks", how a bit decides to be a 1 or a 0, etc.

I know the very basic stuff like how to make a simple binary adder, logic gates and stuff but it's all surface level.

Attached: 256bytes.png (1374x1001, 26K)

just make a computer in minecraft

This desu op.

Pic related.

pastebin.com/raw/LwN3p5Hz

Attached: 1513368418757.jpg (406x500, 29K)

this but in logisim (easier and runs faster)

What a gay fucking cover i would never read this

just read the previous edition if your autism is that strong

Attached: 1535494215742.jpg (405x500, 49K)

How's the book content? The cover doesn't look promising

College, you dumb cunt.
But this book isn't bad if you can't afford college.
Another might be The Art of Electronics.

Attached: stack-1.jpg (1706x930, 285K)

You really want to know that stuff? Go study Electrical Engineering at college.

You are much better off with the book posted in this thread.

The Harris book is about as good as you are going to get, unless you want to go for something completely terse and rigorous. That's when you pick up something like the Horowitz and Hill book.

Anything besides the Harris book is going to be a big fat fucking waste of time, and probably going to be in the first edition, and meant for absolute retards.

Fucking hated this book so much

Unironically this

Redstone is based

Attached: Science.jpg (1100x514, 47K)

looking to write your own compiler?

The irony is, if you pick up any other book on digital electronics, you will hate it even more. The plus about that book is it does some hand holding, and doesn't assume you have the work ethic to work through higher level math to understand the concept. Every other book will just present the material, some formulas, and give a canonical definition, with the expectation you sit down and try to understand the topic for yourself.

I'm also assuming you thought this book was difficult, and didn't provide enough hand holding.

Introduction to VLSI Systems is pretty good user, I have read it.

ai.eecs.umich.edu/people/conway/VLSI/VLSIText/PP-V2/V2.pdf

Patterson and Hennessy niBBa

These books are fucking great. I TA'd for a class that used the ARM one. It literally walks you through how to build your own processor. I wish it was my book when I took the class.
AoE is the biggest fucking meme. It gives brief explanations of every circuit ever, so you just know what they do and not how to use them or change parameters for your own purposes.

this is what i was gonna say
at the end of the day it's just learning by doing. you'll find that redstone is actually really fucking annoying, but still much more fun than playing around in logisim or whatever

>I know the very basic stuff like how to make a simple binary adder, logic gates and stuff but it's all surface level.
That's pretty much all there is to it. You put millions of those together to make something more complex.
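To see concretely how those pieces compose, here's a minimal Python sketch (simulation only, obviously not hardware) of a ripple-carry adder built by chaining one-bit full adders, the same construction OP already knows at gate level:

```python
def full_adder(a, b, cin):
    """One-bit full adder: returns (sum, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def ripple_add(x, y, width=8):
    """Add two integers by chaining full adders bit by bit,
    feeding each stage's carry into the next (wraps at `width` bits)."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(100, 55))  # 155
```

An ALU is basically this plus a few sibling circuits and a MUX to pick which result comes out.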

Logisim is great, you fucking zoomer.

nand2tetris
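The first project in nand2tetris is exactly this: NAND is functionally complete, so every other gate falls out of it. A quick sketch in Python (the course itself uses its own HDL, this is just to show the idea) of deriving NOT, AND, OR, and XOR from NAND alone:

```python
def nand(a, b):
    """The one primitive everything else is built from."""
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# XOR truth table, built purely out of NANDs
for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```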

I'd suggest doing it in dwarf fortress. A water powered cogitation machine is way cooler than red stone magic BS.

For electricity, you could use a basic physics book or something on electrical circuits.

Millman for electronics

For the computer part, the internet is full of documentation, but if you want a book, try "Computer Architecture: A Quantitative Approach".

why do you guys like CA?

I've been working on nand2tetris and have enjoyed it so far

Attached: But How Do It Know.jpg (720x960, 41K)

Go read a book on it. Protip: what you read will have very little to do with what the modern CPU in your PC is actually doing. Publicly available knowledge on anything made in the past 20 years is essentially all "surface level".

I'm taking a class in uni where we design a cpu in System Verilog, just fuck my shit up senpai.

There's an intro to computer engineering book you can DL for free online. It covers logic gates, CMOS circuits, logic circuits like a MUX, adder, write enable latch, how memory works, machine code, and assembly

u.teknik.io/EBMIfK.pdf
u.teknik.io/xzIgWA.pdf

Attached: D.png (238x236, 22K)
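Can't vouch for those PDFs, but the write-enable latch such intro books cover is worth sketching, since it's where "memory" first appears. A Python toy of a gated D latch made of four NANDs (the cross-coupled feedback pair is settled by iterating a few times, standing in for the circuit reaching steady state):

```python
def nand(a, b):
    return 0 if (a and b) else 1

def d_latch(d, we, q_prev):
    """Gated D latch from four NANDs: when we=1 the output follows d,
    when we=0 it holds the previous value q_prev."""
    q, qn = q_prev, 1 - q_prev
    s = nand(d, we)              # set input (active low)
    r = nand(nand(d, d), we)     # reset input (active low)
    for _ in range(4):           # let the feedback loop settle
        q = nand(s, qn)
        qn = nand(r, q)
    return q
```

One of these per bit, plus address decoding to drive the write-enables, and you have RAM.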

Can you elaborate please?

google BUT HOW DO IT KNOW

purchase it

nuff said, ignore the other posters.

at least it's not WInBreadBoard™

Attached: win95wbb.png (640x480, 66K)

AAAAAAAAAAAAAAAAAAAAAAAAHHHH

this.
But don't waste time with their shitty code-based hardware simulator. Just find a nice visual circuit simulator online.

use logisim-evolution for high dpi support

thanks, that's a nice book for a beginner

this guy explains it quite well i think

youtube.com/watch?v=HyznrdDSSGM&list=PLowKtXNTBypGqImE405J2565dvjafglHU

Buy an 80s computer with a 6502 processor, C64/Apple II if US/EU, BBC Micro if UK. Download the docs, learn some assembly language. Learn about interrupts and memory mapped IO. Connect some shit up and make it work. Try shit out on a virtual 6502 and watch how the CPU responds to instructions and conditions. It will give you an idea of what's going on.
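In the same spirit, here's a toy fetch-decode-execute loop in Python. This is a drastically simplified sketch, nothing like a real 6502, though the opcodes for LDA #imm ($A9), STA abs ($8D), and BRK ($00) do match the 6502's; the output address is borrowed from the Apple-1's display register but is otherwise an arbitrary choice here:

```python
def run(mem, mmio_addr=0xD012):
    """Toy fetch-decode-execute loop with three 6502-style opcodes.
    A store to mmio_addr prints a character instead of writing memory,
    mimicking memory-mapped IO."""
    a, pc = 0, 0
    while True:
        op = mem[pc]; pc += 1
        if op == 0xA9:                        # LDA #imm
            a = mem[pc]; pc += 1
        elif op == 0x8D:                      # STA abs (little-endian addr)
            addr = mem[pc] | (mem[pc + 1] << 8); pc += 2
            if addr == mmio_addr:
                print(chr(a), end="")         # memory-mapped "output device"
            else:
                mem[addr] = a
        elif op == 0x00:                      # BRK: stop
            return

program = [0xA9, ord("H"), 0x8D, 0x12, 0xD0,  # LDA #'H'; STA $D012
           0xA9, ord("i"), 0x8D, 0x12, 0xD0,  # LDA #'i'; STA $D012
           0x00]                              # BRK
run(program + [0] * 256)  # prints "Hi"
```

Stepping a real or virtual 6502 gives you exactly this picture, just with a full register set, flags, and interrupts on top.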

What is that abomination?

Done. Now what?

Attached: mc-comp.png (854x480, 67K)

That's it, your life has peaked there. Now shoot yourself.

Install gentoo

My college used Tanenbaum's (the MINIX guy) textbook, Structured Computer Organization I think it's called. I liked it. Not dry, and it covers a lot of ground.

This picture is anti-semitic

You can make logic gates in Excel, I've done some full adders. It's quite a challenge if your Excel doesn't have an XOR function and you're a normie idiot. Next step: random access memory.
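For reference, the usual workaround when the worksheet XOR function is missing (it only appeared around Excel 2013) is =OR(AND(A1,NOT(B1)),AND(NOT(A1),B1)). The same construction, sketched in Python:

```python
def xor_from_and_or_not(a, b):
    """XOR built only from AND/OR/NOT, the way you'd do it in a
    spreadsheet with no XOR function:
    =OR(AND(A1, NOT(B1)), AND(NOT(A1), B1))"""
    return (a and not b) or (not a and b)

for a in (False, True):
    for b in (False, True):
        print(a, b, xor_from_and_or_not(a, b))
```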

You first, incel

can't tell if this is a meme or not.

sweet

It just works, OP. Don't open Pandora's box, you will never understand how it all works.

Be content with having a grasp of the fundamentals.

Unironically watch the first 10-12 episodes of Crash Course Computer Science. It doesn't use shitty metaphors and it explains transistors, circuits, and logic to you like you're semi-retarded.

I'd suggest the book by Tanenbaum: Structured Computer Organization, 6th Edition.