This is probably going to sound like a stupid question, but I'm gonna ask anyway: How viable is it to build a homemade calculator?
I'm a math student and I don't really know much about computers and electronics (I'm willing to learn, of course), as I'm more interested in pure mathematics. Nonetheless, as of late I've become somewhat interested in the way in which computers can easily perform complicated calculations, and I started to contemplate the idea of building some kind of calculating device by myself.
I said I would like to build it "from scratch", but of course I don't think it's doable to dig my own ore and refine it in order to build every single part from raw materials. Nonetheless, I would like to start from the most elementary parts available on the market, using the most elementary tools capable of putting those parts together. Can anyone point me in the right direction by recommending some books or blogs that might be relevant to this?
I had to google that word, but isn't that basically a ready-made calculator? (Actually, much more than a calculator, I would think.)
That's kinda overkill for what I would like to do. Isn't there a better place to start?
Isaac Wood
>FPGA
Also my first thought, but too complicated for a novice.
Start off with learning Boolean algebra. Then study the half adder and full adder. Then two's complement notation. Then buy a breadboard and the gates in DIP packages, and assemble a 4-bit calculator. Buy 7-segment displays and a driver for them (binary to 7-segment). Use DIP switches to input data to the gates.
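If it helps to see the logic before you touch hardware, here's a minimal C sketch (all function names are mine, nothing standard) that simulates a full adder with the same Boolean algebra you'd wire up, and chains four of them into the 4-bit ripple-carry adder described above:

#include <stdio.h>

/* Full adder from Boolean algebra: sum = a XOR b XOR carry_in,
 * carry_out = majority(a, b, carry_in). */
void full_adder(int a, int b, int cin, int *sum, int *cout) {
    *sum  = a ^ b ^ cin;
    *cout = (a & b) | (a & cin) | (b & cin);
}

int main(void) {
    /* Chain four full adders into a 4-bit ripple-carry adder: 0101 + 0011 */
    int a[4] = {1, 0, 1, 0};   /* 5, least significant bit first */
    int b[4] = {1, 1, 0, 0};   /* 3 */
    int sum[4], carry = 0;

    for (int i = 0; i < 4; i++)
        full_adder(a[i], b[i], carry, &sum[i], &carry);

    printf("sum = %d%d%d%d, carry out = %d\n",
           sum[3], sum[2], sum[1], sum[0], carry);   /* prints 1000 = 8 */
    return 0;
}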
Gavin Howard
Thanks for the reply. Can you recommend any books I could study to learn these topics? Or even just a book you think may be a good introduction to the most elementary aspects of electronics.
Elijah Stewart
Most of the simple stuff can be found in YouTube videos, blogs and articles. I can't really point to any book since I learned electronics from lectures. I suggest you find out what books a university uses for its electronics or engineering faculty.
There is usually an electronics-related thread over at /diy/. I suggest you ask around there. They will be more helpful than /g/. There was even a guy over there who made a whole CPU with discrete components; it was massive.
>There was even a guy over there who made a whole CPU with discrete components; it was massive.
Amazing, is that even possible to do at home without some super-expensive equipment? I'm not an expert on computers, but don't you need to use photolithography to make CPUs? Or by "discrete components" do you mean he didn't use microprocessors?
Based on my limited knowledge of logic gates, doing it completely from scratch without an FPGA or some kind of programmable controller would be a fucking nightmare. Adders are straightforward, and building multiplication and subtraction circuits wouldn't be impossible if you account for two's complement. Division would probably fuck you up. Between all this you'd need to implement PEMDAS somehow too. I mean, it's doable, and in theory you could put it all on one board; it's been done, but the first IC calculator boards were probably massive breakthroughs. en.wikipedia.org/wiki/Multiplication_table If you're talking about doing it from scratch as your own design, that would likely be a massive undertaking.
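For what it's worth, subtraction really is nearly free once you have the adder and two's complement; the same adder does both jobs. A rough C sketch of the idea (the function name is made up):

#include <stdint.h>
#include <stdio.h>

/* In two's complement, -b is (~b + 1), so a - b is just a + ~b + 1
 * with the final carry out of the top bit discarded. */
uint8_t sub_via_add(uint8_t a, uint8_t b) {
    return (uint8_t)(a + (uint8_t)(~b + 1));
}

int main(void) {
    printf("%d\n", sub_via_add(200, 55));          /* 145 */
    printf("%d\n", (int8_t)sub_via_add(5, 12));    /* -7: same bits read as signed */
    return 0;
}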
Cameron Torres
They advertise a book on their site, "The Elements of Computing Systems". Is that a good book for a complete beginner? Or should I just try to follow the online course? (I'm old-school and I usually prefer to read books rather than learn on the internet...)
Ayden Cox
There's a documentary about the development of Japanese calculators: it took several people months to build a big, basic calculator from scratch, and pocket calculators need advanced factories for LCDs, microchips and circuit design.
I wasn't expecting it to be a walk in the park. For now it's just a vague idea of something I would like to do. I know it won't be completed overnight and I'm willing to slowly work on it in my free time even if it takes years. It's just something I want to do as a hobby, so I have no real deadline (except my life expectancy).
Thanks for your help.
Brody Barnes
>I have a PDF copy if you'd like.
I found it on Library Genesis, so I can get it there. Thanks for the recommendation.
Aaron Hughes
Anyone who knows about Library Genesis is too smart for this OP. This whole thread is an experimental troll, I just know it.
>I've become somewhat interested in the way in which computers can easily perform complicated calculations, and I started to contemplate the idea of building some kind of calculating device by myself
Computers don't perform complicated calculations; they perform a lot of very simple calculations in a row and have a lot of memory. The only reason computers can do complicated things is because other people have built the software tools and programming languages that have a lot of tools in them ready to go. The "complicated calculations" thing and the /diy/ architecture thing are in two different ballparks. A computer is merely to a mathematician what a hammer is to a carpenter; you sound like a carpenter who decided "I've become fascinated with hammer production". Brah I thought you were into wood
Nathaniel Carter
What does "from scratch" mean to you?
FPGA is the ultimate answer, because you can truly do it yourself. You will be inventing a custom microcomputer with a snowflake machine language, and you can write programs for it by hand-assembly. Tl;dr: this is a very cool and hard goal, and you will effectively have earned a degree in Computer Engineering if you can do it all yourself.
What you might ACTUALLY want is to start with an existing microcontroller that will start you out with addition/subtraction/multiply/divide functions, which you can use to write other math routines (e.g. sqrt()) all by yourself.
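To give a flavor of that: a hedged sketch of a square root written only with the four basic operations, via Newton's iteration x_next = (x + a/x) / 2 (my_sqrt is a made-up name, not from any library):

#include <stdio.h>

/* Newton's method for sqrt(a), using nothing but +, -, *, /. */
double my_sqrt(double a) {
    if (a <= 0.0) return 0.0;
    double x = a;                        /* any positive starting guess works */
    for (int i = 0; i < 60; i++) {
        double next = 0.5 * (x + a / x);
        if (next == x) break;            /* converged to machine precision */
        x = next;
    }
    return x;
}

int main(void) {
    printf("%f\n", my_sqrt(2.0));        /* ~1.414214 */
    return 0;
}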
Nathaniel Mitchell
>Anyone who knows about Library Genesis is too smart for this OP.
Any dimwit who browses Jow Forums (either /lit/ or /sci/) knows about Library Genesis. It's not rocket science.
>Computers don't perform complicated calculations, they perform a lot of very simple calculations in a row and have a lot of memory.
That's the definition of performing complicated calculations, though. How else are you gonna perform a complicated task unless you break it into a series of smaller and simpler tasks? Even you, as a human, if you want to multiply two large numbers, break the problem down into smaller tasks using multiplication tables and working digit by digit; and if you want to calculate the derivative of a complicated function obtained by summing, multiplying and composing elementary functions, you just use the appropriate rules of differentiation and break the problem into smaller, simpler tasks.
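(To make that concrete, here's a small C sketch, entirely my own naming, of schoolbook long multiplication: the "complicated" product of two big numbers reduced to single-digit products and carries.)

#include <stdio.h>
#include <string.h>

/* Multiply two decimal strings digit by digit, then do one carry pass. */
void long_multiply(const char *a, const char *b, char *out) {
    int la = (int)strlen(a), lb = (int)strlen(b);
    int res[64] = {0};                          /* res[i] holds the 10^i place */

    for (int i = 0; i < la; i++)
        for (int j = 0; j < lb; j++)
            res[(la - 1 - i) + (lb - 1 - j)] += (a[i] - '0') * (b[j] - '0');

    for (int k = 0; k < la + lb; k++) {         /* propagate carries */
        res[k + 1] += res[k] / 10;
        res[k] %= 10;
    }

    int top = la + lb - 1;
    while (top > 0 && res[top] == 0) top--;     /* skip leading zeros */
    int k = 0;
    while (top >= 0) out[k++] = (char)('0' + res[top--]);
    out[k] = '\0';
}

int main(void) {
    char result[64];
    long_multiply("1234", "5678", result);
    printf("%s\n", result);                     /* 7006652 */
    return 0;
}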
>The "complicated calculations" thing and the /diy/ architecture thing are in two different ballparks. A computer is merely to a mathematician what hammer is to a carpenter; you sound like a carpenter who decided "I've become fascinated with hammer production"
Being a math student is not the be-all and end-all of my existence. I understand that nowadays hyperspecialization is favored for economic efficiency and that, for better or worse, we live in a bourgeois society in which one's identity and worth are very much related to one's profession. But that does not mean I'm willing to limit my horizons for the sake of autistic efficiency or because I'm socially expected to do so. The very reason I became interested in mathematics is math's fundamental nature, as it plays a role in almost any aspect of science and more generally in our lives; you could say I strive for a certain independence and self-reliance. Not in a practical sense, otherwise I would live in the woods like the Unabomber, but at least in a theoretical way. There are things I wish to see and understand.
William Gray
>Any dimwit who browses Jow Forums (either /lit/ or /sci/) knows about Library Genesis. It's not rocket science
No, any dimwit who browses Jow Forums has HEARD about Libgen. Anyone who KNOWS about Libgen has had to have a reason to use it.
I'm still unsure about that, as I am fairly ignorant about electronics, and that's the reason why I made the thread, hoping to get some input. I guess I have already gotten some good pointers so far, and I will start studying the topics mentioned in this thread and try to figure out what "from scratch" means to me. I suppose what I really want is to start from the most elementary parts available; and if that's not viable for a homemade project, I'll try to compromise and start from something less elementary and more ready-made.
Jaxson Lee
I was hoping for something powered by electricity rather than human labor.
>Anyone who KNOWS about Libgen has had to have a reason to use it
You can use LibGen to download "A Song of Ice and Fire", if you want. Just because you have a reason doesn't mean it's a good reason, or a reason that implies you're smart.
But what if you want a mobi or azw3 so that you can read in public on your Kindle while you sit at Starbucks?
Gavin Parker
Pretty sure a kindle can download a pdf
Ayden Scott
In the C programming language, the header math.h gives you lots of math functions to play with, but here's an example of ignoring that and doing it yourself instead: mindspring.com/~pfilandr/C/fs_math/fs_math.calgorithms
Is this interesting to you? This is how you would write complicated math functions from simpler pieces like add, subtract and multiply. Less transparently, you also have the concept of re-usable functions and loops.
At any rate, learning C and writing stuff like the above would be a good place to start. If you wanted to write code like that on a microprocessor of your own design, you'd want to write your own assembler anyway, which could be done in C.
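As a rough illustration of the same idea (my_exp is a made-up name, not something from that link or from math.h), here's exp() assembled from nothing but add, multiply and divide by summing its Taylor series:

#include <stdio.h>

/* exp(x) = 1 + x + x^2/2! + x^3/3! + ...; keep adding terms until
 * they stop changing the running total. Illustrative only. */
double my_exp(double x) {
    double term = 1.0, sum = 1.0;
    for (int n = 1; n < 200; n++) {
        term *= x / n;                  /* next term = previous * x / n */
        double next = sum + term;
        if (next == sum) break;         /* converged at machine precision */
        sum = next;
    }
    return sum;
}

int main(void) {
    printf("%f\n", my_exp(1.0));        /* ~2.718282 */
    return 0;
}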
Nicholas Gomez
>This is probably going to sound like a stupid question, but I'm gonna ask anyway: How viable is it to build a homemade calculator?
You can build a piece of electromechanical hardware that performs addition of natural numbers up to, say, a million (or rather, 2^20) using a couple dozen relays, some switches and lights for binary input and output, and a soldering iron. The relays are the only piece of primitive logic you need to take for granted for this, along with some basics of electrical systems (voltages and currents and shit).
That is probably about the limit of what you can practically do in terms of what you can physically build and solder together. You can get further than this, and possibly go as far as a simple instruction machine using a couple of hundred relays, but at that point you are well into diminishing returns regarding what you are learning from actually building the physical thing, and you'd be better off just studying the theory, and maybe playing with the logic in a simulator rather than the real physical hardware.
Brayden Robinson
You can convert PDF to mobi
The old-school ones can't display PDFs, but I think Calibre can convert.
Luis Butler
Pretty sure it looks like shit. That's why there are things like Calibre and other programs to convert ebooks.
Easton Edwards
Actually I do know some C, and that's what originally made me interested in computers. But I find it very unsatisfying to play around with high-level languages (or mid-level, maybe; I'm not sure what level C is supposed to be) while the machine itself remains a black box to me. Apparently there are people in this world who enjoy coding while possessing a somewhat limited understanding of what's really going on inside the machine, but personally I'm more interested in finding out how exactly the machine works instead of playing around with a black box without really understanding what's going on beneath the surface.
Nathan Phillips
I don't fully understand everything you said (I'm rather ignorant, as I've pointed out before), but this sounds like something I might be interested in doing.
Gabriel Young
So using an FPGA is overkill, but soldering a gorillion relays isn't? Really?
Mason Lopez
Here's your crash course in how computers "think"
"A" and "B" are boxes that hold numbers. "A" and "B" are connected to four machines that add, subtract, multiply and divide respectively. The combination of add, subtract, multiply and divide machines is called an arithmetic logic unit (ALU). A switch feeds back the result of exactly one of the ALU machines back into the register "A", which accepts the fed-back value every clock cycle. (Yes, each computation is computed EVERY time, and we throw away 3/4 of the results)
The switch is controlled by four commands: "00", "01", "10", "11", mapping to the ALU functions respectively.
Thus, we could say "00000101" is a program in machine code that adds B to A twice, then subtracts B from A twice.
There, we've designed a super-shit computer and have defined our own machine code. It clearly sucks because we can't change the value of "B" as drawn, or load/write to memory. If you like this A LOT, go to your nearest university and get a degree in Computer Engineering.
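If you want to poke at that machine without any hardware, here's a tiny C model of it (all names are mine); it runs the "00000101" program from above with A starting at 10 and B fixed at 3:

#include <stdio.h>

int main(void) {
    int a = 10, b = 3;
    const char *program = "00000101";   /* 00 00 01 01: add, add, sub, sub */

    for (int pc = 0; program[pc] && program[pc + 1]; pc += 2) {
        /* The ALU computes all four results every cycle; three are thrown away. */
        int add = a + b, sub = a - b, mul = a * b, div = (b != 0) ? a / b : 0;
        int opcode = (program[pc] - '0') * 2 + (program[pc + 1] - '0');

        switch (opcode) {               /* the switch that feeds A */
        case 0: a = add; break;
        case 1: a = sub; break;
        case 2: a = mul; break;
        case 3: a = div; break;
        }
        printf("opcode %c%c -> A = %d\n", program[pc], program[pc + 1], a);
    }
    return 0;
}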
A relay sounds like a simpler component than an FPGA. I understand your complaint, but from my perspective "overkill" is using ready-made advanced technology. An electromechanical system sounds simpler than an integrated circuit that was fabricated for the explicit purpose of being programmed by the customer. Of course, it's "simpler" only in theory, and in practice it might be a hassle to solder a gorillion relays.
Oliver Gutierrez
A relay is an electromechanical component that can be used as a base for nearly all computational logic machinery. Roughly speaking, it is an electrically-controlled light switch: it makes, or breaks, an electrical connection between wires A and B, depending on whether or not there is current running through a third wire C.
To a first approximation, all modern electronics are made out of transistors, which again to a first approximation are the electronic equivalent of a relay, doing much the same job; the key difference being that transistors can operate something like nine orders of magnitude faster, and can be made smaller and cheaper than relays by a similar degree. But for a sufficiently small piece of logic, you can build it entirely out of relays, and watch the things physically click as they perform computations. So you can come to understand major chunks of modern computing by working with relays a bit, and extrapolating from there.
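To make the analogy concrete, here's a toy C model of relay logic (my own naming, and a big simplification of real wiring): a normally-open contact passes the signal only while its coil is energized, a normally-closed one does the opposite, and gates fall out of putting contacts in series (AND) or in parallel (OR):

#include <stdio.h>

int relay_no(int coil) { return coil ? 1 : 0; }          /* normally open   */
int relay_nc(int coil) { return coil ? 0 : 1; }          /* normally closed */

int gate_and(int a, int b) { return relay_no(a) & relay_no(b); } /* series   */
int gate_or (int a, int b) { return relay_no(a) | relay_no(b); } /* parallel */
int gate_not(int a)        { return relay_nc(a); }

int main(void) {
    /* XOR built only from the relay gates above: (a OR b) AND NOT(a AND b) */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("%d XOR %d = %d\n", a, b,
                   gate_and(gate_or(a, b), gate_not(gate_and(a, b))));
    return 0;
}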
There's a paper on doing something like this here: web.cecs.pdx.edu/~harry/Relay/RelayPaper.pdf . If you're lucky, perhaps someone else in this thread can point you to some better resources.
The material in the first chapter of the "nand to tetris" book partially overlaps this stuff, but in a more abstract form, minus the "building this shit using actual electronics" parts. The remainder of that book is about building ever more elaborate systems out of these basic fundamental parts, all the way up to a Tetris game; but this is better suited to analytical study, rather than actually building it from relays, unless you want to solder enough relays to make up a small mountain.
Connor Turner
With an FPGA, literally nothing is premade. The innards of the chip get configured based on the code you write. Writing (good) code for them is like nothing else you will ever see, and the toolchains are often extremely finicky.
You'd likely be better off with a simple ARM chip or even something PIC-based. ARM toolchains are really easy to work with, and pretty easy to pop into QEMU. Would recommend.
Lincoln Ortiz
Understanding the theory is well and good, and I'm more than willing to learn (that's why I asked for book recommendations in this thread). But does it really work? Unless I try to put it together myself, how do I know whether the theory is truly correct? Should I just trust the authority of the people who write the books? I guess I'm just a very fastidious person. Here's an entertaining anecdote about myself:
I was taking a course in elementary Analysis and I noticed the professor was making liberal use of the trigonometric functions without bothering to define them. After the lecture I questioned him about it and he told me that I already knew the definition from high school. He showed me the high-school definition of trig functions using the unit circle and the counterclockwise measure of angles, and I asked him if this was the definition I was supposed to use; he said that it was. Then I pointed out that such a definition implies the assumption of the axioms of elementary geometry plus certain vague, intuitive notions like "counterclockwise movement" which would be fairly difficult to axiomatize; I added that in Analysis we are supposed to make use only of logic, set theory, and the axioms of the Real Numbers (or Peano axioms, if you wanna start from the very beginning), so making use of elementary geometry and intuition seemed out of place. The prof looked kind of impressed and kind of pissed at my observations, and he eventually gave up and said: "You're right, that's not how you define trig functions in Analysis. You can define them analytically with certain integrals or with infinite series, but that's a little too advanced for a first course." That prof later told me that I combine skepticism and rationalism in a dangerous way and that my life will be unhappy if I don't try to trust people more.
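(For the record, the series definitions he was alluding to need nothing beyond the real numbers and convergence of power series:
sin(x) = x - x^3/3! + x^5/5! - x^7/7! + ...
cos(x) = 1 - x^2/2! + x^4/4! - x^6/6! + ...
with pi then definable as twice the smallest positive zero of cos.)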
Kevin Robinson
Thanks for the explanation. Sounds very interesting.
Michael Long
>with an FPGA, literally nothing is premade
Maybe as far as the software goes, but the hardware seems to me like premade advanced technology. I was looking for something simpler and more primitive, so that I might be able to put it together myself. Nonetheless trying to play around with an FPGA might be somewhat fun for its own sake, even if it's not what I was originally looking for.
Owen Gutierrez
There's fortunately much less theory in digital electronics, where things are more about learning a library of good designs to work from. Most of the analog/semiconductor physics shit goes away by saying "just clock it slow enough that things are guaranteed to converge to the state we wanted".
Things work the way you have designed them to work. I could implement the "Add" machine as posted above to always emit A+B+4 for example, I am certainly not forced to design it in a sane way. Designing computers means defining the operations yourself, and you can add whatever you want.
John Roberts
Couple weeks into calculus 1 now, doing well, already past the chain rule and beyond. Quotient rule was a joke. Product rule remains my specialty.
I ask my professor his thoughts on quantum mechanics and partial derivatives. He's impressed I know about the subject. We converse after class for some time, sharing mathematical insights; I can keep up. He tells me of great things ahead like series and Laplacians. I tell him I already read about series on Wikipedia. He is yet again impressed at my enthusiasm. What a joy it is to have your professor visibly brighten when he learns of your talents.
And now I sit here wondering what it must be like to be a brainlet, unable to engage your professor as an intellectual peer. All of the deep conversations you people must miss out on because you aren't able to overcome the intellectual IQ barrier that stands in the way of your academic success... it's so sad. My professor and I know each other on a first-name basis now, but I call him Dr. out of respect.
And yet here you brainlets sit, probably haven't even made eye contact with yours out of fear that they will gauge your brainlet IQ levels.
A true shame, but just know it is because I was born special that I am special. I can't help being a genius, nor can my professor. Two of a kind is two flocks in a bush.
My point is that I would like to see the physics in action at least for the most elementary instances of the theory. No matter how many functions I code, if I'm working on a "black box" machine I will never know for sure how the machine produces a certain output when given a certain input.
Asher Murphy
An FPGA is like a bajillion relays connected together in every possible way. "Programming" an FPGA means telling an assistance chip to go in the FPGA and break any of the connections you didn't want.
If it makes you feel better, most design tools can emit a relay-level design schematic for you. You can go to bed feeling smart because you COULD have soldered it together by hand, you just saved some time.
Mason Turner
>You can go to bed feeling smart because you COULD have soldered it together by hand, you just saved some time.
But unless I try to solder it, how do I know it will really work that way? FPGAs seem well and good, but I would like to try and get my hands dirty at least once or twice to really find out how the thing works. After I've satisfied my curiosity, I might very well just use an FPGA to play around.
>My professor and I know each other on first name basis now, but i call him Dr. out of respect.
Sounds like a pederastic relationship in the making.
Dominic Mitchell
You're only gonna be left disappointed. You CAN learn about semiconductor physics, but textbooks generally will only cover models applicable up to about ~Pentium 4 level.
Modern transistors basically require computer simulation, and most manufacturers keep their physical models as internal secret sauce.
Even so, you're still talking about learning skills that ARE an undergrad engineering degree. Why not dual-major computer engineering and math (I did)?
>You're only gonna be left disappointed. You CAN learn about semiconductor physics, but textbooks generally will only cover models applicable up to about ~Pentium 4 level.
I don't wanna build a supercomputer all by myself. My humble goal was to build a calculator that could perform addition and subtraction and, possibly, multiplication and division.
>Why not dual-major computer engineering and math
Sounds like too much work and I am not sure I care that much about computers. I guess I'm only interested in the theory and the idea of building a calculator is kind of an "experiment" I want to carry out by myself to see how the theory works. I'm not planning on getting a job as a software engineer or electrical engineer.
Hudson White
That looks really cool. Too bad it would probably not fit in my room.
Mason Parker
I mean, if you want something that can calculate, that you build yourself, and that fits in a box: get an Arduino and fuck around with it.
Justin Powell
>only add, subtract, multiply, divide
In that case, you've already got a block diagram of what you want. Here's an amended version where B and the opcode are defined by you, and A is your output.
Use a bit width of 8 and this is easy to make by hand (multiply is harder, divide very hard).
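As a sketch of why multiply is the harder one, here's shift-and-add multiplication in C (names are mine), which is essentially what a hardware multiplier does with an adder, a shifter and eight clock cycles:

#include <stdint.h>
#include <stdio.h>

/* 8-bit shift-and-add multiply: for each set bit of b, add a copy of a
 * shifted into the right position. Only addition and shifts are used. */
uint16_t mul8(uint8_t a, uint8_t b) {
    uint16_t product = 0;
    uint16_t shifted = a;
    for (int i = 0; i < 8; i++) {
        if (b & 1)                 /* if this bit of b is set...           */
            product += shifted;    /* ...add the correspondingly shifted a */
        shifted <<= 1;
        b >>= 1;
    }
    return product;
}

int main(void) {
    printf("%d\n", mul8(25, 11));  /* 275 */
    return 0;
}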
No prob man. Glad you found it, as I know how people feel on here about sharing, but I would have set it up with a direct link and let you scan it.
I'm more into higher-level and programming stuff, but I have a friend who has gone balls deep into Verilog, FPGAs and that other one that is a more basic kind of FPGA (for lack of a better thing to say).
The Art of Electronics is one of the keystones of understanding all this. Even if you don't need all that it offers, it is worth a skim and to have ready.
Practical Electronics for Inventors
Electronic Principles Into Circuits (~10th edition)
Technician's Guide to Programmable Controllers
Those are nice too depending on which way you are aiming.
Ian Murphy
I programmed a base-60 calculator; I could send you the code if you want to try and dissect it. I didn't build the phone it runs on, though.