On the first day of the first course as a CS major, the prof said something like "in this course you'll learn how computers work from the bottom up", turned to the blackboard, and wrote Maxwell's equations. We laughed, and he said, "You're laughing because I got it wrong, no? You need quantum mechanics for transistors".
We actually ended up starting at logic gates, flip-flops, and latches. But the joke was a great reminder that your floor is someone else's ceiling.
I once taught a class, "Software as a Second Language: Computer Programming for Non-Majors", at UCSB/CCS. The audience very rapidly "why?"'d me down to "I honestly don't know, class dismissed: come back next week". I came back with a box of 12V relays and restarted the class from logic gates up, building a toy architecture with a usable machine language. Of course, I started at magnets (since, given magnetism, I can explain pretty easily how a relay works), but I had to tell people that not only would you have to trust me that magnets worked at all, but that it isn't even clear anyone knows how they work anyway (though for those curious, the Physics students in the class would likely be happy to talk about it ;P).
I've only read a little bit of the 2nd book (IIRC I stopped reading and lost interest when I found a very obvious and repeated error), but all of the 1st one. The 1st one is highly recommended.
IMHO the trend of teaching CS top-down doesn't encourage understanding of how computers actually work, which turns out to be quite important in many situations, especially for debugging and for knowing what computers can and cannot do. I believe that a lot of the problems with software today (bugs, inefficiency, overcomplexity) could be eliminated, or at least greatly reduced, via a deeper understanding of these basic concepts, since the majority of programmers seem to have only a very vague idea of e.g. memory, number representation, CPU operation, etc.
I think CS should start at the lowest level above EE, with binary representation of data and basic digital circuits (without, e.g., going into the physics of how transistors work). Have students build basic CPUs, write/enter programs for them, and run them by hand to see computation happening at this level. Then proceed up to Asm, C, and higher-level languages to write more complex programs, maybe even a simple operating system, accompanied by some of the more theoretical material (e.g. algorithmic complexity), keeping the theme that there is a practical use for learning these things.
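To make the "see computation happening" idea concrete, here is the kind of toy exercise I have in mind (a minimal sketch of my own, not from any particular curriculum): gates modelled as C functions, composed into a half adder, so students can trace single-bit addition by hand and then run it.

    #include <stdio.h>

    /* Model each gate as a function on single bits (0 or 1). */
    static int and_gate(int a, int b) { return a & b; }
    static int xor_gate(int a, int b) { return a ^ b; }

    /* A half adder: sum = a XOR b, carry = a AND b. */
    static void half_adder(int a, int b, int *sum, int *carry)
    {
        *sum   = xor_gate(a, b);
        *carry = and_gate(a, b);
    }

    int main(void)
    {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++) {
                int sum, carry;
                half_adder(a, b, &sum, &carry);
                printf("%d + %d -> carry %d, sum %d\n", a, b, carry, sum);
            }
        return 0;
    }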
Funny you mention that Petzold book; I read that as an IT student doing airy courses about software requirements and design and realized I had to get across to the electrical engineering building. I didn't have the prereqs for the OS course, so I did the first assignment from the previous year and sent it to Gernot and begged to be let in. He ended up my thesis supervisor.
(The other person I remember not being totally thrilled with the IT courses was Scott Farquhar, of billionaire Atlassian fame)
So that book can change your life, and is the first thing I tell people to read who show an interest in the systems area
Well, people could easily lose interest if they come into the field wanting to be able to make computers do cool stuff, and they're stuck with circuits and binary for a while before even being able to do that. Those are fine coming later after learning at least some amount of higher-level code in which you can explore the potential of what you can make happen with programming. Being able to see some of your code make a simple game work or something; I feel like that's an important step. It's easy for us, already fully immersed, to say that beginners should or would be interested to know all of the logic gate-level mechanisms, but it's going to be off-putting for most.
I consider it a "learning to walk before running" type of thing. It may not be interesting to a lot of beginners, many of whom probably want to immediately start doing something "interesting", but these are the fundamental basics that anyone claiming to have a CS degree should know.
Starting with higher-level languages tends to breed more misconceptions, because students get used to how things "seem to work", and those vague, superficial notions are much harder to unlearn later than the truth is to learn at the beginning.
It's also not as if things like logic, data representation, and the sequential nature of programs don't matter at higher levels --- whenever one uses conditional statements and loops, a good grasp of boolean logic is essential. Why numbers have a finite range, floating point values aren't exact, etc. all depend on knowing how data is represented. Not knowing these things will at best make it difficult to reason about code, and increase the chance of introducing bugs; at worst, it can lead to security flaws.
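Both points fit in a few lines of C; a throwaway illustration (mine, not from the book):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* Integers have a finite range fixed by their width. */
        printf("int is %zu bits: %d .. %d\n",
               sizeof(int) * CHAR_BIT, INT_MIN, INT_MAX);

        /* 0.1 has no exact binary representation, so repeated addition drifts. */
        double d = 0.0;
        for (int i = 0; i < 10; i++)
            d += 0.1;
        printf("0.1 added ten times = %.17g (equal to 1.0? %s)\n",
               d, d == 1.0 ? "yes" : "no");
        return 0;
    }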
+1 for CODE, I can't recommend that book enough... and when you start talking about Morse code, telegraphs, Braille, Ohm's law, and binary math as your build-up... well, it really just connects a lot of mental dots.
"from the bottom up" and the _first_ chapter is called "General Unix and Advanced C"? This book may be useful, but I would look for another name (I wouldn't even mention computer science, but that may be because I agree with Dijkstra that that isn't science or about computers). I also think a generic computer science book shouldn't only use Linux as example OS.
Also, if chapter 1 is about _advanced_C_, why does chapter 2 have to introduce binary and hexadecimal notation? I think the book could be improved here, too; in general, the TOC feels unbalanced to me.
This book was created from my own experiences and from teaching 3rd year operating systems classes.
Students came into that class having done a 2nd year class (data structures and algorithms, I think) that used a lot of C. They really learnt "high-level" C -- enough syntax to get by in their course -- but most didn't really understand it; not enough to read the source to any kernel, at any rate. I don't imagine it's done like that today; this was a long time ago.
And most didn't really understand binary and hex, and how they relate to code, either. You'd be amazed how many 3rd year students didn't know how 2^10, 2^20, and 2^30 relate to kilobytes, megabytes, and gigabytes.
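It's the sort of gap a two-minute experiment would close; a trivial snippet just to make the point:

    #include <stdio.h>

    int main(void)
    {
        /* The binary kilo/mega/giga prefixes are just powers of two. */
        printf("2^10 = %ld bytes (one kilobyte)\n", 1L << 10);
        printf("2^20 = %ld bytes (one megabyte)\n", 1L << 20);
        printf("2^30 = %ld bytes (one gigabyte)\n", 1L << 30);
        return 0;
    }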
The book needs a lot of editing, something I occasionally get to. But apart from the odd times someone posts it to here or reddit and people consider it like a text book, it's really only ever accessed by people googling for a specific topic. But suggestions are welcome and I'll consider it when I get some time.
And on the "it's not computer science" point: the first commit was 10 years ago, and people have been telling me that ever since. It was originally dreamed up as a course I'd teach to high-schoolers over 10 weeks to prepare them better for university -- for the "computer science" degree they were planning to take. Yes, the book is really more about what would be called "systems" in the outside world.
> I don't imagine it's done like that today; this was a long time ago.
Sadly, it still is in many (most?) places' CS curricula.
> And most didn't really understand binary and hex, and how it relates to code either. You'd be amazed most 3rd year students didn't know how 2^10, 2^20, 2^30 relates to kilo, mega, giga bytes
I've worked with graduates who didn't know binary, didn't understand signed/unsigned overflow and how integers wrap around, and, perhaps even more disturbingly, had never used a command line or didn't know how to do a lot of other things that could be called "basic computer literacy for CS students"... because almost all they were taught was the mechanised steps of how to open an IDE, write some code, and click the Build/Run buttons. I think your book would be suitable for those students.
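The signed/unsigned confusion in particular only takes a couple of lines to demonstrate; here's the sort of two-minute example I wish they'd been shown (mine, purely illustrative):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* Unsigned arithmetic is modular: UINT_MAX + 1 wraps around to 0. */
        unsigned int u = UINT_MAX;
        printf("%u + 1 = %u\n", u, u + 1u);

        /* Signed overflow is undefined behaviour in C: INT_MAX + 1 is not
           guaranteed to wrap, and compilers may assume it never happens,
           which is a common source of subtle bugs. */
        printf("INT_MAX = %d (adding 1 to it is undefined)\n", INT_MAX);
        return 0;
    }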
> But suggestions are welcome and I'll consider it when I get some time.
I think for the purpose you created the book, the organisation is fine; it appears to be a collection of selected topics instead of one meant to be coherently read from start to end.
It's great work, but the title is still misleading. Computer Science deals with more than just operating systems. I'd have expected at least some more discrete mathematics before it could be called "computer science".
This criticism, by the way, should not be seen as in any way detracting from the amazing job you have done there! Thank you for publishing this.
Yeah, I don't really understand that either. At first I thought it was because it was necessary in order to get the background to do programming exercises in later chapters, but there don't seem to be any programming exercises.
I think it would make more sense to put that chapter last.
"Not everyone wants to attend shop class. Most people only want to drive the car, not know how to build one from scratch. Obviously any general computing curriculum has to take this into account else it won't be relevant to its students. So computer science is taught from the "top down"; applications, high level programming, software design and development theory, possibly data structures. Students will probably be exposed to binary, hopefully binary logic, possibly even some low level concepts such as registers, opcodes and the like at a superficial level.
This book aims to move in completely the opposite direction, working from operating systems fundamentals through to how those applications are compiled and executed."
Yeah, I feel like this is trying to answer the question "What is a computer" instead of "what is computer science". Still a really interesting question, but definitely not what I was expecting.
That seems intentional. This seems like it's trying to be a "fundamentals of Computer Science" type thing rather than actual Computer Science. It's the stuff you need to learn first before you can move on to actual Computer Science.
Computation. So: what is computable, algorithms, analysis of algorithms, etc.
None of which you need a computer for.
An operating system in itself is an application of computation. It's like physics and a car. There is stuff in an operating system which is computer science, like how to schedule tasks, but learning a specific OS is not computer science.
Programming languages are computer science, since they describe computation. Compilers are a fuzzier case.
People were doing computer science long before computers existed.
I think at its core computer science is solving problems using mechanical steps (algorithms), although some people would say it's even more general than that.
I've started to do something quite similar on my blog. I've been mentoring some web developers who only know HTML and CSS, and I've been using the time to write about the whole stack.
My goal is to build a roadmap, not to write a recipe book.
I've just started, but this is going to be a very useful reference for when I need to explain some of the more complex ideas.
For someone who is just finishing a CS degree, this is great for a recap. I printed the PDF and will go over everything I've learned in the past few years.
Would've been nice if there was some "Theoretical CS Bottom-Up" too. From what I see this book doesn't have anything about computability, complexity, automata, algorithms, data structures, etc. Those abstract math-ish parts of CS are often overlooked by hackers, and are quite beautiful IMO.
I like the introduction with a "new" approach, even though I do not necessarily agree with it. Speaking in 0's and 1's is extremely daunting for someone who wants to learn about CS, IMHO.
This is not "bottom up". Bottom up is not ultra-reductionism. Bottom up is not learning writing & literature from the sequential arrangement of ink and space (≈ Taleb), though it is technically correct to say so. Below is something very similar.
"At a purely formal level, one could call probability theory the study of measure spaces with total measure one, but that would be like calling number theory the study of strings of digits which terminate. At a practical level, the opposite is true: " - Terrence Tao [1]
I haven't read any of the chapters of this, so my comparison may be a bit off, but you might check out The Elements Of Computing Systems[1]. It goes from logic gates to compilers.
I thoroughly enjoyed this book, but it doesn't go into nearly as much detail as I would like --- it gives a vague, general concept of how things work and some pointers on how to implement them, but doesn't really get into real systems, often relegating major features to "optimizations", etc.
If you're looking for a really great book on the basics of systems with a particular focus on Unix/Linux and C (like what this tutorial seems to be... rather than all of computer science), I don't think you could do better than "Computer Systems: A Programmer's Perspective" by Bryant and O'Hallaron.
For something covering the "lower half" of this material, I highly recommend Patterson and Hennessy's "Computer Organization and Design". I'm currently reading the 5th edition, and have gotten to the last chapter, which is on parallel processors.