
Book Review: Code: The Hidden Language of Computer Hardware and Software

By Peter Bell, September 16, 2019

Whether you want to become a software engineer, or just better understand the technologies that power our world, there’s real value in learning how computers actually work. Because computers are so complicated these days, it can be hard to figure out where to start.

The answer, according to Charles Petzold, is to go back in history to build an understanding of the foundational concepts, moving slowly from the telegraph equipment of the 19th century to the graphical user interfaces of the 1980s. In "Code: The Hidden Language of Computer Hardware and Software," Petzold takes us from Morse code to the early microprocessors of the 1970s and 1980s, providing a deep and satisfying explanation of exactly how computers function.


What is code?

The book starts off by going straight to the heart of the title, explaining what a code actually is. Not, initially, a series of steps describing operations you'd like a computer to perform, but rather the very simplest codes, like Morse code: a system for transferring information between people and/or machines.

In addition to being a good introduction to the idea of codes, Morse is also useful because it is binary. That is, it's composed of two values — dots and dashes — used in various combinations to represent all of the letters of the alphabet. And that's something we'll see a lot more of as the book progresses, since most computers also use a binary code (with 1s and 0s) to represent information and instructions.
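To see what "binary" means here, a minimal sketch (a four-letter excerpt for illustration, not anything from the book itself) shows how Morse spells words from just two symbols:

```python
# A tiny excerpt of the Morse table: every letter is a sequence
# drawn from exactly two symbols, dot (.) and dash (-).
MORSE = {
    "S": "...",
    "O": "---",
    "E": ".",
    "T": "-",
}

def encode(word):
    """Translate a word into Morse, separating letters with spaces."""
    return " ".join(MORSE[letter] for letter in word.upper())

print(encode("SOS"))  # ... --- ...
```

Swap dots and dashes for 0s and 1s and you have, in spirit, the same scheme a computer uses.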

Just add electricity

In the spirit of starting from the basics, Petzold also provides an introduction to electricity by introducing “the anatomy of a flashlight.” In doing so, he explains how electrons are moved through a circuit by a voltage, explaining what voltage, current, and resistance really mean. 

After introducing both codes and electricity, Petzold examines the telegraph - a way to use electricity to transmit encoded information. The telegraph also marks the beginning of modern communication. As Petzold notes, "For the first time, people were able to communicate further than the eye could see or the ear could hear and faster than a horse could gallop."

One challenge with the telegraph is that the signal becomes weaker with distance. If you wanted to send messages across the US, you'd need to find a way to make the signal stronger every so many miles. To handle that, Petzold introduces the idea of a relay — something that takes a small, weak current and uses it to flip a switch in another circuit, which can carry a bigger current. At the risk of a spoiler, he is also effectively introducing us to the action that transistors perform. With a transistor, a small current on one of its three terminals lets you switch a bigger current between the other two terminals — just like a relay, but in a much smaller and more efficient package. And transistors are the primary building blocks of most modern digital electronics — including computers.
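The essence of a relay (or a transistor used as a switch) can be sketched as a function: a small control signal decides whether a separate, larger current flows. This is a toy model, not anything from the book:

```python
# Toy model of a relay: the control signal doesn't carry the output
# current itself; it merely switches a fresh, stronger circuit on or off.
def relay(control_on, supply_current):
    """Return the output current: full supply when the control signal
    energizes the electromagnet, zero otherwise."""
    return supply_current if control_on else 0.0

# A weak incoming telegraph signal is just strong enough to trip the
# relay, which then drives the next stretch of wire at full strength.
weak_signal_detected = True
outgoing = relay(weak_signal_detected, 5.0)
print(outgoing)  # 5.0
```

Chaining these every so many miles is exactly how a long-distance telegraph line regenerates its signal.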

Binary basics

Petzold then takes a detour to introduce "base" systems, working down from decimal (ten distinct digits, 0-9, before you have to add another digit to represent "10") through octal (you only have 8 digits) all the way down to binary, where you only have 0s and 1s. He even works through some examples. "Tie a yellow ribbon" is a binary signal with only two possible states, whereas Paul Revere's "one if by land, two if by sea" required a couple of lanterns to convey the three possible states!
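The mechanics of working a number down through the bases can be captured in a few lines; this sketch (mine, not the book's) repeatedly divides by the base and collects the remainders:

```python
def to_base(n, base):
    """Write a non-negative integer n in the given base by repeated
    division: each remainder is the next digit, least significant first."""
    digits = []
    while n:
        digits.append(str(n % base))
        n //= base
    return "".join(reversed(digits)) or "0"

# The same value in decimal, octal, and binary:
print(to_base(13, 10))  # 13
print(to_base(13, 8))   # 15
print(to_base(13, 2))   # 1101
```

The fewer digits a base has, the longer the numerals get, which is why binary numbers look so verbose to human eyes.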

Just add algebra

A binary system is fine for conveying information, but to perform operations on that information, you also need Boolean algebra - a way to work with binary data. Petzold first introduces the basics of Boolean logic and then shows how you can combine it with electricity (starting with simple circuits using a light bulb and switches) to create logic gates. He shows how AND, OR, and NOT logic gates work and how they can be combined.
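The behavior of those three gates can be written as truth-table functions on 0/1 values; this is a sketch of the logic, not the relay circuits the book actually draws:

```python
# The three basic gates as functions on 0/1 values.
def AND(a, b):
    return a & b   # 1 only when both inputs are 1

def OR(a, b):
    return a | b   # 1 when at least one input is 1

def NOT(a):
    return 1 - a   # flips 0 to 1 and 1 to 0

# Gates combine: NOT of AND is a NAND gate, which by itself is
# enough to construct any other gate.
def NAND(a, b):
    return NOT(AND(a, b))
```

Composing a handful of these, as the next section describes, is enough to do arithmetic.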

From here, he shows how to build the capacity to add and subtract using just logic gates, and then introduces oscillators (circuits that flip back and forth between two states) and flip-flops - circuits built from gates that add memory to a system.
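The smallest piece of that adding machinery is a half adder; as a sketch (using Python's bitwise operators in place of gate circuits), the sum bit behaves like XOR of the inputs and the carry bit like AND:

```python
def half_adder(a, b):
    """Add two single bits, returning (sum_bit, carry)."""
    sum_bit = a ^ b   # XOR: 1 when exactly one input is 1
    carry = a & b     # AND: 1 only when both inputs are 1
    return sum_bit, carry

# 1 + 1 = binary 10: sum bit 0, carry 1
print(half_adder(1, 1))  # (0, 1)
```

Chain these (with full adders handling the incoming carry) and you can add numbers of any width, which is the route the book takes to an automated adding machine.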

Building a modern computer

Petzold then runs us through how to build an automated adding machine, building up the basic instructions of a microprocessor one at a time. He introduces the classic von Neumann architecture that underlies most computers, explains how semiconductors work, and then introduces two of the early, classic microprocessors of the mid-1970s - the Intel 8080 and the Motorola 6800.

Just add software

Finally, Petzold introduces the software required to turn a chunk of silicon into Excel, Pac-Man, or a website! He starts off by explaining the workings of a typical operating system, and then shows how higher levels of abstraction make it easier to write software than describing everything in machine code - which is extremely verbose and tedious. He then rounds out the book by explaining how computer graphics work and how they are used to deliver graphical user interfaces.
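To get a feel for what "higher levels of abstraction" buys you, compare the same task at two levels; the assembly shown in comments is an Intel 8080-style sketch for flavor, not a listing from the book:

```python
# Adding 2 and 3 in 8080-style assembly takes several explicit steps:
#     MVI A, 2   ; load the value 2 into the accumulator register
#     ADI 3      ; add the value 3 to the accumulator
# and you'd still need more instructions to store or display the result.
#
# In a high-level language, the whole program is one readable line:
result = 2 + 3
print(result)  # 5
```

Every layer — assembler, compiler, operating system — exists to hide that machine-level tedium.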

Summary

If you just want a couple of metaphors to get a sense of how elements of computing work, this is not the book for you. Rather than settling for explanations like "RAM is like your tabletop and your hard drive is like your filing cabinet," Petzold takes the time to provide a rich and nuanced introduction to the internal workings of modern computers.

Unusually, this book blends comprehensiveness with accessibility. If you're willing to take the time to work through the 380 pages, you'll never quite think about a computer in the same way again!


Peter Bell

Head of Data Science

Peter is a veteran technologist, CTO, entrepreneur, and longtime educator, having taught digital literacy at Columbia and authored numerous programming books.
