Code: The Hidden Language of Computer Hardware and Software — Book Review
Computers have become an integral part of our lives. They are the backbone of almost every business in the world; in fact, some of the most successful companies are centered entirely on computer hardware or software. But have you ever wondered how a computer works? This book aims to unravel the inner workings of one of the biggest inventions of humankind.
What is Code?
This book begins with the idea of communication over distance. Human beings developed language to be able to transmit information and ideas. Speech and writing are very effective ways of doing this, but what if we needed to communicate over a long distance? A more compressed way of making the exchange could be beneficial. From this problem arises the idea of encoding the letters of the alphabet in different ways: transmitting ideas through light or sound using Morse code, or embossing a sheet of paper using Braille. The same principle of encoding applies to numbers in the binary system, where only the numerals 1 and 0 exist, yet any number in the decimal system can be represented.
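To make the binary idea concrete, here is a minimal sketch of my own (not code from the book) showing how a string of 1s and 0s maps to an ordinary decimal number, with each position worth a power of two:

```python
# Convert a string of binary digits to its decimal value,
# accumulating powers of two from the most significant bit down.
def binary_to_decimal(bits: str) -> int:
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)
    return value

print(binary_to_decimal("1011"))      # 8 + 0 + 2 + 1 = 11
print(binary_to_decimal("11111111"))  # largest 8-bit value: 255
```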
Electrons and Electricity
Every atom is made up of three parts: neutrons, protons and electrons. Understanding how this last particle behaves is what led us to discover electricity. Electrons like to move around the atom, and electricity is just the flow of electrons running through a conductor, usually in the form of a wire; this flow is called electric current. Conductors are materials that permit electrons to flow freely from particle to particle. In contrast, insulators are materials that impede this free flow of electrons. But what if we could have the best of both worlds? A semiconductor is a material that conducts electricity under some conditions but not others. Scientists discovered that as some semiconductor materials were heated, they conducted electricity better. This eventually led to the development of transistors made of silicon.
Logic Gates and Memory
The main building blocks of computer logic are the AND, OR and NOT gates. These were later combined into more complex gates such as XOR and NAND, and eventually gave rise to a whole new category: memory. The ability to retain and store information was first achieved when the flip-flop was invented. Flip-flops and latches are the fundamental building blocks of the digital electronic systems used in computers, communications and many other applications; the flip-flop is the basic storage element in sequential logic. After refinements to the design, many flip-flops were clustered together to store many bits of information, and RAM, or Random Access Memory, was born.
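As a rough illustration (again my own sketch, not circuitry from the book), the basic gates can be modeled as tiny Boolean functions, and two cross-coupled NOR gates form an SR latch, the simplest circuit that "remembers" a bit:

```python
# Basic logic gates modeled as Boolean functions.
def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a
def NOR(a, b): return NOT(OR(a, b))
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

# An SR (set/reset) latch built from two cross-coupled NOR gates.
# Feeding each gate's output back into the other is what stores the bit.
def sr_latch(set_, reset, q):
    for _ in range(2):            # let the feedback loop settle
        q_bar = NOR(set_, q)
        q = NOR(reset, q_bar)
    return q

q = False
q = sr_latch(True, False, q)      # set   -> q becomes True
q = sr_latch(False, False, q)     # idle  -> the latch remembers True
q = sr_latch(False, True, q)      # reset -> q becomes False
print(q)                          # False
```

Adding a clock signal and combining latches yields the flip-flops the book describes, and arrays of flip-flops form addressable memory.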
Digital Computers and Instructions
Combined with circuits for arithmetic, RAM made it possible for a machine to add and subtract binary numbers and to store and modify values in memory. When transistors appeared, memory capacity increased enormously, and many more operations were added to the machine's instruction set. These instructions were written in assembly language, a low-level programming language that maps directly onto the computer's hardware and is translated by an assembler. Learning to write and assemble instructions in this language was hard and time consuming; worse, programs could not be carried over to other computers because they were tied to a specific processor. This led to the rise of what we now call high-level programming languages. These instructions were easier to write and, more importantly, could be used on many different computers. A compiler was needed to translate the instructions into machine code the computer could understand.
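A quick way to see the gap between a high-level language and the low-level instructions beneath it is Python's built-in dis module, which prints the simple step-by-step instructions the interpreter executes for a one-line function (these are bytecodes for Python's virtual machine rather than real processor assembly, but the translation idea is the same):

```python
import dis

# One high-level statement...
def add(a, b):
    return a + b

# ...is translated into several low-level instructions:
# roughly load a, load b, add them, return the result.
dis.dis(add)
```

On a real machine, a compiler performs the equivalent translation, emitting machine code for one specific processor.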
Graphical Revolution
In the early days computers weren’t interactive at all: switches and cables were used, then computers transitioned to punched cards as input devices and printed results on paper as output. That was it. No screen was involved. It wasn’t until the 70s that a screen was first introduced to serve as an output device displaying ASCII characters. It was still difficult to navigate the computer’s contents; the user needed to know computer commands and a programming language. Almost a decade later, Apple and Microsoft released Macintosh System 1 and Windows respectively. These were among the first attempts to make a friendly GUI, or Graphical User Interface, where users didn’t need to know how to code to use the computer. This changed everything and truly positioned the computer as a home item everybody could own.
Final Thoughts
Overall this is a great book for understanding the inner workings of computers and, more importantly, for getting a grasp of all the changes these machines have gone through over the years: all the way back to 1918, when a circuit could store a single bit of information, up to modern times, when machines finally became usable by the general public thanks to the graphical revolution that introduced GUIs. In simple terms, it serves as a biography of the computer.
I think this is an extraordinary book for consolidating your knowledge of computer science, but I feel it is not for everyone. Although it is written in a very simple way so that anyone can understand it, once it enters the more complex topics (around Chapter 17) it becomes difficult to follow for someone without a technical or engineering background. I would really recommend it to anyone who develops software or hardware, or to anyone genuinely interested in learning how computers work. It may take a little patience and perseverance to comprehend many of the new concepts presented in the book, but it is definitely worth the effort.
I also feel it would be interesting to revisit the book and add some of the more recent advancements in technology, such as mobile devices, GPUs (Graphics Processing Units), TPUs (Tensor Processing Units), SSDs (Solid State Drives) and even cloud computing.
Score: 4/5
Link to the Book:
Disclosure: I only recommend products I really believe in, and all opinions expressed here are my own. This post contains an affiliate link; at no additional cost to you, I may earn a small commission.