One of the most challenging aspects of learning about computers and computing is forming an overview that adequately summarizes the minutiae but is still easy to remember. Most books about computers and computing either provide too much detail in one specific area or omit the big picture altogether. Part of the reason for this is that most people aren’t interested in computers and computing in general; instead, they are looking for specific information on how to do something they feel an urgent need to do. Another big part of the reason is that Computer Science as an academic discipline demands a far more sophisticated comprehension of computers and computing than most people interested in the general picture need. This text is meant to fill that gap by emphasizing the most general aspects of computers and computing, the ones with the widest application across sub-disciplines. It is meant for anyone who wants to become familiar with this aspect of our world that is composed of computers and those who use them …likely the reader.
So let’s jump right in!
The Bottom Line is Hardware
Computers are pieces of mechanical and electronic hardware. Everything computers do requires hardware of some kind. This includes seeing shit, moving the things you see, printing shit, sending this or that to somewhere else, connecting pieces of hardware to other pieces of hardware, creating data, storing data, automating things, making VoIP calls …all of it. When I was younger and more naive, I actually thought that with a regular desktop computer I could learn enough about it to do anything that other computers do. Shit, I actually thought that I could download RAM! But you can’t, and the bottom line is hardware: electronic machines.
Now, I’m sure by now you’ve had some nerd try to wow you with how big computers used to be and how little they could do, compared with how amazing your smartphone is. That’s all true, but chances are pretty high that what you care about isn’t really how much computation can happen in your pocket, but rather what you can do by tapping on virtual buttons. You’re also probably about as interested in how big a computer is as you are in how cheap and ugly it may look …which the nerds aren’t going to tell you anything about unless they’re Apple fanatics.
I am writing this to do the opposite of wow you. There are plenty of amazing things about computers and what they can do, but it’s more important to comprehend just how uninteresting they can be. A lot of stuff actually sucks on computers, especially anything that has to do with sensory or sensual experience. I think that computers should really be avoided for most interesting things, but definitely used for what they are amazing at. But before we get to what they are amazing at, let’s run through this overview of computing hardware.
First of all, as electronic machines, computers run on energy. All of them. They are all entirely useless without some method of obtaining and using energy. There are a lot of consequences of this fact, not least of which is environmental impact. And unless you are generating your own energy through solar, wind, heat, or whatever, computers are going to tie you to the electrical grid in one way or another. That’s not even accounting for the telecommunications grid(s) you’ll also wind up tied to. If you’re entirely comfortable with the vulnerability this creates in your life, that still at least means you’re going to take on an ongoing expense. And we all know what sorts of misery we are expected to endure so that we can pay our bills.
The fundamental thing that a computer does is …wait for it …..waaaaiiiit for it … …turn on and off. You think I’m kidding, but I’m not. Computer hardware all boils down to using some bit of matter that can exist in one of two states: on, or off. Now, you might ask “well what about circuits and wires? What about hard drives and keyboards? What about, what about, what about?” STOP. They are all just pieces of hardware that turn on …and off.
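To make that concrete, here is a minimal Python sketch that treats True and False as on and off. It builds a few familiar logical operations out of a single on/off gate — NAND, which really is enough to construct every other logical operation. (The function names here are just illustrative; this is a toy model, not how any particular chip is wired.)

```python
# Two "switches" that can each be on (True) or off (False).
def nand(a, b):
    """On unless both inputs are on -- the one gate from which
    every other logical operation can be built."""
    return not (a and b)

# Building NOT, AND, and OR purely out of NAND:
def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

print(and_(True, True))    # on
print(or_(False, False))   # off
```

Everything a computer computes, no matter how elaborate, is ultimately layers upon layers of little on/off decisions like these.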
On and Off
Making something that can turn on and off is pretty old school, and the mathematics that makes it possible to transform on/off into the computation we know today (Boolean algebra, worked out by George Boole in the 1800s) is even older. However, at every level of computing, this on/off binary capacity is the mud from which computers are molded. A few other capabilities are necessary to make something as useful as a general purpose, personal computer. And we’ll get to those capabilities soon. To begin with, though, let’s take a good look at on and off.
Blinks and Blips
First of all, there are a variety of ways that we can talk about on and off: positive and neutral charge, true and false outcomes, ones and zeros, etc. The important take-away though, is that in all of these binary systems it is possible to create patterns of on and off.
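Here is one such pattern, sketched in Python: the letter “A” is stored in most computers as the number 65, which is itself just a particular arrangement of eight ons and offs.

```python
# The letter "A" as a pattern of ons and offs: its character code (65)
# written out as eight binary digits.
letter = "A"
bits = format(ord(letter), "08b")
print(bits)               # 01000001

# And back again: the same pattern of ons and offs, read as a number,
# read as a letter.
print(chr(int(bits, 2)))  # A
```

The pattern is the whole trick. On its own, one on/off is nearly useless; arranged in agreed-upon patterns, ons and offs can stand for letters, numbers, sounds, pictures — anything.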
Optical discs (CDs, DVDs, Blu-rays, etc.) are a good example of a digital medium that stores data. CDs are made from an original “master” disc. The master is “burned” with a laser beam that etches bumps (called pits) into its surface. In the simplest picture, a bump represents the number zero: every time the laser burns a bump into the disc, a zero is stored there. The lack of a bump (a flat, unburned area on the disc, called a land) represents the number one. (Real discs are slightly cleverer — it is the transitions between pits and lands that mark the ones — but the principle is the same.) Thus, the laser can store all the information sampled from the original track of music by burning some areas and leaving other areas unburned. Although you can’t see it, the disc holds this information in a tight, continuous spiral of about 3–5 billion pits.
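As a sketch of the simplified scheme just described (pit = 0, land = 1) — the pit/land names are real, but the burn/read functions here are purely illustrative:

```python
# Simplified CD encoding from the text: a pit stores a 0, a land stores a 1.
def burn(bits):
    """Turn a string of ones and zeros into a sequence of pits and lands."""
    return ["pit" if b == "0" else "land" for b in bits]

def read(surface):
    """Read the sequence of marks back off the 'disc' as ones and zeros."""
    return "".join("0" if mark == "pit" else "1" for mark in surface)

data = "01000001"          # the letter "A" again
disc = burn(data)
print(disc)                # ['pit', 'land', 'pit', 'pit', ...]
print(read(disc) == data)  # True: burn it, read it back, same pattern
```

The physical medium changes — laser-etched bumps here, magnetized regions on a hard drive, charged cells in flash memory — but the job is always the same: hold a pattern of on/off in a way that can be read back later.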
Morse code is another good example of patterns of on and off.
But whoa, WHOA, hold ON a second — these patterns of on and off that we see in Morse code depend on an important concept that I didn’t even mention: time.
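To make the role of time concrete, here is a sketch using the standard Morse timings (a dot is one unit of on, a dash is three units of on, symbols within a letter are separated by one unit of off, letters by three units of off). The two-letter MORSE table is just enough for this example:

```python
# Standard Morse timing, rendered as equal time units:
#   dot  = 1 unit on      gap inside a letter   = 1 unit off
#   dash = 3 units on     gap between letters   = 3 units off
MORSE = {"S": "...", "O": "---"}  # just the two letters we need here

def keyed(word):
    """Render a word as a string of on (1) and off (0) time units."""
    letters = []
    for letter in word:
        symbols = ["1" if s == "." else "111" for s in MORSE[letter]]
        letters.append("0".join(symbols))   # one unit off between symbols
    return "000".join(letters)              # three units off between letters

print(keyed("SOS"))  # 101010001110111011100010101
```

Notice that the dots and dashes only mean anything because sender and receiver agree on how long a unit lasts. Without a shared clock, a dash is indistinguishable from three hasty dots — which is exactly why timing turns out to matter so much inside real computers, too.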
Putting Shit In
Taking Shit Out
Software… Fancy, Virtual Hardware
Hard Gates and Soft Gates
The So-Called “Operating System”
Shhhh… this is a Library: Programs and Programming
Everything You Saw Before, but Now Virtual(ized): the Internet