
The Ten-Minute Computer Science Course

Date/Time Permalink: 03/16/06 03:40:31 am
Category: HOWTOs and Guides

What, you say you still don't understand computers? Come here and sit down. We're gonna blow away all the obfuscation and lay it out in layman's terms.

Think of a light bulb in your house. You control it with a switch; flip it on and off. This is the world's simplest computer. It can store exactly one piece of data. That might not seem very useful, but a hotel has a sign that says "vacancy" or "no vacancy". It's controlled by a single switch. The switch can toggle between two states. We can call them "on" and "off" or we can use them to count: 0 and 1.

Now let's add two more switches, for a total of three, each wired to its own light bulb. Three lights, three switches. When a switch is up and its bulb is on, we'll call that "one". We'll call the other way "zero". By setting the switches in different combinations of on and off, we can get more numbers. All three off can be 0. The rightmost one on, that's one, or 001. Turn the right one off and the middle one on, that's two, as in 010. Turn the right switch back on for three: 011.

Using this system, we find we can count all the way from zero to seven: 000, 001, 010, 011, 100, 101, 110, 111. That's eight different values in all; always remember that zero is *also* a number! This method of counting is called "binary", and it's all a computer knows. We're simply setting little switches inside the machine to on and off to store ones and zeros. Our three-switch model could be used to tell a seven-room hotel how many of its rooms are occupied, for instance.
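If you'd like to see the three-switch counter run itself, here's a quick sketch in Python. The `"03b"` format just means "write the number as three binary digits":

```python
# Counting with three switches: every combination of on (1) and off (0),
# from 000 up to 111. Eight values in all, and the first one is zero!
for n in range(8):
    print(format(n, "03b"), "=", n)
```

Run it and you'll see the same list as above, 000 through 111, next to the ordinary numbers 0 through 7.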

Now expand the scale. We can't make computers more sophisticated than this simple 1-0 system (yet! they're working on better models!), but we can make millions of these switches very small and pack them into a very small space: that's a chip! And while we're at it, we can wire switches to something besides light bulbs. We can wire them to other switches, so that many switches have to be on before a certain light can come on. In this way, we can set up computer logic. Computer logic allows more sophisticated operations than mere data storage: we can set up "if-then" decisions.
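That "many switches before the light comes on" arrangement is what engineers call an AND gate, and we can fake one in a couple of lines of Python. The booleans stand in for our switches:

```python
# Computer logic from switches: three "switches" (True = on, False = off)
# feed one gate, and the "light" only comes on when ALL of them are on.
def and_gate(*switches):
    return all(switches)

print(and_gate(True, True, True))   # every switch on: light is on (True)
print(and_gate(True, False, True))  # one switch off: light stays off (False)
```

Chain enough of these together and you have the "if-then" decisions the paragraph above describes.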

Now, none of this makes any sense until we assign meaning to it. Otherwise, it's just ones and zeros being moved around. In the primitive days of computers, ones and zeros were all the machine could generate. It was years before it got sophisticated enough to handle data in any other format.

Your computer monitor is nothing more than a big bank of very small light bulbs! We call them pixels, and actually they're all drawn on one big tube, but forget it: we'll stick with light bulbs. When you type an "a" from the keyboard in a text-editing program, you see an "a" pop up on the monitor. It's all transparent to the user, but you're actually moving quite a few ones and zeros around. Your keyboard is nothing but a big bank of switches, and for a moment, you turned the "a" switch on and off. That delivered a binary number to the computer: 01100001. In decimal, that's 97, and the standard that assigns 97 to the letter "a" is ASCII, the American Standard Code for Information Interchange. In short, we have a system for translating all those ones and zeros into something meaningful!
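You can check the "a" story for yourself. Python's built-in `ord` and `chr` functions do the ASCII lookup in both directions:

```python
# The keyboard-to-binary trip for the letter "a".
code = ord("a")             # its ASCII code: 97
bits = format(code, "08b")  # the eight switches: 01100001
print(code, bits)           # prints: 97 01100001
assert chr(0b01100001) == "a"   # and the trip back again
```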

The computer system then sends the letter "a" to the screen, which, depending on which font you selected and what size it is, may turn on many pixels. All this just because you can't read and write binary - computers go to a lot of trouble for us humans.

The operation I have just detailed illustrates the only three things that computers can do: they can receive data, process data, and output data. That seems too simple, but it makes sense for every case you apply it to: Playing a DVD movie on your computer? Yes, that movie is stored as a bunch of ones and zeros on the DVD, which is read by the DVD player (the input device). The operating system has to have a special program that knows how to interpret that data as a video signal to be sent to the monitor, and a sound signal to be sent to the speakers (the data processing). Then you see the picture on the screen and hear the sound from the speakers (the output devices). How does the computer know what to do with the data? We make it up! All we have to do is agree on which groups of ones and zeros mean video and which mean audio, and then keep writing the code that way.
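That last point, that *we* decide what the ones and zeros mean, is easy to demonstrate. Here are the very same four bytes read three different ways; nothing about the bits themselves says which reading is "right":

```python
# The same ones and zeros mean whatever we agreed they mean.
data = bytes([72, 105, 33, 10])
print(data.decode("ascii"))          # read as ASCII text: "Hi!" plus a newline
print(list(data))                    # read as four small numbers
print(int.from_bytes(data, "big"))   # read as one big number
```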

Viewing a web page? That's your computer sending a "get" request to the computer where the web page is stored (your computer outputs a request to my computer's input). The hosting computer responds by sending lots of ones and zeros to your computer, which your computer then interprets into pictures and text. We can keep wrapping more and more layers of indirection over this, making it simpler for us but more complicated for the machines; fortunately, the machines can handle it. So we have the protocols: HTTP, Hyper-Text Transfer Protocol (the web's data-transfer language), guarantees that my computer sends the ones and zeros of my web page in a way that your computer has been programmed to understand. We have TCP/IP, Transmission Control Protocol/Internet Protocol, below that, which makes it so my computer understood your computer's "get the page" request. Then we have HTML, Hyper-Text Mark-up Language, a mark-up language that tells your web browser how to display the page.
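That "get" request really is just text, which in turn is just ASCII, which in turn is just ones and zeros. Here's a sketch that builds one by hand for a made-up host (no network connection involved, just the bytes your browser would send):

```python
# A minimal HTTP "get" request, spelled out by hand for a hypothetical
# host "example.com". This only builds the raw text; a browser would
# push these bytes down through TCP/IP to the other computer.
request = (
    "GET /index.html HTTP/1.0\r\n"
    "Host: example.com\r\n"
    "\r\n"
)
# And each character of it is stored as a binary number (its ASCII code):
as_binary = " ".join(format(byte, "08b") for byte in request.encode("ascii"))
print(request)
print(as_binary[:26], "...")   # the letters G, E, T as 8-bit numbers
```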

Meanwhile, dozens of input and output processes are running in the background the whole time. Your mouse has a program that checks it repeatedly, looking for changes of state (the input device). When it detects a changed signal, it looks up the code in its program table to determine what to do with it. The pointer's x-coordinate has decreased by two units, so the user must be moving the mouse left (the data processing)! Then it sends the signal to the monitor to redraw the part of the screen your mouse pointer isn't at anymore, and to draw the mouse pointer where you expect it to go. The data table is part of what we call a "driver": a piece of software that translates between your operating system and an individual piece of hardware.
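A toy version of that lookup table might look like this. The event codes here are completely made up; a real driver speaks a hardware-specific protocol, but the "look up the code, decide what it means" step is the same idea:

```python
# A pretend driver table: hypothetical event codes mapped to meanings.
EVENT_TABLE = {
    0x01: "pointer moved left",
    0x02: "pointer moved right",
    0x03: "left button pressed",
}

def handle_event(code):
    # The "data processing" step: turn a raw code into a decision.
    return EVENT_TABLE.get(code, "unknown event")

print(handle_event(0x01))   # prints: pointer moved left
print(handle_event(0x7F))   # prints: unknown event
```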

But there's a lot more going on, too! Your keyboard is run by a similar driver, your monitor is constantly refreshing, and your hard drives are controlled by another program for when you read and write data to and from the disk. Still more processing is done to manage RAM - random access memory - which is always needed as a temporary buffer to store data in (for instance, the data making up the page you're reading now).

Programmers have the same intolerance for complication that you do, so we create programs to help us explain to the computer what we want it to do. At the bottom is "machine language": the raw binary instructions the processor actually runs. Machine language is too difficult for any human to write in, so the next level above that is "assembly language", a human-readable shorthand for those instructions, which has to be run through a special program (an assembler) to translate it into binary. Assembly is not very popular to write in directly either, so - see where it's going? - over time we've come up with more and more generalized programming languages that are shorthand for assembly. The process of translating all those general instructions into machine language is called "compiling" or "interpreting", depending on the language.
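You can peek at one layer of this translation yourself. Python quietly compiles your source into "bytecode", an instruction stream one notch above machine language, and the standard `dis` module will show it to you:

```python
# Peeking under the hood: Python's dis module prints the low-level
# instructions that this one-line function compiles down to.
import dis

def add_one(x):
    return x + 1

dis.dis(add_one)   # prints instructions like LOAD_FAST (names vary by version)
```

Every one of those instructions is, in the end, just a recipe for flipping switches.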

Of course there's more - but it's just more and more of the same. This completes our ten-minute computer science course! And I'm done!

