
In Which I Explain A Thumbnail History Of Home Computing In A Response To An Email

Date/Time Permalink: 04/06/12 02:33:50 pm
Category: General

Foreword

Almost exactly one year ago, I received an email from a reader asking some questions about computing culture and where Microsoft, Apple, and Linux fit into the scheme of things. He mentioned that he was fine with me posting the text of the conversation on my blog. I meant to do that, but it got buried under the pile on my virtual desk until now.

Be advised that I love hearing from all of you, but I'm lucky if I get a chance to reply at all! And when I do, it isn't usually a four-page epic epistle with footnotes and citations like this one. But this one time, somebody asked just the right questions and gave me a wonderful opportunity to teach a fertile mind. Wherever he is today, I hope he went far in pursuing his dreams!


The original letter

Disclaimer: if you wish, you can answer publicly in a blog post and quote the email in full. I have no problem with it :)

Dear Penguin Pete,

I would like to ask you something which you might think is pro-Microsoft or pro-Apple - and it might be - but I'm really just interested in hearing what you think.

My question is this: without the efforts of Microsoft and Apple, would computers be as easy to use as they are now? What if the development of computer operating systems, and the way that computers behaved, had been done by teams of volunteers who wouldn't have had any money to conduct Human-Computer Interaction tests and refine the operating system, as well as the GUI and CLI, to the needs of the average man? Would we be lacking in terms of desktop computer usage, with only the powerful geek elite using them, or would we have progressed even more, making computers easy to use even for the average consumer?

I don't want to sound like I'm pro-big-corporation, because they're inherently bad for the common folk. That said, I believe that Microsoft's efforts in building Windows helped democratize computer usage to the level where an average man could pick one up and start doing things with it. Same thing with Apple's original Macintosh: despite the price, it was made with the non-technical user in mind.

I've seen some anti-normal-user sentiment in Linux circles (although it's not that big, mind), but when I hear Richard Stallman speaking about free software and such, I sometimes get the mental image that he's horrified by the fact that NORMAL people WITHOUT a Ph.D. are using computers to build things, consume things, and just talk to other people.

What's your opinion on this?

Best Wishes
(name withheld)
Finland, The Canada of Europe.


My reply

Dear (name withheld),

Get ready for a long letter! :) I assure you, I won't rip into you here; I'll just set out the stuff you seem not to have discovered on your own yet. You sound like a bright person, and it's worth it to inspire a questing mind like yours by pointing you at the things you haven't been told yet.

Your question reflects a state of affairs I rail against constantly: there are facts buried by the corporate media which, had they been more openly aired, would have made this question unnecessary. But you're working with a lot of nuanced, interconnected ideas... you're not entirely wrong, and you're certainly not to blame for the parts you have wrong - as I say, it's the fault of the media for not providing you with better information! I'll break this down into parts.

(1) So basically, your first part is summed up as: "Would there be advanced computing systems without Microsoft and Apple?"

In the first place, Microsoft did not pioneer the desktop GUI. Windows didn't take over the market until the early 1990s, with Windows 3.0 (1990) and 3.1 (1992). Apple had a desktop GUI before them with the Macintosh, going clear back to 1984, and in fact when Microsoft copied Apple to launch Windows, Apple sued Microsoft in a famous "look and feel" litigation case over the Windows interface.

As you can see from that Wikipedia article, Apple also didn't pioneer the GUI... Xerox had it first! And likewise, Xerox sued Apple for copying them!

Now, as a side note, Apple computers contain a Unix-based core. Mac OS X "is a series of Unix-based operating systems and graphical user interfaces developed, marketed, and sold by Apple Inc."

Apple's Mac OS X also makes use of the BSD code base, and there's your open-source involvement already.

(1){a} So now your question is reduced to "Would there be advanced GUI systems without proprietary, corporate-controlled development, period?"

Now, to trace it back to Xerox: the Xerox Alto (1973) and the later Xerox Star (1981) were the pioneering GUI workstations.

But I'll skip ahead to avoid boring your leg off - the man you need to meet is Douglas Engelbart.

Douglas Engelbart

Never heard of him? All you hear about is Steve Jobs and Bill Gates, right? Douglas Engelbart!

Douglas Engelbart developed the first GUI at the Stanford Research Institute - his NLS system, shown off in the famous 1968 "Mother of All Demos" - and Xerox's systems were based on it. Engelbart is the actual pioneer of the mouse, graphics on the screen, hypertext, icons and buttons you could click on... way back in the 1960s! We're a long way from Steve Jobs and Bill Gates now, aren't we? :) And Engelbart was not in any way a corporate hack with a profit motive, but a university researcher running on government money (from ARPA).

(2) Now, your query seems to imply that GUIs "brought the computer to the masses" and that before the Great Mouse Revolution, computers were the exclusive domain of the elite eggheads who could mutter incantations in binary or something. So, let me paraphrase this as "Would the public have been able to use computers before the desktop GUI?"

Well, what you're forgetting is that the consumer home computer revolution launched way back in the 1970s. Hobbyists had already formed the Homebrew Computer Club by 1975.

And that article tells the story better than I can:

"The Homebrew Computer Club was an informal group of electronic enthusiasts and technically-minded hobbyists who gathered to trade parts, circuits, and information pertaining to DIY construction of computing devices. It was started by Gordon French and Fred Moore who met at the Community Computer Center in Menlo Park. They both were interested in maintaining a regular, open forum for people to get together to work on making computers more accessible to everyone."

So right there, we have home-based hobbyists, "open forum", "making computers more accessible to everyone", and so on. The gist of my argument is that it's the "home hackers" who did all the research and groundwork - even the founders of Apple were members of this club, and back then their interest was in computer advocacy, not profit. The very kernel of computers-for-the-common-folk was born on the backs of the earliest open-source geeks, before the term "open source" was even coined. Corporations merely came along after the fact and monetized and commercialized what had been freely traded before.

Furthermore, there's the earliest home computer market. Radio Shack's Tandy TRS-80 was in every mall in America, a floor display at the front of the store, launching in 1977 at a price of about $600 - well within reach of the middle-class family. And there's the Commodore series - the PET in 1977, then the VIC-20 in 1980, at around $300. The VIC-20 was my first computer; I was about 13 years old, our family was dirt-poor, and we could still afford it. And it was even taught in school! There was also the Apple II (1977), the Sinclair ZX Spectrum (1982), and the IBM PCjr (1984).

All this stuff was affordable for - and marketed to - the home user. Here are some computer ads from the 1980s.

And that's nothing compared to the TV commercials, with William Shatner, Bill Cosby, and a Charlie Chaplin impersonator right there next to the breakfast cereal ads during the Saturday morning cartoons. What I'm saying here is: people bought them, used them, loved them, and geeked out on them.

And now for the shocker: NONE of the computers available for the home in the early 1980s had a mouse. NONE of them had a graphical desktop. NONE of them had anything but a command line where you typed commands. And ALL of them ran BASIC, the original programming language for non-technical home users. And where did BASIC come from? Can you guess?

It goes all the way back to the Homebrew Computer Club again. Steve Wozniak was a member, and when club members freely copied Microsoft's Altair BASIC, Bill Gates fired off his famous 1976 "Open Letter to Hobbyists" - and an open-source version of BASIC, Tiny BASIC, was soon being passed around in its place. Here's a great personal memoir from a former member.

Kids played text-based adventure games, where you controlled your adventurer by typing commands. You can see an early example of this at the beginning of the Tom Hanks movie Big (1988). And then there's Creative Computing magazine, published from 1974 to 1985...

"The magazine regularly included BASIC source code for utility programs and games, which users could manually enter into their home computers."

Wait, this is a mind-blower... do I mean that "open source code" was being freely published and shared by home consumers way back in the 1970s/1980s? In a magazine that was sold in every store? Why yes, indeed, I do!
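
Just for flavor, here's what one of those type-in listings amounted to. Below is a minimal sketch - in modern Python rather than period BASIC, and my own rendering rather than a transcription of any actual published listing - of the guess-the-number game that was a staple of those magazine pages:

    # The classic "guess the number" game, a staple of 1970s/80s
    # type-in program listings, sketched here in modern Python.
    import random

    def guess_the_number(low=1, high=100):
        """Pick a secret number; the player guesses until correct."""
        secret = random.randint(low, high)
        tries = 0
        print(f"I'm thinking of a number between {low} and {high}.")
        while True:
            try:
                guess = int(input("Your guess? "))
            except ValueError:
                print("Numbers only, please!")
                continue
            tries += 1
            if guess < secret:
                print("Too low!")
            elif guess > secret:
                print("Too high!")
            else:
                print(f"You got it in {tries} tries!")
                return tries

    if __name__ == "__main__":
        guess_the_number()

A couple dozen lines, no mouse and no GUI required - a kid in 1979 could type the BASIC equivalent in from the magazine, run it, and then start tinkering with it. That's the whole culture in a nutshell.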

Now, going back to the late 1960s and early 1970s, there was really no middle-class home computing, because at that point the concept of a desktop computer was still a fuzzy dream - you had to get time-share on a mainframe system, and about the only way to do that was to be a university student. Computers cost thousands of dollars, and even the best ones came as assembly kits; you practically had to be an electrician just to put them together. But even there, it was hobbyists, not entrepreneurs, who were pushing the computer out to the people. Back then, the idea that software could be patented, copyrighted, sold, and monetized seemed silly.

Whew! Pant, pant. My fingers tire. Time for the next point:

(3) OK, Richard Stallman, "geek elitism", "user friendly", and so on.

Hooooo boy. Here's the deal. Could you do me a favor and forget that headful of preconceived notions for a minute? Clear your mind. Take a deep breath. Now imagine the following universe:

  1. Everything I've told you here is taught to every child in every school in every nation in the world, starting about grade 3.
  2. All schools have "programming" as a mandatory subject, and it's integrated with both math and science.
  3. People grow up thinking that programming is something that NORMAL PEOPLE DO. It isn't any harder than basic math, after all. I'd say writing your first "Hello World" program is no more difficult than solving your first long division problem (see the little sketch after this list).
  4. Words like "geek", "hacker", and "nerd" don't exist any more. Nobody calls you a nerd for knowing how to cook an omelet or change a flat tire on a car, do they? Everybody eats and everybody drives, so cooking and car repair aren't anything out of the ordinary. Well, everybody computes in the 21st century - so why is programming seen as something that only the stereotypical egghead, autistic, punk-rock, anti-social "nerd" or "hacker" can do? Because, as you can see from this history, that attitude wasn't always there.
  5. "User friendly" is no longer a common idea. Instead, users are made "computer friendly"! We have to do it this way because we humans can change and adapt, while computers are stuck being electric current running through logic gates, no matter how much gloss we try to paint over them.

That point there in (4) is the whole impetus for why I've been preaching on my little soap-box for five years on this blog. It's not "programming and computers for the elite geeks, and everybody else - hands off!" Instead, it's "everybody should learn computing and programming so that NO ONE is elite, and there will be no more geeks, just regular, ordinary people who have adapted to a world with computers in it."

But money wants it otherwise. There's money to be made from keeping people ignorant and exploiting them for that ignorance, and that money funds a lot of misinformation. So we have the age of corporate robber-barons who control the data and information and do the equivalent of patenting the alphabet and charging everybody ten dollars to read or write. And all you hear in the corporate-funded media is "Oh, hackers, they're evil! Don't be a hacker! People who know how to program are pathetic, anti-social geeks! Don't be one of them! (Unless you pay a gazillion dollars to get a degree through our school and come work for us - then you can be one of the elite.)"

Does it all make sense now? :)

Addendum: how did it come to this? Well, it's really quite simple (even I forget this sometimes and need to be reminded): the integrated circuit was only invented in 1959. The human race simply hasn't had enough time to get used to the idea of computers yet. If you look back over history, there were similar adjustment periods for the advent of the airplane, the automobile, the steam engine, the telephone, the printing press, electricity, the seagoing cargo ship, and even, going back further, aqueducts and paved roads. Monopolies have grown up alongside each advance in society, going right back to ancient Greece - Aristotle recounts how the philosopher Thales cornered the olive presses of Miletus to create a monopoly. Similar monopolies were attached to the production and export of major traded goods like salt, oil, steel, and diamonds. Each time, they eventually get overthrown.

Thank you for listening, and good luck in your continued learning,
"Penguin" Pete Trbovich

