"We can't stop here! This is Penguin Country!"


Why I Am Perfectly Justified In Being A "Grammar Nazi"

Date/Time Permalink: 04/06/13 10:05:50 am
Category: Geek Culture

punctuation list

For many years, when encountering sloppy spelling, grammar, and punctuation online, I resisted the urge to stick up for proper language usage. "Don't be such a nag!" I'd say to myself. "What if English isn't their first language?" I'd reason. "Not everybody can handle dyslexia, ADHD, Asperger's, etc.," I'd excuse. "It's not that big a deal," I'd shrug.

On and on we go, year after year, using the Internet, a medium that relies exclusively upon text, while heedless of the simple methods of expressing ourselves with that text. First, we tried to helpfully guide those who struggled along. Then, when the inevitable, damning label of "Grammar Nazi" was hung around our necks, we backed off, puzzled, but not in the mood to be the world's white knight today. But now, the tide has turned all the way around. The straight-A students are now the "bad guys" and the drop-outs the "good guys."

Now we have... I guess I have to coin it myself... "Laziness Nazis"! People who attack you for using correct language! "Oh, you stuck-up snob! You elitist! Who do you think you're trying to impress, with your showing off? You must really think highly of yourself using a semicolon there!"

Wow!

I even complacently tolerated this. For far too long, I recently realized.

Here's one page to give an example - I assure you, I only limit myself to one example to save space. This example is a perfect snapshot of the spirit of the times. Dangerous Minds proposes a new CAPTCHA system to screen out those who cannot distinguish between similar elementary words like (there / their / they're), (were / we're / where), and (lose / loose). And you can ironically dance through the comments and pluck up the flames like wildflowers a-bloom. The mob reacts: How... how DARE they!

At the center of the "only elitists use correct grammar / spelling" movement is this attitude that "knowledge of language does not equal intelligence". And then I wonder... doesn't it?

That's when I had this epiphany.

Because typing the correct word does not cost anything. Leaving out a superfluous apostrophe doesn't require expending extra effort. Remembering a rule of sentence structure does not place a great burden upon people - there are parrots out there with a vocabulary of thousands of words, there are gorillas who learn sign language to communicate with their handlers in a semantically logical fashion. Even dogs I have owned understood the difference between "Where did your ball go?" and "We're going to the vet." And for Heaven's sake, unlike the parrots, gorillas, and dogs, we have built-in spell-checkers at our disposal.

Using correct grammar, spelling, and punctuation does not cost anything. Using correct grammar, spelling, and punctuation requires only mental effort. A person who cannot be bothered to expend the mental effort required to distinguish between two elementary-school words is also someone who cannot be bothered to expend the mental effort required to reason soundly... use logic... read carefully... think critically... have intellectual curiosity... take pleasure in solving problems... seek things out inspired by pure curiosity. Someone who is careless and sloppy in such a simple matter as typing out a few keystrokes - while using a medium that depends entirely upon typing the correct keystrokes in order to use it effectively - and gets belligerently defensive and insultingly hostile when other people try to help them use this medium more effectively, is someone with a lazy brain that does not like to do work.

And that, my friends, I am so regretful to report, is an idiot.

Email, commenting, and texting are not constitutionally-protected rights, you know.

Now let's hear all the excuses the intellectually lazy use:

What if English isn't your first language? Then you have undertaken to learn a second language - so LEARN it! I was, in fact, once fluent in Spanish, having been born and raised in Southern California (I am no longer fluent because I've lived up north for ten years now, so I'm out of practice). When I was rubbing elbows with Spanish-speakers, I tried to pay attention and get it right. When I made a mistake and was corrected, without fail, I was grateful every single time and thanked them for the lesson. And now that I am too rusty in Spanish to use it effectively, I stay out of Spanish forums. If I undertake Spanish again, I will ensure that I am at least back up to speed when I do so. And if a native speaker corrects my grammar, I express gratitude for the free lesson. In fact, I've never met the ESL (English-as-a-second-language) speaker who was hostile about being corrected - it's always the English-native "Laziness Nazis" who use the old ESL defense.

But is English too hard a language? Yes, I know, English has its little quirks: "Why do we park in a driveway and drive in a parkway?" and all that. That's because English is derived from West Germanic, Latin, Greek, Sanskrit, and French. It is not consistent in its rules because it draws from many languages with different rules, and we even import more words from other languages with impunity.

But so what, do you think English is hard? Oh my goodness, try some other languages. Try Spanish, which has the concept of grammatical gender, so that 'taco' is male but 'quesadilla' is female, and then the rest of the words in the sentence have to agree with the gender of the subject. Try Japanese, which has the concept of honorifics, suffixes attached to the end of any word referring to people, which change depending upon the relationship between speaker and "speakee"; thus your brother George is 'George-san' to you, but your father calls him 'George-kun'. Try Thai, which has the concept of phonemic tones, so that five words which are spelled and pronounced identically have radically different meanings depending on the tone of voice you use; the syllable 'na' on a rising note means 'thick', but on a falling note means 'face'. And try Chinese, which uses 600 pictograms just to get started with a baby-level vocabulary!

What, you feel taxed because you have to keep "they're" and "their" straight? Oh my God, when is the telethon? I want to contribute all I can.

What if you have dyslexia, ADHD, and Asperger's? Yes, I know someone who has all three. Now you do, too. In fact, you're reading something written by him right now! And I chose not to go through life with a brass band in front of me declaring my handicaps like a flag that people have to salute. All I had to do, you see, was care about overcoming my own natural disadvantages, and then I overcame them. Because I am a human being, with a spine and a brain and a heart, and I would rather live standing on my two feet than hide behind a diagnosis, like a coward. Which is why I have never mentioned it before and will never mention it again. For the same reason, I also don't bring up that I was raised in poverty, had to miss a lot of school to support my family, was raised in a broken home, could not afford college, was forced to self-educate, or dozens of other excuses I could use.

What excuse is it now? Did you have a poor education? Well, if you are in such a bad way that you fail kindergarten literacy, then what are you doing online at 2AM arguing about which Marvel superhero deserves to have the film with the biggest budget this summer? You have to hurry to your tutoring appointment - you have no time for idle chit-chat! And by the way, how do you manage to hold down a job, procure a place to live, or even find your way around town? Who gave you a laptop? How did you find this website?

You say you didn't have the time to spell it out the long way when you texted me? Well, what are you, an EMT? What are you doing while you text - driving a fire engine on the way to a four-alarmer? Yes, I know, I have a smartphone too, and my fingers are as fat as Grecian columns and the buttons are as tiny as fleas. I still text in complete sentences. And if I'm too busy to do that, then I'm too busy to text at all.

You say "mistakes happen"? Yes, they do! In an essay of this length with nobody to edit it (or publish it, for that matter) but me, I'm sure that I've made a few mistakes. I will try to catch and correct all that I can. I will save a draft and re-read it. When others catch mistakes, I will acknowledge them and correct the error (and have a sense of humor enough to laugh at the irony, given the subject). The difference between the occasional typo and lazy thinking is painfully, painfully obvious. ERRORS are random; LAZY is a habit. Pro-intelligence people admit to error; anti-intelligence people attack the one who pointed it out.

Oh, wait, maybe you'll catch me on some really arcane Strunk-and-White-type style rule. English, you say, has so many rules, nobody could be expected to memorize them all. Oh, please! We're not talking about dangling participles or split infinitives or ending a sentence on a preposition here - we're talking about the difference between 'there', 'their', and 'they're' - an adverb, a pronoun, and a contraction for a pronoun and a verb - words that most any three-year-old child knows.

You say that I am "exclusionist" for wanting to limit my online company to good writers? Why yes, yes I am. This is the "social" web, after all, and I see no problems with applying the same standards to virtual socializing on the Internet that I deploy, sans controversy, in the real-life socializing that takes place in my living room. So from now on, I will also ask all rude, violent, and irresponsible people to stay off my website as well. In fact, as soon as webcams are standard, I intend to be exclusionist to the point that, just as I would when answering the door, I give prospective visitors the once-over through the peephole and refuse them entry if I don't like the way they look.

And now I have an unpleasant task to do. I really, really like British performer Stephen Fry, have been following his work for quite some time, and have just plain looked up to him in the past.

But now, I have to pound sand in his rat-hole.

I have to do this because of this video, linked to and cited more and more every day as a "take that" to us "Grammar Nazis".

I'm afraid that I don't much care for Stephen Fry anymore.

In this little vlog, he does the very un-British (but very, very American) act of taking to task the "Grammar Nazis" who, for example, protest the sign in the grocery store that says "Ten items or less" when it should (allegedly) say "Ten items or fewer".

OK, Stephen, I'll play your little game, because I have an appetite for straw men today.

In defense of the offending sign-painter, Fry has the audacity to cite Oscar Wilde and Shakespeare (basically, it amounts to "they didn't use perfect English, either"), and then looks down his nose at us "pedants" for not appreciating the rhythm, music, art, spirit, and joy of language as a dynamic, alive, and vibrant thing.

Gee, golly. We're sorry. No, actually, on second thought, we're not the least bit sorry, Stephen Fry! You write like slugs on opium, you bugger, you have no more command of English than a USA beauty-pageant contestant, and you're only a British David Hasselhoff pandering to your real fan-base overseas because you're a loser in your own home country. And to put a fine point on how out-of-touch you are and perhaps always have been, you start off getting it wrong regarding who is calling whom 'elitist'. It's the Laziness Nazis calling the Grammar Nazis 'elitist'; it wouldn't make any sense for it to be the other way around, now bloody would it?

That was the ad hominem appetizer. Now for the main course of reasoned arguments:

One does not have to violate the rules of spelling, grammar, and punctuation in order to produce a beautiful work of language any more than engineers must violate the laws of physics to pull off a perfect rocket launch. When an Oscar Wilde or a William Shakespeare misplaces a comma or nouns a verb, he has a license to do so by virtue of being a historic and world-famous author. Historic and world-famous authors do not get to be historic and world-famous authors by being ignorant of the rules, but by having mastered the rules to the degree that they know when they can bend them to serve the higher purpose of their own, more inspired writing.

Not that sign-painter. No historic and world-famous author, he. And he knew it. When the store hired him to paint a sign for the express check-out lane, he wasn't trying to wring the most succulent morsel of poetry from the dry bone of commercial store display work. No, he was getting paid bottom dollar to crank out the sloppiest still-acceptable wording he could and head for the pub while the paint dried.

And even the simple task to which he was appointed was beyond him. For the correct phrase would not have been "12 items or less" nor even "12 items or fewer", but "12 item limit", a solution which is unambiguous, irrefutably correct, and takes less space and less time to boot.

But that sign painter, anonymous forever, did not think of it. And do you know why? Because he could not be bothered to expend the mental effort. Because he had a lazy brain that does not like to do work. Because he was a "Laziness Nazi", and people like Stephen Fry are making a living appealing to the Populist masses by telling them that their sloth is justified.

And that, my friends, I am so regretful to report, is an idiot.

proof-reading provided by Cutty Sark


Desperate 2AM Case Mod Butchering

Date/Time Permalink: 04/02/13 04:19:30 pm
Category: Humor

And THIS is what happens when you’re SICK and TIRED of driving back to the computer store to exchange yet another part that was almost, but not quite, the right one you needed.

Poor mangled computer

Poor mangled computer

Poor mangled computer

Yes, it runs. Beautifully, in fact. Out of SHEER TERROR.


Linux Mint 14 Nadia Gets Penguin Peted

Date/Time Permalink: 03/26/13 05:43:30 pm
Category: Reviews

Linux Mint logo

Yeah, y'all have seen me distro-hop a lot over the (is it eight now?) years I've run this blog. Red Hat, Slackware, Mandriva, Ubuntu, and just recently, Fedora.

I don't know what happened with Fedora, but it seems to like my laptop, so it can stay there. But my desktops... these are old codger Gateway boxen I bought used, and even that was years ago. Let's just say that, if they were human, they would have been able to get their first periods by now. Fedora hates these boxes, and frankly, I'm not crying any tears over disengaging Fedora from them, either. Fedora runs great as long as you don't CHANGE anything, and then it becomes as durable as an inflatable dartboard.

I'd heard about this Mint, see. This new Mint thing all you hip young kids are grooving on. I finally broke down at a party and tried a puff of Mint and got hooked and next thing you know, I'm burning a DVD full of Linux Mint 14 Nadia, a name that sounds like a vampire from Slavic folk tales. Since I am a vampire from Slavic folk tales, we'd get along great.

First I tried it on my Dell "crash test dummy", an even older box - suffice it to say that it would end up in jail if it tried to have sex with the Gateway boxes. Just spun the disk in there (always use the crappiest computer you have to test a new disk, I always say), and running the DVD live brought up a decently presentable desktop at 1024x768 resolution. Not bad looking, I thought, wondering what ancient crawling horror of a graphics card I had in there... I reached in and then remembered that I had left this brainless box with no graphics card; it was doing this off the motherboard. On a box that originally ran 640x480 when it was new.

Putting it on my intended Gateway desktop reaped equivalent rewards. Its install was the smoothest I've ever had, even smoother than Ubuntu's. When you reboot to a desktop, the login box flashes up with a snappy "ping". Not the blaring trumpet orchestral fanfare of Ubuntu, but a sharp, short cheep to attract your attention to the fact that it's ready to go to work now.

The whole point of Linux Mint is that it's an Ubuntu but with all the background stuff installed. Drivers, codecs, proprietary engines, and whatnot. I think it's great that we still have a choice in not having any non-free software installed by default, but seriously, who are these people who live without Flash, Java, audio and video support, OpenGL, and even most of the pretty fonts, yet still own a computer for some reason? Right, so what most of us end up doing is boldly downloading GPL'ed Linux while brandishing our patriot flag and making loud speeches about how we're not pawns of the capitalista overlords, and then tiptoeing home to our closet in shame to spend the night installing all the proprietary stuff to make it do things anyway.

Hey, I have no problem admitting that while I am an anti-establishment punk at heart, my primary reason for running and loving Linux is that I am a cheap, cheap bastard.

Anyway, Linux Mint is one distro that even Ubuntu users can look down on for coddling. You thought Ubuntu holds your hand, but Mint will hold any part of you you ask it to. I was walking on the beach with Linux Mint 14 Nadia, and looking back, I saw one set of footprints. I asked Nadia about it, and it told me that those were the times when it carried me.

Minutes after install... well, no, I had to run updates, of course, and reboot, of course... but minutes after that, I flipped Firefox open and went around the web and Flash ran out of the box! Wait, I hadn't installed Flash yet... Then I went to see some Java games and dammit, Java ran too! So I finally got out of installing Java (installing Java is still used for torture in third-world dictatorships sanctioned by the UN). I went to Crackle and started watching a movie, and it ran flawlessly! Linux Mint liked my Wacom pad! Linux Mint sang from my speakers! It opened a PDF from my collection for my reading pleasure while updates were finishing! And since I'd gotten the XFCE cut of the distro, I spent zero minutes and zero seconds looking at Gnome 3, which is infinitely better than any time at all looking at Gnome 3. All out of the box!

So, yes, if you're tired of sitting up Googling "Firefox Flash plugin no sound" at Slavic vampire hours of the morning, Linux Mint is for you. Since you don't have to manually hunt down codec libraries this time around, this is one system where your family will thank Linux Mint for the four hours they didn't have to listen to you screaming behind the boarded-up office door.

Now, one caveat: the "Software Manager". There is a program by that name in the menus. Avoid it; it is a trick. Synaptic is in there too - go for Synaptic. "Software Manager" may be good for one-package-at-a-time installs, but that's exactly all it will let you pick: ONE package at a time. I got impatient and started picking more packages before it was done installing the first ones. It bogged. I clicked on. It crashed. I gurgled in terror, running 'top' in a terminal and watching as various zombie package-install processes - now manager-less - lurched here and there in the guts of my system doing God Knows What. I think the reboot was well-timed enough to prevent too much damage.

Also, the "Software Manager" won't sort the install-able packages by any kind of logical scheme such as name or version, but instead by popularity, based on how many votes and YouTube-like comments the package gets. I'm not kidding, there's comments in each page for each package. What, am I supposed to quit using a tool I've depended on for 20 years just because some kid with a Guy Fawkes mask for an avatar tells me it's "for fags"? Mint's "Software Manager" also "helpfully" doesn't TELL you if something is installed or not until you click on it, and also whether the thing you're telling it to install now is part of the requirements for something you're already installing... I know the line of jazz about making things so easy to understand that grandma can handle it, but how about not handing grandma a box of dynamite and a book of matches while you're at it?

Alright, really, that one gripe, and then I'm sticking to Synaptic and apt-gets. Outside of that, Linux Mint is beautiful, beautiful beyond words. Not only is it well-thought-out, stable, and well-behaved, but it looks damned good while it's at it.


Google+ Is Now Starting To Become A Happening Place

Date/Time Permalink: 03/14/13 06:34:18 pm
Category: Reviews

A few months back, I made a little grump about Google+. The post actually got some response from the Google+ team. While some of that wish-list will perhaps always go unfulfilled, several other features I've wanted to see have been implemented. What's more, it's starting to get the thing you expect in a social network: people!

Google+ has been buzzing with increasing activity. I'm in more circles, get more feedback, get more adds, and discover more communities every day. It's starting to get some depth. They've added hangouts, integrated Google-pages with the place, and as for Google Reader (which oh so many people have been grousing about lately because it's being discontinued), they axed it because of lack of use, and most of the functionality of an RSS newsfeed is repeated in Google+'s features. I can hit 'explore' and get a fuzzy idea of current news right there.

It's still got its rough patches though. For instance, Google+ says my profile is "35% complete" but I don't add things like my birthday. I never tell a social network my birthday specifically because I refuse to allow any site to paste an astrology symbol on my profile, which sad experience (ahem, Yahoo) has shown to be the default behavior. One of the rules for happiness in life that I have discovered is to give others the fewest possible opportunities to be stupid.

From little maladjusted quirks like that, you can see how I, not really meant for this world, am hard to please when it comes to social networks. Take it into account.

But when I want to swim in the online social pool, I'll take all the noise and deal with it as well as my neurotic hangups will allow. So I'm glad so many more are joining the party on Google+. Let's hope they handle their growth responsibly!

I don't suffer fools gladly. Come to think of it, I don't suffer geniuses all that gladly, either.


How Do We Get More Men Into Needlepoint?

Date/Time Permalink: 03/04/13 04:29:10 pm
Category: General

I almost couldn't post this, because I've been in such an agonized tizzy, screaming myself to sleep at night as my white-knuckled fingers clench the sweat-stained bedsheets all about THE QUESTION.

THE QUESTION will not go away. It is front-paged on every news website, puked onto the table of every TV pundit debate, screamed from the headlines of every newspaper.

"ZOMG HOW DO WE GET MORE WOMEN INTO PROGRAMMING????????"

And science careers? And tech jobs? And STEM careers? And web development? And math? And database administration? And... and... where the wimmin at, dog?

None of these people realize that every time they ask this question, they drive women further away from programming. If there were a smidgen of interest in actually creating a balanced society, we would just make everything equally accessible and then let women make up their own minds what to do.

You know what question we never see? "How do we get more men into needlepoint?"

And nursing careers? And cake decorating? And ballet? Hey, guys, how would you feel if you had a mass of screaming harpies chasing you around with a camera and microphone demanding to know, right this minute, why you, sir, are not actively learning how to warp and weft a 10x20 penelope canvas? Would you be more open to getting into needlepoint then? Or would you be annoyed at being singled out for your gender? That's how women respond (and I know several myself) to "How do we get more women into programming?"

The answer is trumpeted to be, of course, that you must have this culture around needlepoint that is discouraging you from getting into it. Stigma. As soon as women find out you're a male needlepointer, they yell condescending remarks: "Oh, you petit-point pretty well for a man!" Flip that in reverse for women and programming. See how dumb that is? If women are only staying out of programming because of stigma, then men are likewise kept out of needlepoint. It's a female-dominated hobby, you see, so that would stop you from simply downloading patterns and buying thread at Hobby Lobby, because you'd be afraid of the clerks snickering at you behind their fist.

Not that it does. There are male needlepointers, there are female programmers. Ask either one of them what it's like in their culture, and they'll both respond that yes, the other gender does tend to dominate the industry, and yes, they do run into some sexism, but oh well, they deal with it and hope for a more enlightened tomorrow.

Yeah... but why? Why don't the genders show an even 50/50 split across all occupations and interests? Why are soap operas mainly watched by women; why are comic books mainly read by men? And yes, that last question is beginning to make the rounds; the media, endlessly helpless to distinguish one stereotypical pursuit from another, has leaped from asking why women aren't in STEM careers to asking why there aren't more women "geeks", and from there downhill to why there aren't more female Batman fans.

Oh, wait, they've tried to solve that one. Batman - boy-bits + girl-bits = Batgirl!

But wait, it didn't work. Women still don't read comic books!

Perhaps because they sensed that this "Batgirl" person (and Supergirl, Wonder Woman, etc.) was a shallow attempt at marketing to them by taking the same thing that's popular with boys and painting it pink. There are jillions of female superheroes, and they are all male-superhero spinoffs with a sex change. You may have Batgirl, but it's Batgirl drawn and written and sold by men. Oh no! Here it comes again! "How do we get more women to be comic artists?????"

No, really, guys, listen to this: What if we make needlepoint patterns available that picture monster trucks, football games, Playboy models, and Transformers robots? Will that get you into needlepoint?

Oh, it won't? Oh, you hurt my feelings.

(Sidebar: By the way, yes, there are female comic book and cartoon fans. In the manga and anime section, because in Japan, they have special categories of just manga and anime by females for females about females, and if there are male characters and more than two of them in one plot, you can bet the guys will be in each other's pants by issue #3. And if they aren't, the female slash fiction base will correct that oversight. Manga and anime enjoy a 50/50 gender appeal, because they started out segregated into genres in the first place.)

Why aren't more women into programming?

I have the answer. My answer is so good, so definitive, that I should charge you for it.

The answer is "money."

No, not that the money is bad (although it is bad, but it's perceived as good). It's that every boy who grows up pursuing a tech career has dollar signs in his eyes. Because of Bill Gates, Sergey Brin, Larry Ellison, and Mark Zuckerberg. These guys are trumpeted from the front pages of Forbes, Time, and Business Week. The message is "there's big money in programming!" True, most of the programmers end up slinging Java in some god-forbidden corporate data warehouse for $12/hr. and end up getting laid off in five years when their job gets outsourced, but when they started out, they were aiming for Redmond.

(Another sidebar: Yeah, yeah, I know, there's one guy out there I have to address: You're only motivated to program because you enjoy making computers do nifty stuff. It's the love of the art. And that's why you have a job that has nothing to do with technology, do nothing to promote your success amongst your technology peers, and if somebody offered you fistfuls of cash to do anything computer-related, even consult or deliver a lecture at a conference, you'd push it away and be insulted, amiright? Yeah, you do it for the "love of the game" - you and every NFL player on their way to cash their 8-figure check.)

No, the question we should be asking is "How come there's so many guys in programming?" Because programming, even in 2013, is still seen as an at least stable and potentially lucrative career. And (WARNING: We have reached the part of this essay that will draw screams of "sexism":) guys feel more pressure to make money than gals. Because how many women want to date a broke man?

That's it. There are a lot of men in programming, because programming is seen as a potentially successful career, and men, naturally competitive, want those wads of cash very badly. You want more women in programming? Get rid of the money and the potential for success on a huge scale, men will abandon the field like toddlers fleeing the canned vegetable aisle, and Glamour, Vogue, and Cosmopolitan will start carrying ads: "Learn freelance programming at home! It's the perfect 'mommy-job'!"

Here, I'll prove it:

Presenting a book I've held dear on my shelves for a very long time, waiting for its day in the sun. The book is Introduction to Business Data Processing, by Lawrence S. Orilia, published by McGraw-Hill, copyright 1979. An important date to remember as we browse through it.

Sure, in 2013 you might think that there's a gulf between "data processing" and "programming", but in 1979, there wasn't one. The book is chock full of code in FORTRAN, COBOL, and BASIC. With flowcharts, endless fields of flowcharts, because that's how data processing was done back then. Check that link I made to Google books - while they don't have a scanned copy, note the keywords in "common terms and phrases". Yes, they go into line numbers and loops and statements. The students who learned from this textbook weren't modern data entry clerks. They would become programmers, because advanced software didn't exist at the time and it was expected that you would know coding in order to just use a computer.

Thirty years later, even Tim O'Reilly can't say "everybody who uses a computer should learn a little programming" without a lynch mob battering down his door.

women_programmers0367

What this book is also chock full of is women sitting at computer terminals. All the way through the book. Females operate the computers; men enjoy lofty positions in suits and ties, striding around with clipboards, supervising the women.

women_programmers0369

women_programmers0371

women_programmers0372

You see, computing in this era was still largely seen as office-type work. Before computers it was all typewriters, adding machines, and filing cabinets - work that mostly women did in the position of "secretary" - so computers at the time were deemed to be glorified typewriters, adding machines, and filing cabinets combined; hence, "women's work". Mostly it was the kind of work you wouldn't catch a guy dead doing. Remember, we're not talking the world of 3D-rendered graphics and Internet entrepreneurs; we're talking punched cards and magnetic tape.

That's right, women were the first "hackers"!

women_programmers0374

women_programmers0375

His tie is wide enough to land a plane on, so he must be in charge. No way is he getting his hands dirty with this grubby computer stuff.

women_programmers0368

And yes, these queens of code could actually use a command line! You know, that thing the troll chorus is always screaming "elitist geek" at me for advocating?

women_programmers0370

Just look how confident and happy she is! Little did she know that in 30 years, plugging in your own external storage media correctly on the first try would be beyond 95% of either gender without a frantic call to tech support.

But modern macho men, your final humiliation is yet at hand. So only big, strong, logical men can handle all this computer stuff, ehhhh? Well back then, computer operation as an occupation had to be sold down, not up, so they had to include this charming anecdote:

women_programmers0373

It says on page 252: "A computer has become the means of communication between Lana, a four-year-old chimpanzee, and the rest of the world. Two years ago, she started to use the symbols on a computer keyboard to talk to her keepers."

"You see that, ladies?" said 1979 businessmen desperate for technicians, "These things are so simple, even a monkey can learn to use them, so you have no excuse!" A female monkey.

---

Remember this was 1979. Nobody had made billions of dollars in software yet. In hardware, yes, certainly, IBM, Honeywell, DEC, and whatnot, were very big deals. So the manufacture of computers was a male-dominated field, but just like typewriters, once we make 'em, let the women type on 'em. Also take note that the word "hacker" had yet to enter public parlance. There was no mythologized archetype of the cyberpunk anti-hero.

So if programming computers wasn't glamorous yet, and software tycoons who would become world-class billionaires were years in the future yet, and operating a computer was so easy that a monkey could do it, then what was there to attract men? Nothing. Yet the world needed programmers. So they had to hire women, and lo, there were the women running all the big, hot, dusty machines.

Now, I don't have time to write a whole book here and you don't have time to read it. I've laid out all the tools you need to build your own answers, because you're all bright people. Think about men, women, jobs and hobbies, and which ones are male-dominated, female-dominated, or 50/50. Think about history. Think about motivation, and the options people have in society.

I hope you have all learned something, not just about the history of computing, but about the genders, society, and the way the public perceptions relate to all of them.

But above all, most of all, more than my dear blood, I hope I never have to hear this stupid, stupid, stupid question ever again.

Follow-Up: I misplaced the bookmark when I was writing this, but I found it again. Here's Stanford research about how programming changed from a female to a male profession. Now, sociologically, they cite our old bogeyman, gender bias. Subtle gender bias covered up by personality testing and prerequisite courses. Yes, but saying "we don't have female programmers because personalities are screened for introverts and college records are screened for math credits" does nothing but move the question out to "Where are all the introverted math geek women?" There are introverted math geek women out there. But anyway, it's another point of view worth sharing here.

Update: How many times does YOUR head hit the desk when reading about the latest gender-tech scandal, known as "dongle-gate"?

Update 4/10/13: As I continue to get more responses to this, here's a link somebody found on Fogcreek from 2 years ago: “The Computer Girls” from the April 1967 issue of Cosmopolitan magazine. Once again, somebody else noticed that women used to dominate the computer field, but then answered "why did they leave?" with the knee-jerk "teh sexism!" Always, always, always.


Doomed To Obscurity Enters Its (Planned) Final Year

Date/Time Permalink: 01/01/13 06:48:54 pm
Category: Site News

Four years ago today, I announced that I was launching the webcomic Doomed to Obscurity. While it's had some modest claim to a few links and comments and some cult popularity, it's still pretty much lived up to its title. So by the end of this year, my self-imposed commitment to posting every odd-numbered calendar day for five years will end.

Note that this doesn't necessarily mean that the series will be dead forever. Maybe I'll keep it up if it gets more popular. Maybe I'll only publish ebooks. Maybe I'll only post it sporadically.

But more likely, I'll turn my energy to other things. The series has sapped a lot of my creative time. I did this as an experiment to find out: "Can I create an extensive work of fiction on schedule for an extended period of time?" The answer, apparently, is yes, I can - and even pull a few stunts and have some fun along the way. It's been a great outlet for random snarks, jokes, and goofy thoughts, even though it's sometimes also been a drag and some days appears to be running on fumes and desperation.

But I'm capable of a heck of a lot more. Some other stuff I've drawn in the past, for instance:

practice_human_Gordon_by_Penguin_Pete

Generic_Male_Mesh_by_Penguin_Pete

Girl_in_the_Headlights_by_Penguin_Pete

roulette_wheel_1_by_Penguin_Pete

The_Summoned_by_Penguin_Pete

...so how long should I go on puttering on a newspaper-style strip? I'm curious to see what I could produce in a graphic novel published twice per year rather than ~180 so-so strips per year.

And then there's much more I could be writing, and programming, and designing...

Anyway, enjoy the last "official" year of DTO. Strange things may happen yet. And thanks to you few, loyal fans out there! I'm glad a few of you appreciated my weird little universe.


Read This If You're Bitter And Angry About The World

Date/Time Permalink: 12/31/12 01:41:31 pm
Category: General

Here we are on the last day of 2012. Does it feel as if 2012 was discouraging?

In the US in the last few months, it feels like 2012 had it in for us. Hurricane Sandy, mass psycho shootings, and a government that gleefully accepted our votes, cheered over the supposed victories, and then went right back to being the stone-deaf bullshit factory it's always been.

Elsewhere in the world, 2012 was also grim. Yemen, Afghanistan, Pakistan, Syria, Gaza, Benghazi, Rustenburg, Congo, and South Africa in general have all seen their share of human-made atrocities and tragedies and the kind of civil-rights violations that lead to stories that typically begin with the phrase "Thousands took to the streets..." Greece and Spain have had economic crashes, also human-made. And can we even count the whack-job end-of-the-world predictions? It seems that humans can't go a day without making their own problems worse.

From a STEMmer or geek point of view, it's depressing. It's depressing because everybody around you just seems to get stupider and stupider every day. Make no mistake: In the United States of America, there IS an anti-intellectual movement going full-blast. Science and reason really ARE under attack, and will continue to be under attack for a long time yet. You're not imagining it.

If you're bitter and angry about the world's problems, chances are good that you have what they call "depressive realism". You have too clear a fix on reality and are too aware that the human condition is a giant ball of crap to really be optimistic about the future.


I know that depression.

And I know why it's not that bad.

Listen:


The human race is simply too young yet. The frustration we all feel is with the growing pains of advancing from animal to human.

The latest evidence shows that humans, in their earliest form, evolved around 5 to 7 million years ago (MYA) in eastern and central Africa. "Lucy", the most famous specimen of Australopithecus, lived about 3 MYA. Sometime during the next million years after Lucy, we started to make our earliest stone tools. First evidence of the ability to make fire has so far been pegged to about 1 MYA. So humans went some 6 million years before they finally arrived at the ability to make fire at will - the trademark human advancement. About 350 to 200 thousand years ago (KYA), we saw the rise of the human prototype known as Neanderthals, the first type of human to show signs of social organization and, hence, language. So humans took over 700 thousand years after they invented fire before they even communicated in anything but grunts and barks.

The first Homo sapiens emerged between 200 and 100 KYA, again in East Africa. They demonstrated tool-making, social organization, and migration. What about agriculture? We didn't advance that far until just 12 KYA, with the Neolithic Revolution that happened spontaneously in various spots around the world. This is the first record of mankind willfully tending crops and herding livestock, transitioning from a nomadic hunter-gatherer species to creatures who, for the first time, had a reason to settle down in one place and call it "home". So humans were around 4 million, 988 thousand years before they could finally sustain themselves with a reliable food source and, for the first time, spend a few minutes of their day thinking about something besides how not to starve to death.

Earliest human writing is pegged to about 4000 BCE (6 KYA) in Mesopotamia. So humans lived another 6000 years after the advent of agriculture before the smartest of them figured out how to scratch down some kind of permanent record, and for the first time, gain the mere capability of handing down knowledge through the generations. The first true technology innovation, the wheel, shows up about this time, along with the earliest organized true cities. If you put fire and wheels together, you get cars, and if you logically extend writing, you get computers. So with transportation and information technology, you can see how those two disciplines alone still shape most of our society today. Pretty much all of human ability right now is confined within how fast and efficiently we can move either physical objects (including ourselves) or data, with a side order of how efficiently we can produce power (we still use fire a lot).

What we think of now as wonders of the "ancient" world, things like the pyramids and sphinx in Egypt, the Acropolis of Athens, the Great Wall of China, the founding of the Roman Empire, and so on, were all built within the last 6000 years. Even Stonehenge, just a damned circle of rocks planted with some notion of tracking the seasons, was only built between 3000 and 2000 BCE, just around 4,400 years ago. Rocks, the earliest computer, and it took us 6,990,000 years to dope that out.

Are you starting to understand why grandma can't cope with a tablet computer yet?

Stonehenge was only built during the most recent 1% of human history. Meanwhile, the human brain takes millions of years to have a slight change due to evolution.

As recently as only 500 years ago, average human life expectancy topped 40 years for the first time. Up until around that time, 2/3rds of all children born in northern Europe died before the age of four. So it's only been in the recent two millennia that humans anywhere could expect as good as a 50/50 shot at living to see their own grandchildren. Even today, average life expectancy only runs between 40 and 60 years in most of Africa, and only reaches the peak 77-80 range in First-World countries (North America, Europe, Australia, and Japan). It's only in the past few years that we have started to see a spike in centenarians, humans living to see their 100th birthday, running from estimates of 23,000 centenarians worldwide in 1950 to 209,000 in the year 2000 and 455,000 in 2009. In just the last five centuries, average human lifespan worldwide has reached 67 years, just barely doubling. Think about how much intellectual value a human piles on in their later decades, and then consider that, barring a few historic individuals, it's only been in the last five centuries that we've started to get consistent access to that.

Meanwhile, electrical engineering has only been around for just under two centuries. This marvel of modern science upon which we currently chat, the Internet, was not possible before the computer, which was not possible before the microprocessor, which was not possible before the invention of the integrated circuit, which first appeared in 1949, in a patent filed by Werner Jacobi. That's right, computers, and all of the wonders thereof, are only 63 years old! Only a couple of decades older than our very first visits to a non-Earth sphere.

In the United States, the Nineteenth Amendment to the United States Constitution, mandating the right of women to vote, is only 92 years old. The Thirteenth Amendment, which abolished slavery throughout all of the United States, is only 147 years old. The Civil Rights Act of 1964, finally mandating the concept of equal rights for all humans in the United States regardless of race, color, gender, or religion, is only 48 years old. The United States Federal Department of Education was only founded in 1867, meaning that the US was in business just under a full century before it attained, for the very first time, an organized method of ensuring that its citizens could simply read and write. And the idea that anyone could have the right to even a basic education goes back again to that Civil Rights Act. When was the last time we amended the Civil Rights Act? 1991, when we extended and strengthened the ability of employees to sue their employer for discrimination. Bush, Sr., vetoed the previous version of the act but had to buckle on that one.

Universal health care and gay marriage and pot legalization? It will happen. It's just that, you have to understand, it takes more than a few weeks wearing stupid costumes and waving stupid signs and getting stoned in Zuccotti Park.


Now, then: You are upset because the human race is not progressing faster? Sweetheart, we are just barely out of tails and flinging poo. WE ARE STILL ANIMALS! There is no escaping that we are animals, and we will remain animals for many, many millennia to come. Yes, it is frustrating now, feeling like you were born trapped in a primitive world without the wonderful societies that we dream about for the future.

I'm sorry, but you were born too soon for flying cars and colonizing space.

But at least you weren't one of the billions and billions of people who lived and died back there without ever having seen electronics... or machines... or tools... or reliable sources of food... or writing... or even speech! Pause a moment and bow your head for the Australopithecus Einsteins, who had to be content with chewing grubs out of bark and hooting at each other, maddened with the idea that they should be able to live more comfortably if only they could teach the others the value of sharpening a stick to dig out more food. Millions of them lived and died, lost in time back there, too discouraged to even write in the mud with a finger, because who the hell would come along who was smart enough to read it?

Do not be so vain, young and smart people, as to be discouraged for the human race because you could not fix it in one month. Quit feeling so sorry for yourself, and devote your life to aiding as much of human progress as you can. Live for the future, when more advanced humans will be able to appreciate what you were living with now.


Attention Reddit (and everybody who cares about it): A History Lesson

Date/Time Permalink: 12/28/12 10:02:14 am
Category: Geek Culture

It seems I can't avoid Reddit.com even if I want to these days. It gets quoted on every other news feed I check. Reddit, formerly the new Usenet, is now the new Internet Oracle, whose every babbling is posted alongside Twitter tweets and Facebook comments as one more "citizen in the street" comment. It streams by in CNN tickers and blares from the pages of my hometown newspaper. Obama did an interview there.

Which is starting to make it a teeny bit concerned with its own self-image. Because, like countless other social news and bookmarking sites out there, it's turned into a vat of boiling crap.

Prerequisite reading: "Why Social Bookmarking Will Always Suck".

That old post wasn't just meant as idle ranting. There are sociological forces at work here which so far go unexamined in academia. Social, psychological, and political scientists could be reaping gobs of research from online forum sites, but apparently they're not important enough to justify the research grants. So, if any of us care at all about the atmosphere in which most of us spend the majority of our time, we will care at least once in a while about the quality of the time spent there.

The problem on Reddit is the quality: They're drowning in crap.

I've seen it before: Usenet, Yahoo groups, Slashdot, Delphi forums, About, 4chan, Ebaumsworld, Digg, Something-Awful, Tumblr, Twitter, MySpace, ten million PHPBB boards, and even BBS discussion sites of the '80s of yore.

The one thing that sets Reddit apart from countless discussion forums is that it is the most conservative online community ever! It is insulated and xenophobic - Redditors consider themselves thousands of miles higher than every other website, infinite degrees morally superior to every other human being on Earth, just for having gotten a free, unscreened nick on the site. In their opinion, if it didn't happen on Reddit, it didn't happen. To Redditors, history began with the first Reddit post. Reddit doesn't just repost tired meme jokes, it recites them in unison in high Latin during midnight mass. They're as tight as Freemasons; you can't even get in the door there without the secret handshakes and lodge signs. No Reddit account can do any wrong; even spammers and astroturfers and child pornographers and scam artists are lovingly embraced and tenaciously defended, once they show that they have a Reddit account. It is the Manson cult of social media, while thinking it's Haight-Ashbury. It is impossible to get banned from the site, something even 4chan can't claim. Although the site has a reputation for being a traffic driver, akin to Digg and Slashdot, it actually drives the least traffic per post of any social linking site - that's because the vast majority of Reddit links link to other pages on Reddit itself. Beyond that, a huge chunk of Reddit posts are merely comparisons of itself with other sites, trying to squeeze that crucial last millimeter out of the ruler. Bellybutton lint: It's what's for dinner.

In previous forums, companies needed to create a walled garden to keep a community this insulated; Reddit is the first community to do this to itself. And of course, with such an airtight, hermetically-sealed, monocultural, inbred web society, it's a little hard for news from the Outside World to get in there.

So let your friendly neighborhood Redditor know "Why Social Bookmarking Will Always Suck".

But tell them outside of Reddit! The site has immunity built up to any opinion not branded with the barcode of the Hivemind; Reddit is as deaf to differing opinions as Glenn Beck. But it has millions of users; these same users are bound to sleep, eat, and go to work sometime. Some of them, miraculously enough, even have significant others.

Answer for them this conundrum, and then they can stop screaming "Why are we drowning in crap?" at each other every day. Because the rest of us would like to browse "the Mainstream Media" in peace again.

Penguin Pete's neon sig


Gnome3: User-Friendly Is Not Equal To User-Insult

Date/Time Permalink: 12/27/12 10:32:48 am
Category: Reviews

Like everybody in the Linux community, I have at last been dragged kicking and screaming onto Gnome 3. We had no choice; everything on our Linux desktops has been slowly failing from being so badly aged. My old Fedora release experience has so far been rescued by the graces of "fallback mode" on the laptop, while the desktops were still running old Ubuntus. So until now, I had dodged Gnome 3 entirely.

At the same time, Gnome now has the entire Linux desktop world at gunpoint: The majority of software that runs on Linux requires Gnome and GTK. I've tried running everything on alternatives - Gnome has a desktop lock-in going on right now that is worse than anything imagined by Apple or Microsoft in their kinkiest dreams. Do without Gnome, and your printers will break, your Bluetooth will refuse to connect, none of the weather applets will talk to your desktop, your videos will freeze, and taxi cabs will suddenly pass you by in the snow without stopping for you.

So I finally grabbed a copy of Fedora 17, figuring, well, all of the crabbing I've been hearing about Gnome3 must have settled somebody's hash by now. Maybe the claims of UI horror have been greatly exaggerated. They've had months to fix it - how bad could it be?

Imagine being arrested and sentenced for life, in leg irons and chains, to preschool. No more adult for you, you have to sit in those tiny wobbly plastic chairs and drink Hawaiian Punch and play Hi-Ho Cherry-O and watch Barney for the rest of your life.

That bad.

I am not being funny.

And I see that the outraged screams of all of you (INCLUDING Linus Torvalds!) simply are not enough to penetrate the sawdust skull of William Jon McCann, Gnome "designer" who, in this interview from his Stalin-like reality-distortion-bubble, responded to the criticism of Gnome3 thus:

"Even many of the same people who are now claiming that GNOME2 was such a great thing for them were some of the most vocal opponents of the things we did in GNOME2."

Yeah, McCann, let me see if I can explain to you why this is. Pull up a wobbly plastic chair:

You had a house. That was "1". People lived in it. Then you broke all the lights and removed the doors and called it "2". And everybody complained about that. So then you also removed the roof and the furniture, and populated it with rabid pitbulls who attack the tenants, and that's "3". So now people complain even louder, so now you go "Well then, I guess 2 wasn't so bad after all, now was it?"

I'm beginning to question whether somebody who reasons this way has the best interests of users in mind.

In any case, does everybody finally see what happens when you ignore established or competent users in favor of Joe Sixpack? I was complaining about the preschool-ization of the Linux desktop five years ago at least (example). I got yelled down for it. I gave up. Nobody wanted to hear it from me. Apparently, nobody even got it when I framed it in a Joe Sixpack parody (strips #727-#730).

Now I see, what the hey, Tom's Hardware arriving at this conclusion when writing "Gnome 3: Why It Failed":

"So, when the power users are leaving, GNOME doesn't really seem to care. After all, GNOME 3 isn't designed for them. But what the GNOME Project leaders don't seem to understand is that new Linux users are like vampires, or werewolves, or zombies. Stick with me here.

New Linux users don't just spontaneously pop into existence, they have to be 'bitten' by someone who is already involved. Average Joe, who needs to use his computer and doesn't care how it works, doesn't wake up one day and, out of the clear blue sky exclaim, 'You know what? I think I'm gonna screw around with Linux today.' New users are typically converted by a friend or family member who gets them set up and interested.

By gutting GNOME of every power user-oriented feature (a functional desktop, virtual desktops, on-screen task management, applets, hibernation, and so on) it's losing that intermediate-to-advanced crowd that's responsible for bringing users on-board. The power user demographic isn't going to recommend and support GNOME 3-based systems if they've already jumped ship."

That was Adam Overa, and his words, above, ought to be emblazoned on marble slabs at every offramp leading into Silicon Valley. You could substitute the name of any technology product and be just as right. This is the single most important fact about technology adoption that nobody, since the Commodore VIC-20, seems to have understood.

Can we get this into the Jargon File or something, please? "Overa's User Base Law", summarized as: "New customers are brought in by old customers. If you shoot at your old customers, you won't get new customers, and you won't have any old customers, either."

Both Gnome 3 and Ubuntu's Unity (not a hair better) suffer from the phone fad: trying to make all devices behave like a phone. Yeah, but if we want a phone, we know where to get one. When you have a desktop, you have it because of reasons, reasons being that you need all that extra power, usually for productivity.

Anyway, we're installing XFCE and using that instead. When you have Gnome and all of the GTK support, you can switch to XFCE and pretend Gnome doesn't exist and almost not know the difference. So that looks like the state of the Linux desktop currently.
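
For anyone wanting to make the same escape on a Fedora box, the move is roughly this - a sketch, not gospel, since the exact group name has shifted between releases, so trust what yum reports over what I remember:

    # See what desktop groups this release actually offers.
    yum grouplist | grep -i xfce

    # Pull in the Xfce group - adjust the name to whatever grouplist showed.
    sudo yum groupinstall "Xfce Desktop"

    # Then log out and pick the Xfce session from the login screen's session menu.

The grouplist check is the important part; names like "Xfce Desktop" versus plain "Xfce" have bounced around between Fedora releases, which is why I wouldn't type the install line from memory.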

Penguin Pete's holidays

Update: The Register also has scathing things to say about the Gnome hijack of the Linux desktop.

