YOU MUST LEARN / by Katherine Hajer

The cliché is that those who don't know their history are doomed to repeat it. There's another cliché, though: the one about history being written by the victors. In this case it's not so much the victors as the advertisers, and historical accuracy is not in their selfish best interests.

I've seen lists like this one in computing magazines and the like, and they always have one or two things wrong with them. First of all, they're often inaccurate, and rush too quickly to get to the star names. It's like they're worried their readers will stop reading if they learn something new.

Second of all, they're often confused. They'll start way too early in the timeline, or way too late, like a student who didn't quite understand their own thesis statement.

This is my timeline for the development of personal computing. Don't let your eyes glaze over, 'cos unless you're a computer science major (or act like one), you may learn some surprising things. Besides, if you're reading this blog, it means you're taking a break from on-line games or porn or your friends' status updates or whatever else you usually look at.
  • The rough sketch for what we now call the personal computer (or tablet, or smartphone, or whatever) was published in July 1945 by someone called Vannevar Bush. He wrote about it in an essay called "As We May Think" in The Atlantic Monthly, and it's still available on-line today. It's as good and accessible a read as anything that magazine publishes now. Bush gives a series of small examples, which, while interesting, leave you thinking, "okay, so....?" until he puts them all together and delivers the knockout punch at the end.
  • That July 1945 issue of The Atlantic Monthly was read by, amongst other people, a man named Douglas Engelbart. Once he finished serving in the Pacific theatre of the Second World War, he went home to the US and started working on creating some of the things Bush presented in his essay.
  • Engelbart invented the mouse in 1963, and Bill English carved the first prototype out of a block of wood. The patent was filed in 1967 and granted in 1970.
  • 1968: The Mother of All Demos. Engelbart demonstrates the mouse, on-screen display editing, copying & pasting text, hypertext (links), and multiple windows. The demo even includes live video conferencing: Engelbart is on stage in San Francisco while his team appears on screen from SRI, about thirty miles away.
Please pause and re-read that last entry. All of that stuff was working well enough to demonstrate live in 1968.

Oh yeah: in 1969 Engelbart (again!) helped start ARPAnet, which eventually became what we now call the Internet. I don't think it's a big exaggeration to say that he shaped, to a very large extent, everything the world thinks of as "normal" in human-computer interaction, and yet most people have never heard of him. Luckily he seems to be a force for good.

And that is where I'm going to end my timeline, because from where I'm sitting, everything that comes afterwards is a long, slow, painful crawl from that 1968 demo to commercial acceptance. If you look around Doug Engelbart's site thoroughly, you'll see that his overarching aim has been to augment human intelligence. That we were stuck with the 1968 paradigm for so long (albeit with prettier visuals) is a tad worrying.

Where is computing going now? On the one hand, I'm glad that innovations like the gesture-based controls of the Wii and Kinect systems have made it to market, because a thinking environment that encourages us to use our whole bodies, instead of keeping us hunched over a desktop, is a good thing. On the other hand, it worries me a little that these are coming out of the gaming world, which means they might be a hard sell in the business realm. After all, back in the 80s, PCs themselves sometimes had to be purchased at large corporations as "word processors" or "adding machines" to avoid refusals from the accounting department.

Notice I made it this far without mentioning Bill Gates or Steve Jobs (or even Steve Wozniak). Notice how young Gates and Jobs were when all this was happening. Bush's essay was published ten years before either of them was born. I don't mean that Gates and Jobs haven't contributed; I just mean that there was already a lot in place by the time they started working on things.

The advertisers tell us that computing is changing very quickly, and that we have to run to keep up. Given that the idea came in 1945, was realised by 1968, and then didn't catch on until the 1980s, I'm not so sure.