YOU MUST LEARN

The cliché is that those who don't know their history are doomed to repeat it. There's another cliché, though: the one about history being written by the victors. In this case it's not so much the victors as the advertisers, and historical accuracy is not in their selfish best interests.

I've seen lists like this in computing magazines and the like, and they always have one or two things wrong with them. First of all, they're often inaccurate, rushing too quickly to get to the star names. It's like they're worried their readers will stop reading if they learn something new.

Second of all, they're often confused. They'll start way too early in the timeline, or way too late, like a student who never quite understood their own thesis statement.

This is my timeline for the development of personal computing. Don't let your eyes glaze over, 'cos unless you're a computer science major (or act like one), you may learn some surprising things. Besides, if you're reading this blog, it means you're taking a break from on-line games or porn or your friends' status updates or whatever else you usually look at.
  • The rough sketch for what we now call the personal computer (or tablet, or smartphone, or whatever) was published in July 1945 by someone called Vannevar Bush. He wrote about it in an essay called "As We May Think" in The Atlantic Monthly, and it's still available on-line today. It's as good and accessible a read as anything that magazine publishes now. Bush gives a series of small examples, which, while interesting, leave you thinking, "okay, so....?" until he puts them all together and delivers the knockout punch at the end.
  • That July 1945 issue of The Atlantic Monthly was read by, amongst other people, a man named Douglas Engelbart. Once he finished serving in the Pacific theatre of the Second World War, he went home to the US and started working on creating some of the things Bush presented in his essay.
  • Engelbart invented the mouse in 1963; Bill English carved the first prototype out of a block of wood. The patent papers were filed in 1967, though the patent itself wasn't granted until 1970.
  • 1968: The Mother of All Demos. Engelbart demonstrates using the mouse, display editing, copying & pasting text, hypertext (links), and multiple windows. It even includes live video conferencing: Engelbart, on stage in San Francisco, works with colleagues back at his lab some thirty miles away.
Please pause and re-read that last entry. All of that stuff was working well enough to demonstrate live in 1968.

Oh yeah: in 1969 Engelbart (again!) helped get ARPAnet started (his lab ran one of its very first nodes), and ARPAnet eventually became what we now call the Internet. I don't think it's a big exaggeration to say that he has shaped, to a very large extent, everything the world thinks of as "normal" in a human-computer experience, and yet most people have never heard of him. Luckily he seems to be a force for good.

And that is where I'm going to end my timeline, because from where I'm sitting, everything that comes afterwards is a long, slow, painful crawl from that 1968 demo to commercial acceptance. If you look around Doug Engelbart's site thoroughly, you'll see that his overarching aim has been to augment human intelligence. That we were stuck with the 1968 paradigm for so long (albeit with prettier video interfaces) is a tad worrying.

Where is computing going now? On the one hand I'm glad that innovations like the gesture-based commands in the Wii and Kinect systems made it to market, because I think a thinking environment that encourages us to use all of our bodies instead of being hunched over a desktop is a good thing. On the other hand, it's a tad worrying that these are coming out of the gaming world, which means they might be a hard sell in the business realm. After all, back in the 80s PCs themselves sometimes had to be purchased at large corporations as "word processors" or "adding machines" to avoid refusals from the accounting department.

Notice I made it this far without mentioning Bill Gates or Steve Jobs (or even Steve Wozniak). Notice how young Gates and Jobs were when all this was happening. Bush's essay was published ten years before either of them was born. I don't mean that Gates and Jobs haven't contributed; I just mean that there was already a lot in place by the time they started working on things.

The advertisers tell us that computing is changing very quickly, and that we have to run to keep up. Given that the idea came in 1945, was realised by 1968, and then didn't catch on until the 1980s, I'm not so sure.

want to speed up web access? block Facebook!

Back when I was researching the most efficient way to commit Facebook suicide, I came across an article about how Facebook can pursue you even in the non-Facebook afterlife. I made a note to try it out once my account was good and gone.

The account is officially dead, so I went ahead with the next step and blocked "www.facebook.com" in my router's firewall settings. These settings were originally designed for parents to keep children from porn sites and the like. It felt pretty strange blocking myself from a site that I never use anyhow, but I was curious as to why people were recommending it in the first place.

Adding the domain to my settings took under five minutes. It would have taken under two if I had known where in the router's settings I needed to do the data entry.

After the router restarted and I was back on the web, I headed over to my favourite on-line magazine web site to do some light reading. To my astonishment, the site loaded much faster than it normally does, so quickly that I checked the status bar on my browser for an error message. Nope, nothing. So I scrolled down the home page to see if there had been a noticeable revamp of the layout or something else to explain the speed. Everything looked the same, except for what's shown in the screen shot below, and that was my doing:

I knew from the "www.facebook.com" test I did after the router reset that the block itself is fast — the router wastes no time checking and applying its blocked-URL settings (which is about what you'd expect, but it's nice to see it in action).
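If you want to repeat that sort of check with actual numbers attached, here's a rough sketch of one way to do it, assuming you have Python 3 handy on a machine behind the router (an illustration only; as I admit further down, I didn't sit and measure anything myself). The hostnames are just examples, with www.example.com standing in for any unblocked site:

    # Rough timing of a bare TCP connection attempt to a blocked host versus
    # an unblocked one. Run it from a machine behind the router; exactly how
    # the attempt ends (refused, timed out, or intercepted) depends on how
    # the router implements its block.
    import socket
    import time

    def time_connect(host, port=80, timeout=5.0):
        """Return (seconds elapsed, what happened) for one connection attempt."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                outcome = "connected"
        except OSError as exc:
            outcome = f"failed: {exc}"
        return time.monotonic() - start, outcome

    for host in ("www.example.com", "www.facebook.com"):
        elapsed, outcome = time_connect(host)
        print(f"{host}: {outcome} after {elapsed:.2f}s")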

Traditionally pulling in information and displaying it from disparate URLs was known to slow down page loads, but this was the first time I'd ever really noticed it since switching from dial-up to DSL over ten years ago.
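If you're curious just how much of a given page actually comes from somewhere else, here's another rough sketch (Python 3 again, standard library only; the magazine URL is a made-up stand-in, so swap in whatever site you like). It pulls down a page's HTML and tallies the outside hosts it references. It only sees what's in the raw HTML, so anything a script injects afterwards is missed, but it's enough to make the point:

    # Tally the third-party hosts referenced in one page's raw HTML.
    import re
    import urllib.request
    from collections import Counter
    from urllib.parse import urlparse

    page = "https://www.example-magazine.com/"  # made-up URL: use any site you like
    raw = urllib.request.urlopen(page, timeout=10).read().decode("utf-8", "replace")

    # Grab every absolute URL that appears in a src="..." or href="..." attribute.
    hosts = Counter(
        urlparse(url).hostname
        for url in re.findall(r'(?:src|href)="(https?://[^"]+)"', raw)
    )

    our_host = urlparse(page).hostname
    for host, count in hosts.most_common():
        if host and host != our_host:
            print(f"{count:3d}  {host}")

If facebook.com (or one of its content-serving cousins) turns up near the top of a list like that, those are exactly the requests the router is now refusing before they can drag the rest of the page down with them.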

Out of curiosity I went to a couple of my favourite newspaper sites. Same thing, and for apparently the same reason.

Now, I'm not nearly enough of a propeller-head to do the measurements and attach some numbers to this, but what I thought would be a "set it and forget it" ethical stance against a site that had annoyed me turned out to have some immediate, positive benefits.

It was tempting to see what would happen if I blocked "www.google.com," but I didn't. Why? Mostly because, while I refuse to be anyone's fangirl, Google doesn't bug me nearly as much as Facebook does. But that's another blog post.

an experiment

Yesterday two different people, at two different times, asked me if I knew how many people visited my web site.

I had a response all ready. "The 'web site' is one page," I said. "I haven't even launched it yet. Not until I get the first draft of my novel done. Right now isn't the time to be fiddling with web pages. I need something to talk about first."

Unfortunately for my ego, they both had the same counterpoint all ready to fire back. "But what about your blogs? You've been running those for over two years now."

"Those are just because I like to write about topical stuff," I said.

This is true. I do not want to be one of those whiny people who are always saying, "Oh, well, if you read my blog..." But I write. And writers like to be read. Also, I agree that if I'm putting it out there, maybe I should pay attention to who's glancing at it, never mind actually reading it.

So tonight I went on Google Analytics and got tags for both my blogs and my web site. Judging from my comments rate, I'm not expecting to find a secret legion of Eyrea-visiting web denizens. But currently I don't have hard numbers of any kind.

If you have made it this far, I humbly request 30 seconds and two mouse-clicks from you. You don't have to do anything else. Really.

I need to make sure all of the analytics scripts work. One of them is on this blog. The other two are here:
Please click on the above links. That's it! You don't even have to read anything. If you load the pages, it'll show up in my report as a visitor. I'd do it myself, but I suspect Google filters my own visits out. I'll play around and see.

And no, the reports don't tell you who exactly visited or anything fancy/privacy-invading like that. Just how many people visited, where in the world they were visiting from, and how long they stayed. So if you want a personal thank-you (which I would love to give you), please leave a comment to let me know you helped out with the experiment!

In praise of —

Have you ever been caught complaining that something ought to exist, only to find out that it does exist, exactly the way you want it, and, in fact, has existed for some time in that state? Your emotions do this weird thing where you're both delighted and embarrassed at the same time. That's what my emotions do, anyhow. Most recently that happened with my discovery that it's as easy as anything to use the em-dash on-line for things like this blog. If you already know this, you may well be rolling your eyes and thinking, "Yeah, so?". Hey, remember your first time.

The em-dash is that longish dash that gets used for pauses in mid-sentence. It's heavier than a comma, but (to me, anyway; there's some controversy about it) lighter than parentheses, semi-colons, or colons. It can be used to death — some of the nineteenth-century poets went a little silly with it — but it's also very handy and seems to be more in use of late.

The alternative is to type two or even three hyphens, like so: --. The problem with those on-line is that one hyphen can word wrap while the other just sits there on the previous line. Instead of looking like you intended a longish pause in the word flow, it just looks like you can't type. There are various ways to enter one in a word processor (most word processors will replace a double hyphen with an em-dash anyhow), but for HTML you have to type the entity &mdash;. I guess what kept me from discovering it myself is that it's "m", not "em".
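If you want to convince yourself that &mdash; really is the same character your word processor inserts (a side check only, done here in Python 3 simply because it happens to be handy), a couple of lines will confirm it, and show that the numeric versions point at the same place:

    # Confirm that &mdash; (and its numeric cousins) all decode to U+2014.
    import html

    for entity in ("&mdash;", "&#8212;", "&#x2014;"):
        char = html.unescape(entity)
        print(f"{entity:>9} -> {char!r} (U+{ord(char):04X})")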

Now that I'm writing about it, I'm wondering why they're so difficult to get keyed in while semi-colons are right on the home row. Hmmmmm....