Ten years on, can we stop worrying now?

Ten years ago this month the Sunday Times published an article by Douglas Adams called “How to Stop Worrying and Learn to Love the Internet”. You can read it here.

Some starting observations:

  1. It’s a tragedy that Adams died, aged 49, in 2001, depriving us of more great literature in the vein of the Hitchhiker’s Guide, of genuinely innovative new media projects such as H2G2, and of the witty, insightful commentary we find in the Sunday Times column.
  2. Adams’ insights have stood the test of time. Everything he wrote at the end of the Nineties stands true as we near the start of the Tens.
  3. We still haven’t stopped worrying.

Adams from 1999:

… there’s the peculiar way in which certain BBC presenters and journalists (yes, Humphrys Snr., I’m looking at you) pronounce internet addresses. It goes ‘wwwDOT … bbc DOT… co DOT… uk SLASH… today SLASH…’ etc., and carries the implication that they have no idea what any of this new-fangled stuff is about, but that you lot out there will probably know what it means.

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on…

2009: John Humphrys is still huffing and puffing [Update 3/9/09 - further proof provided!], and…

you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

The moral panic continues, now transferred to social networking and camera phones.

And Douglas Adams hit the nail on the head when he took the term “interactive” to task:

the reason we suddenly need such a word is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

The same fallacy persists, now transferred from the term “interactive” to “social”.

Ten years ago, Douglas Adams identified a few problems.

  • “Only a minute proportion of the world’s population is so far connected” – this one’s well on the way to being fixed, as much by the spread of internet-capable mobile devices as by desktop or laptop PCs.
  • It was still “technology,” defined as “‘stuff that doesn’t work yet.’ We no longer think of chairs as technology, we just think of them as chairs.” – has the internet in 2009 reached the same level of everyday acceptance as chairs? Almost, I think, though the legs still fall off with disappointing regularity.

The biggest problem, wrote Adams, is that “we are still the first generation of users, and for all that we may have invented the net, we still don’t really get it”. Invoking Steven Pinker’s The Language Instinct (read this too, if you haven’t already), he argued that it would take the next generation of children born into the world of the web to become really fluent. And for me that’s been the most amazing part. Reflecting the other day on Tom Armitage’s augmented reality post to the Schulze and Webb blog, I realised that I see that development in my own children’s engagement with technology.

  • At birth a child may assume that anything is possible: a handheld projector holds no special amazement for my three-year-old.
  • Through childhood we are trained, with toys among other things, to limit our expectations about how objects should behave. My six-year-old, who has been trained by the Wii, waves other remote controls about in a vain attempt to use gestures.
  • My nine-year-old, more worldly-wise, mocks him for it.

We arrive in the world Internet-enabled and AR-ready; it’s just that present-day technology beats it out of us. I work for the day when this is no longer the case.

Last words to Douglas Adams, as true today as in 1999:

Interactivity. Many-to-many communications. Pervasive networking. These are cumbersome new terms for elements in our lives so fundamental that, before we lost them, we didn’t even know to have names for them.

Update 3/9/09: Debate about Twitter on the Today programme, and Kevin Anderson takes up the theme.

Television may be the gin of the information age, but that doesn’t mean the web is pure water

[Image via Flickr: in the future we will all wear shiny suits and watch bright red televisions]

The new media revolutionary in me so much wants to believe Clay Shirky’s “Here Comes Everybody” hypothesis: that the web heralds a new era of mass participation, collaboration and creativity. With our mobile phones and broadband connections we remade society, so that my five-year-old son cannot conceive of a world without the web (“Daddy, if people didn’t have computers, how did they buy things from the Internet?” he once asked). We are the generation that Changed Everything. How cool is that?

But then my inner history graduate rebels. I’m innately suspicious of anyone who says human behaviour has changed fundamentally. The joy of history is in its humanity, in all the stories that show how our ancestors were ordinary people who laughed, loved, tricked and schemed just like we do today. If Baby Boomers claim they invented sex, just refer them to Roman pottery and the satirical cartoons of the 18th Century.

And so I believe in our bright human future: that so long as people survive, they will behave much like their stone age forebears. The context may be different, but people are people across time and space. And that’s a Good Thing.

So I’m deeply conflicted when, on the blog accompanying his book, Shirky launches a puritanical attack on television as a sink that dissipates our thoughts, and compares it to the socially sedative role of gin in our early industrial revolution cities. The theory goes that:

The transformation from rural to urban life was so sudden, and so wrenching, that the only thing society could do to manage was to drink itself into a stupor for a generation. The stories from that era are amazing– there were gin pushcarts working their way through the streets of London.

And it wasn’t until society woke up from that collective bender that we actually started to get the institutional structures that we associate with the industrial revolution today. Things like public libraries and museums, increasingly broad education for children, elected leaders–a lot of things we like–didn’t happen until having all of those people together stopped seeming like a crisis and started seeming like an asset.

It wasn’t until people started thinking of this as a vast civic surplus, one they could design for rather than just dissipate, that we started to get what we think of now as an industrial society

Television, the dominant mass medium of the second half of the 20th Century, is our modern-day equivalent of gin. But despair not, for Shirky has us all roused from our stupor by the Internet in all its chaotic glory, millions of Wikipedia edits, captioned cat photos and all.

The fact that Internet users watch less TV has been a commonplace for some time, so Shirky builds on this to show that if everyone watches just a little less TV and participates a little more online, whole new sources of value will be unlocked from our newly productive endeavours. We The Web Users can be morally superior to the Telly Addicts of the past: they consumed, we create.
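The scale of that claim is easier to feel as arithmetic. The figures below are the ballpark numbers from Shirky’s “gin” talk as I remember them, so treat this as an illustrative sketch rather than gospel:

```python
# Rough cognitive-surplus arithmetic in the spirit of Shirky's gin talk.
# Both figures are ballpark numbers quoted from memory; illustrative only.
US_TV_HOURS_PER_YEAR = 200e9   # ~200 billion hours of TV watched in the US annually
WIKIPEDIA_TOTAL_HOURS = 100e6  # ~100 million hours of effort in all of Wikipedia

wikipedias_watched = US_TV_HOURS_PER_YEAR / WIKIPEDIA_TOTAL_HOURS
diverted_hours = 0.01 * US_TV_HOURS_PER_YEAR  # everyone watches just 1% less TV

print(f"US TV watching each year = {wikipedias_watched:,.0f} Wikipedias")
print(f"Divert 1% of it: {diverted_hours / WIKIPEDIA_TOTAL_HOURS:,.0f} Wikipedia-sized projects a year")
```

Even if those numbers are out by a factor of two, the shape of the argument survives: the surplus is enormous, and tiny diversions of it are huge in absolute terms.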

It’s a great analogy, but I’m suspicious of the conclusion. Why? Because TV watching is not the only thing being edged out to make way for all those hours online. Not only do we watch less TV, we also sleep less and spend less time interacting with our families.

I started to list some things I do less as a result of having the internet:

  • watch tv
  • talk about tv
  • buy magazines
  • phone people up
  • write letters
  • go to the shops
  • go to the library
  • queue to pay bills
  • look out of the window on trains
  • sleep

Now a couple of these things – watching TV, buying magazines – do seem like gin: one-way attention sinks (though, as fundamentally social beings, it’s never long before two or more people assembled before a television set are debating and discussing the content, hurling abuse at the screen or fighting over the remote control).

But what about the others?

I now communicate less by phone and letter, and more by email or text. Where’s the cost in that? Well, I reckon it’s in the nuances, the tone of voice, the side-tracked conversations, the pictures scribbled in the margins, that just don’t happen so much online. So I’ve traded some inconvenient but rich communications media for handier, cheaper, but less subtle ones.

I shop online for stuff so I don’t have to go to the shops, and I Google for information so I don’t have to go to the library. So there goes a whole load of opportunities for collaboration – chance meetings with friends, taking my cue subconsciously from what other shoppers are looking at, and so on.

Then there’s the contemplation time. I used to stand in queues, look out of the window, ignore the TV and let my mind wander. Greater efficiency in transactions and communications is squeezing out those times, and I wonder if the quality of my communications is suffering just as their quantity increases. And that’s before the sleep deprivation kicks in, tiredness and drunkenness sharing many symptoms.

So maybe TV was the gin of the information age, but the internet has a way to go before it’s the clean drinking water that will unleash our productivity. Exchanges on online social networks are so far a pale shadow of the sophisticated interactions that happen when people get together in the real world. And whatever the medium, tomorrow’s people are highly likely to remain much like the people we know today: at once creative and lazy, generous and greedy. If attention is a finite resource, so surely is virtue.

The irony that I’m saying this on a blog is not lost on me. And no, I’m not about to retreat to my log cabin with a manual typewriter, but I do believe there are a few things we need to work on. To do that, we need to understand the good and bad stuff we’re leaving behind, as much as the huge potential of the new technology we embrace.

Disclosure: I write this post having made it up to page 99 of Here Comes Everybody. It’s a great, thought-provoking book and I fully expect to revise my opinion by the time I reach the end. Please consider this a review in perpetual beta :)

Thomas A. Watson Ate My Internet

“But daddy, if people didn’t have computers, how did they buy things from the internet?”

It’s amazing how something we’ve come to take for granted hangs from such a fragile thread.

As part of a new product trial for my employer, we recently had a visit from two very helpful telecoms engineers who checked out our broadband connection.

Living where we do within spitting distance of our local phone exchange, our broadband should have been blazing, but it turned out all those bits and bytes were struggling to be heard over the noise on the line. The engineers (who’d already proved they were a class act by taking off their boots at the door, without being asked) ran some checks, showed me some impressive looking waveforms and diagnosed a collision between the 19th and 21st centuries.

[Image: Thomas A. Watson]

Back in the days when Crazy Frog was nothing but a proud Native American chief defending the plains of the Wild West (probably), Thomas Augustus Watson – of “Mr Watson, come here, I want to see you” fame – had the bright idea of adding a bell to alert recipients to incoming calls. A bell. An actual bell. Not a Truetone, not even a Polyphonic. Just an actual, real, ringing bell. To make the bell ring, a pair of wires ran in parallel along the cable that carried the talking. When a call was coming in, power would surge down the lines and make the bell ring. Ingenious!

Fast forward about 120 years and even our cordless DECT phone has a choice of ten tinny tunes. If we could be bothered we could set the phone to play a different tinny tune depending on the caller. The bell wires in my home are pretty much redundant, but they’re still there, just in case I decide to plug in a phone with an actual, real, ringing bell.

And therein lay the problem, according to the engineer standing in my living room in his socks. Our phone had an old extension cable running upstairs. The two ringing wires from that extension were funnelling radio noise back into our phone system and drowning out the internet. Two minutes and a small screwdriver later, the old extension cable had been disconnected and we were two megabits per second better off. Sorted.
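If you want a feel for why a bit of reclaimed quiet is worth whole megabits, here’s a minimal back-of-envelope sketch. The before-and-after noise figures are invented for illustration, not taken from the engineers’ readings; the only real ingredient is Shannon’s limit, which says a line’s capacity grows with the log of its signal-to-noise ratio:

```python
import math

# Back-of-envelope Shannon-limit sketch. The SNR figures below are
# invented for illustration; only the formula C = B * log2(1 + SNR)
# is doing real work here.

def attainable_rate(bandwidth_hz: float, snr_db: float) -> float:
    """Theoretical ceiling on bitrate (bits/s) for a channel with the
    given bandwidth and signal-to-noise ratio."""
    snr_linear = 10 ** (snr_db / 10)  # convert decibels to a plain ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

ADSL_BANDWIDTH_HZ = 1.1e6  # ADSL signals occupy spectrum up to roughly 1.1 MHz

for label, snr_db in [("ring wire connected", 20.0),  # hypothetical noisy line
                      ("ring wire removed", 26.0)]:   # hypothetical quiet line
    rate = attainable_rate(ADSL_BANDWIDTH_HZ, snr_db)
    print(f"{label}: ~{rate / 1e6:.1f} Mbit/s ceiling")
```

On those made-up numbers, six decibels of reclaimed quiet buys a shade over two megabits per second of headroom, which is about what the screwdriver bought us.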

Now attenuation is all that stands between me and broadband nirvana. Apparently.