On the way to dConstruct: a social constructionist thought for the day

A desire to put some theoretical acro props under my vague unease with the determinist narrative of so much of our technology discourse has led me to the writing of the French anthropologist Bruno Latour. His work on the social construction of science, an ethnography of the R&D lab, has a special resonance for me, a humanities graduate who finds himself colleague to a legion of French engineers.

I’m stumbling intermittently through Catherine Porter’s translation of Latour’s 1991 work “We have never been modern”, as a prelude to David Edgerton’s “The Shock of the Old”. At times it feels a bit like eating up the broccoli before allowing myself dessert, but rich, buttery morsels like the following make it all worthwhile.

The story so far.

Latour argues that modernity, from Civil War England onwards, managed its contradictions by placing boundaries between nature and society. Thomas Hobbes, author of Leviathan, was taken up as a founder of political philosophy, while Robert Boyle, he of the air pumps, was channelled as a natural philosopher and pioneer of scientific method. In truth each man speculated on both politics and science, but this inconsistency was whitewashed by their modern successors, seeking only the pure narrative of one or the other.

And so we are today in a world still riven by CP Snow’s two cultures, where right-wing bloggers can grab acres of media coverage against climate scientists by finding just the tiniest trace of political “contamination” on the lab’s email servers.

But I wonder if the disconnection and reconnection of nature and society is also a useful way to understand some of the ideas I’m expecting to hear today at dConstruct, a conference at the cutting edge of technology and media convergence.

The 19 years since Latour published “Nous n’avons jamais été moderne” roughly span my working life so far. I’ve witnessed the amazing things that can happen when you expose the humanities-soaked world of newspapers, books and TV to the attentions of software engineers and computer scientists. The results have been delightful and depressing, often both at the same time. Who knew back then that floaty copywriters would have to cohabit – for better or for worse – with the number-crunchers of search engine optimisation?

This fusing of the worlds of media and technology is only just beginning, and the next step is evident in the hand-held, touch-sensitive, context-aware marvel of creation that is the latest smartphone.

Hitherto we have seen the world of human-created information, the texts of the ancients and the tussles of our own times, through the pure window of the newspaper, the book, the TV, the PC screen. But the smartphone is a game-changer, like Robert Boyle’s air pump. With its bundle of sensors – location, proximity, and in future no doubt heat, light, pressure and humidity – it becomes a mini-lab through which we measure our world as we interact with it.
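To make the mini-lab idea concrete, here’s a minimal sketch of reading two of those sensors from a web page – assuming a browser that implements the W3C Geolocation API and the Generic Sensor API’s AmbientLightSensor (both are real specifications, though the latter is still patchily supported, and both post-date this post):

```typescript
// A minimal sketch, not a production pattern: sample two of the phone's
// sensors from the browser. AmbientLightSensor belongs to the W3C Generic
// Sensor API and is not yet in the standard TypeScript DOM typings, so we
// declare it here ourselves.
declare class AmbientLightSensor extends EventTarget {
  illuminance: number;
  start(): void;
}

// A fact of nature: where in the world this observation was made.
navigator.geolocation.getCurrentPosition((position) => {
  const { latitude, longitude } = position.coords;
  console.log(`Observed at ${latitude.toFixed(5)}, ${longitude.toFixed(5)}`);
});

// Another: how much light is falling on the device as we carry it about.
const light = new AmbientLightSensor();
light.addEventListener("reading", () => {
  console.log(`Illuminance: ${light.illuminance} lux`);
});
light.start();
```

Each reading is exactly the kind of “inscription” Latour describes: an instrument scribbling down a fact of nature that we can then mix with the artifacts of society.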

All manner of things could be possible once these facts of nature start to mix with the artifacts of society. My Foursquare check-ins form a pattern of places created by me, joined with those of my friends to co-create something bigger and more valuable. My view of reality through the camera of the phone can be augmented with information. We will all be the scientists, as well as the political commentators, of our own lives. This is the role of naturalism in my “Mobile Gothic” meander.

To recycle Latour on Robert Boyle’s account of his air pump experiments:

“Here in Boyle’s text we witness the intervention of a new actor recognised by the new [modern] Constitution: inert bodies, incapable of will and bias but capable of showing, signing, writing and scribbling on laboratory instruments before trustworthy witnesses. These nonhumans, lacking souls but endowed with meaning, are even more reliable than ordinary mortals, to whom will is attributed but who lack the capacity to indicate phenomena in a reliable way. According to the Constitution, in case of doubt, humans are better off appealing to nonhumans. Endowed with their new semiotic powers, the latter contribute to a new form of text, the experimental science article, a hybrid of the age-old style of biblical exegesis – which had previously been applied only to the Scriptures and classical texts – and the new instrument that produces new inscriptions. From this point on, witnesses will pursue their discussions in its enclosed space, discussions about the meaningful behaviour of nonhumans. The old hermeneutics will persist, but it will add to its parchments the shaky signature of scientific instruments.”

I don’t yet know where I stand in this picture. Am I the experimenter, his audience, or the chick in the jar?

An Experiment on a Bird in an Air Pump by Joseph Wright of Derby, 1768

When too much perspective can be a bad thing

An article by my former colleague and TEDx Leeds speaker Norman Lewis reminds me of an ingenious device imagined by Douglas Adams in the Restaurant at the End of the Universe. Yes, I know you all like a good Douglas Adams quote.

First, though, listen to Norman, writing about ‘Millennials’ and Enterprise 2.0 on his Futures Diagnosis blog:

The Millennial issue in the workplace has become symptomatic of the uncertainty of the ‘information age’ which exaggerates the novelty of the present at the expense of the past. This generational shift is regarded as unprecedented and a unique feature of our times. The workplace (and indeed, the world) is now divided into two periods: the past where everything remained the same with little change and the current moment with its constant change where change and disruption are incessant.

This rhetoric of unprecedented change is precisely that, rhetoric. What about the generational shift that occurred in the 1960s? The rise of the teenager in the post-War period was indeed unprecedented and had a huge impact on Western society. But did this result in the end of the enterprise as we know it? No, the exact opposite. It helped to forge the enterprise as we know it.

This is spot on. As I’ve argued before, what has changed in the last decade is the enterprise’s level of awareness of stuff that has previously gone on behind its back.

Throughout the so-called “mass media” era, managers were encouraged to delude themselves that they had the attention of their employees and customers, who were in reality talking amongst themselves all along.

The web puts an end to the delusion. It acts like Douglas Adams’ Total Perspective Vortex:

… allegedly the most horrible torture device to which a sentient being can be subjected.

When you are put into the Vortex you are given just one momentary glimpse of the entire unimaginable infinity of creation, and somewhere in it a tiny little mark, a microscopic dot on a microscopic dot, which says, “You are here.”

Why is the web like this? Because of the convergence of communications, entertainment and commerce into a single seamless mass.

Once upon a time, television appeared to be an uncontested safe harbour for entertainment and commerce, the corporate-networked desktop PC a clearly bounded productivity tool. Sociability and communication happened out of sight and out of mind.

Now those things are collapsing in on each other. When commercial messages have to compete with pictures of your kids, cute kittens and plans for nights out, there is no contest. When employees openly use the same tools to converse with their peers as to conduct business it becomes clear at once that bonds of friendship are stronger than those of salaried fealty. When even the biggest brand is reduced to a fraction of one percent of searches on the web, it becomes just another microscopic dot on a microscopic dot.

These truths are not new, but the tools to discover them are.

Executives stepping out of the Vortex for the first time are understandably mind-blown. Realising quite how insignificant their businesses and products are in the lives of their consumers, they become easy prey to social media’s snake-oil salesforce, who promise to swell the ranks of their Twitter followers and guarantee instant Google gratification.

Maybe they’d do better to remember that they were young once, and that, as Adams wrote: “In an infinite universe, the one thing sentient life cannot afford to have is a sense of proportion.”

Ten years on, can we stop worrying now?

Ten years ago this month the Sunday Times published an article by Douglas Adams called “How to Stop Worrying and Learn to Love the Internet”. You can read it here.

Some starting observations:

  1. It’s a tragedy that Adams died, aged 49, in 2001, depriving us of more great literature in the vein of the Hitchhiker’s Guide, of genuinely innovative new media projects such as H2G2, and of the witty, insightful commentary we find in the Sunday Times column.
  2. Adams’ insights have stood the test of time. Everything he wrote at the end of the Nineties stands true as we near the start of the Tens.
  3. We still haven’t stopped worrying.

Adams from 1999:

… there’s the peculiar way in which certain BBC presenters and journalists (yes, Humphrys Snr., I’m looking at you) pronounce internet addresses. It goes ‘wwwDOT … bbc DOT… co DOT… uk SLASH… today SLASH…’ etc., and carries the implication that they have no idea what any of this new-fangled stuff is about, but that you lot out there will probably know what it means.

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on…

2009: John Humphrys is still huffing and puffing [Update 3/9/09 - further proof provided!], and…

you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

The moral panic continues, now transferred to social networking and camera phones.

And Douglas Adams hit the nail on the head in taking the term “interactive” to task:

the reason we suddenly need such a word is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

The same fallacy persists, now transferred from the term “interactive” to “social”.

Ten years ago, Douglas Adams identified a few problems.

  • “Only a minute proportion of the world’s population is so far connected” – this one’s well on the way to being fixed, as much by the spread of internet-capable mobile devices as by desktop or laptop PCs.
  • It was still “technology,” defined as “‘stuff that doesn’t work yet.’ We no longer think of chairs as technology, we just think of them as chairs.” – has the internet in 2009 reached the same level of everyday acceptance as chairs? Almost, I think, though the legs still fall off with disappointing regularity.

The biggest problem, wrote Adams, is that “we are still the first generation of users, and for all that we may have invented the net, we still don’t really get it”. Invoking Steven Pinker’s The Language Instinct (read this too, if you haven’t already), he argued that it would take the next generation of children born into the world of the web to become really fluent. And for me that’s been the most amazing part. Reflecting the other day on Tom Armitage’s augmented reality post to the Schulze and Webb blog, I realised that I see that development in my own children’s engagement with technology.

  • At birth a child may assume that anything is possible: a handheld projector holds no special amazement for my three-year-old.
  • Through childhood we are trained, with toys among other things, to limit our expectations about how objects should behave. My six-year-old, who has been trained by the Wii, waves other remote controls about in a vain attempt to use gestures.
  • My nine-year-old, more worldly-wise, mocks him for it.

We arrive in the world Internet-enabled and AR-ready; it’s just that present-day technology beats it out of us. I work for the day when this is no longer the case.

Last words to Douglas Adams, as true today as in 1999:

Interactivity. Many-to-many communications. Pervasive networking. These are cumbersome new terms for elements in our lives so fundamental that, before we lost them, we didn’t even know to have names for them.

Update 3/9/09: Debate about Twitter on the Today programme, and Kevin Anderson takes up the theme.

Here Comes Everybody bigger (and smaller) than ever before

Back in May I blogged about Clay Shirky‘s book “Here Comes Everybody”. I was torn: I wanted to believe that social media could indeed make the world a better place, yet my inner history graduate protested that people are people, and have communicated and interacted for good and ill since time immemorial.

In “Television may be the gin of the information age, but that doesn’t mean the web is pure water” I questioned Clay’s contention that the web unlocks a cognitive surplus previously wasted on one-way television viewing.

In “Erm, excuse me, but I think Everybody was here all along” I challenged the title of Clay’s book: surely it was the media that was late to everybody’s party, rather than the other way round.

So when David Cushman on Faster Future gave his readers the chance to put questions to Clay for a video interview, I put in my twopence worth (as with everything here, in a personal capacity that does not necessarily represent the opinions, strategies or positions of my employer):

I’d like to know why Clay chose the title “Here Comes Everybody”? I rather thought that everybody was here all along, in that communicating and self-organising have been characteristics of human society for thousands of years. Is technology really changing people’s behaviour, or simply making existing behaviour more visible in the online space?

Thanks to David for asking my question, and to Clay for answering it so eloquently. Here’s the video…

(and David’s accompanying blog post)

Do I buy the answer? I’m not sure, though I’m pleased to see Clay’s focus on people, not technology, as the driving factor. “I’m not a technical determinist… it’s the novelty of scale,” he says.

Make sure to watch right to the end for another gem. I love the idea that things can now happen globally on a much smaller scale than ever before, as well as at large scale in the mighty networked crowd.

Other episodes of David Cushman’s Clay Shirky interview here, here, here and here.

Second verse, same as the first, a little bit louder and a little bit worse

Two recent news stories continue my theme that social media doesn’t so much change people’s behaviour as expose pre-existing behaviours for all to see, often with unexpected consequences.

Exhibit 1: ‘Dumbest criminal’ records crimes

A Leeds man has been dubbed the city’s “dumbest criminal” by a councillor for posting videos of anti-social behaviour on the YouTube website.

Andrew Kellett, 23, from Stanks Drive, Swarcliffe, published 80 videos and was given an interim anti-social behaviour order (Asbo) by Leeds magistrates.

Kellett has been previously convicted of various offences but the Asbo stops him from boasting of his activities.

BBC News, 21 May 2008

This one’s fairly straightforward: people have been speeding, racing, dodging taxi fares and stealing petrol since the advent of the automobile. But even as some wring their hands over the spread of CCTV and enforcement cameras, others now bravely disintermediate the authorities altogether. Why wait for your crimes to be exposed when you can post them on the internet yourself?

Our legal system’s response? Stop, you’re making it too easy! Shooting fish in a barrel is one thing, but fish who voluntarily jump into the barrel and bob up to the surface with targets tattooed on their bellies – where’s the fun in that? So he gets an ASBO to stop him putting any more of his crimes on YouTube.

Exhibit 2: Students ‘had hints’ before exam

An exam board is investigating suggestions that some teachers gave students hints about what questions would be in an A-level biology exam.

Discussions in an online student forum ahead of OCR’s A2 biology practical identified key areas for revision.

OCR said it would watch the results to see if anyone had gained an advantage.

BBC News, 28 May 2008

Now, I reckon teachers with an inside track on the practical exam have always discreetly “advised” pupils what to revise. Not to do so when you’ve shepherded a bunch of teenagers through the course material for the best part of two years would be almost inhuman, even without the pressure to perform in league tables. Exam boards must have long realised this conflict of interest.

It takes a bunch of students chatting in an online forum to force them to admit the situation and “monitor” results. The Facebook generation may be adept at negotiating the social intricacies of poking, but it seems some of them have totally failed to grasp the point of a nod and a wink. And it only takes a few to spoil it for everyone.

People being people, much as they’ve always been: loving, creating, cheating and scheming in the same proportions as they always did. The new variable is visibility, and that changes everything.

Erm, excuse me, but I think Everybody was here all along

It’s taken me a while (and 83 more pages of Here Comes Everybody) to understand my unease with the “technology changes everything” discourse around social media, and now to reach an alternative hypothesis. In my last post I questioned whether the advent of the internet in the place of television could, as Clay Shirky suggests, awaken some kind of latent creativity and collaboration. Could the web really turn the tables on the mass media, humble big corporations and bring about revolutions?

Here Comes Everybody contains a number of such vignettes to back up the case for the technology-led societal shift: the phenomenal accumulation of quality volunteer-contributed content in Wikipedia, British students’ Facebook revolt against changes to their HSBC bank charges, Belarus “flash mob” protests, and so on. Nothing like these things could happen, the story goes, without new tools built on top of mobile phones and the internet.

Except that they could, and did. Because for every story of 21st Century people getting together to achieve something amazing using new technology, there’s a story from history of people who did much the same without the benefit of the world wide web. One of these even gets into Shirky’s book: the 1989 fall of the Berlin Wall and all that it stood for. But to that we might add any number of 20th Century educational movements such as the Workers’ Educational Association, student boycotts of Barclays and Nestlé in the 1980s, and the demonstrations of May 1968 (the same year, by the way, that a contract was awarded to build something called the Arpanet).

These big things, of course, are just the tip of the iceberg. To these we must add countless more localised acts of collaboration and creativity: the village antiques society of which my grandmother was treasurer, the baby-sitting circle where my mum and dad traded nights out with other parents using curtain-rings as currency, countless fanzines photocopied and posted. Sure, it was a little harder to shift ideas around the world, but from what I can recall we mostly managed OK. After all, making and sharing stuff are two of the most defining characteristics of being human.

So how come it still feels like the internet is changing everything? I have a suggestion.

When Clay Shirky talks in his blog post about a massive television-related bender spanning the whole second half of the 20th Century, he’s half right. But it wasn’t the mass of the population that was rendered senseless by the broadcast media – no, they kept on creating and collaborating much as people always have. Rather, the intoxication induced by television was mainly in the minds of big business and mass media. Broadcasters and brands became so drunk with the power of pushing content one-way into people’s living rooms that they forgot that their “audience” might be busy doing other things.

It was a wise executive who admitted “I know half my advertising doesn’t work, I just don’t know which half” because the mythical housewife never was waiting patiently for the television to tell her which brand of soap powder to buy. She was too busy chatting to her next-door neighbour while they scrubbed their doorsteps, or making bunting to string along the street on carnival day. But business, the media and government didn’t get that. It was their tragedy that there was no return path. Information flowed in only one direction – away from them – leaving them to revel in their own self-importance.

It’s my contention that the amount of collaboration and creativity in the world is not changing greatly as a result of new communications technologies. There may be a little incremental creation, but mostly it substitutes for activities that have gone on in some shape or other for thousands of years. What has changed is that new technologies make those old activities more visible. All those conversations used to happen in draughty village halls, through the post and over the phone. Now they are on the web for all to search and to see. It’s no longer possible for the mass media and big businesses, or even governments, to imagine that they have it all their own way, because the curtain has been drawn back to reveal just how irrelevant some of them have become.

It’s not so much a case of “Here Comes Everybody”, as of “Everybody Was Here All Along”. People aren’t late to this party, technology and business are. Only by understanding that can traditional organisations have a chance of being welcomed into the conversation. If they come at this change from a technology point of view – thinking that they’re going to instantly enable incremental communications for an amazed and grateful populace – then they’ll likely fail to make the grade. But if they understand that it’s mainly substitutional then they’ll see why their customers set the bar so high.

People have been communicating and interacting for thousands of years without the help of mobile phones and computers. They have developed sophisticated ways of doing so. Social niceties and nuances make their collaborations highly efficient. If you or your business want to be a part of that you’d better first watch and learn. See how natural the conversations are, and how easily people negotiate complex issues of coordination and collaboration. Then try to design tools and talk in a language that matches that quality. Or to put it another way: Here Comes Technology, Late As Usual (but if you sit quietly at the back for a bit Everybody might let you join in).

Update 2 October 2008: David Cushman interviewed Clay Shirky in London and is posting a series of videos at Faster Future, including an answer to my question. Worth a look.

Update 1 November 2008: Simon Collister is not alone. I still haven’t finished my copy either.

Update 11 October 2011: John Dodds on the (re)discovery of the second screen.

Television may be the gin of the information age, but that doesn’t mean the web is pure water

Flickr - in the future we will all wear shiny suits and watch bright red televisions

The new media revolutionary in me so much wants to believe Clay Shirky’s “Here Comes Everybody” hypothesis: that the web heralds a new era of mass participation, collaboration and creativity. With our mobile phones and broadband connections we remade society, so that my five-year-old son cannot conceive of a world without the web (“Daddy, if people didn’t have computers, how did they buy things from the Internet?” he once asked.) We are the generation that Changed Everything. How cool is that?

But then my inner history graduate rebels. I’m innately suspicious of anyone who says human behaviour has changed fundamentally. The joy of history is in its humanity, in all the stories that show how our ancestors were ordinary people who laughed, loved, tricked and schemed just like we do today. If Baby Boomers claim they invented sex, just refer them to Roman pottery and the satirical cartoons of the 18th Century.

And so I believe in our bright human future: that so long as people survive, they will behave much like their stone age forebears. The context may be different, but people are people across time and space. And that’s a Good Thing.

So I’m deeply conflicted when, on the blog accompanying his book, Shirky launches a puritanical attack on television as a sink that dissipates our thoughts, and compares it to the socially sedative role of gin in our early industrial revolution cities. The theory goes like this:

The transformation from rural to urban life was so sudden, and so wrenching, that the only thing society could do to manage was to drink itself into a stupor for a generation. The stories from that era are amazing – there were gin pushcarts working their way through the streets of London.

And it wasn’t until society woke up from that collective bender that we actually started to get the institutional structures that we associate with the industrial revolution today. Things like public libraries and museums, increasingly broad education for children, elected leaders–a lot of things we like–didn’t happen until having all of those people together stopped seeming like a crisis and started seeming like an asset.

It wasn’t until people started thinking of this as a vast civic surplus, one they could design for rather than just dissipate, that we started to get what we think of now as an industrial society.

Television, the dominant mass medium of the second half of the 20th Century, is our modern-day equivalent of gin. But despair not, for Shirky has us all roused from our stupor by the Internet in all its chaotic glory, millions of Wikipedia edits, captioned cat photos and all.

The fact that Internet users watch less TV has been a commonplace for some time, so Shirky builds on this to show that if everyone watches just a little less TV and participates a little more online, whole new sources of value will be unlocked from our newly productive endeavours. We The Web Users can be morally superior to the Telly Addicts of the past: they consumed, we create.
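For a sense of the scale Shirky has in mind, his gin post offers some back-of-envelope figures (his numbers, roughly recalled, so treat them as approximate: all of Wikipedia represents something like 100 million hours of cumulative human thought, while Americans watch around 200 billion hours of television a year). The arithmetic works out like this:

```typescript
// Shirky's rough figures: the whole of Wikipedia embodies ~100 million hours
// of cumulative human thought; Americans alone watch ~200 billion hours of
// television every year.
const wikipediaHours = 100_000_000;
const usTvHoursPerYear = 200_000_000_000;

// One year of US television time, measured in Wikipedia-sized projects.
console.log(usTvHoursPerYear / wikipediaHours); // 2000
```

Two thousand Wikipedias a year from one country’s viewing habit: that is the surplus Shirky wants to unlock.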

It’s a great analogy, but I’m suspicious of the conclusion. Why? Because TV watching is not the only thing being edged out to make way for all those hours online. Not only do we watch less TV, we also sleep less and spend less time interacting with our families.

I started to list some things I do less as a result of having the internet:

  • watch tv
  • talk about tv
  • buy magazines
  • phone people up
  • write letters
  • go to the shops
  • go to the library
  • queue to pay bills
  • look out of the window on trains
  • sleep

Now a couple of these things – watching TV, buying magazines – do seem like gin: one-way attention-sink activities (though, social beings that we are, it’s never long before two or more people assembled before a television set are debating and discussing the content, hurling abuse at the screen or fighting over the remote control).

But what about the others?

I now communicate less by phone and letter, and more by email or text. Where’s the cost in that? Well, I reckon it’s in the nuances, the tone of voice, the side-tracked conversations, the pictures scribbled in the margins, that just don’t happen so much online. So I’ve traded some inconvenient but rich communications media for handier, cheaper, but less subtle ones.

I shop online for stuff so I don’t have to go to the shops, and I Google for information so I don’t have to go to the library. So there goes a whole load of opportunities for collaboration – chance meetings with friends, taking my cue subconsciously from what other shoppers are looking at, and so on.

Then there’s the contemplation time. I used to stand in queues, look out of the window, ignore the TV and let my mind wander. Greater efficiency in transactions and communications is squeezing out those times, and I wonder if the quality of my communications is suffering just as their quantity increases. And that’s before the sleep deprivation kicks in, tiredness and drunkenness sharing many symptoms.

So maybe TV was the gin of the information age, but the internet has a way to go before it’s the clean drinking water that will unleash our productivity. Exchanges on online social networks are so far a pale shadow of the sophisticated interactions that happen when people get together in the real world. And whatever the medium, tomorrow’s people are highly likely to remain much like the people we know today: at once creative and lazy, generous and greedy. If attention is a finite resource, so surely is virtue.

The irony that I’m saying this on a blog is not lost on me. And no, I’m not about to retreat to my log cabin with a manual typewriter, but I do believe there are a few things we need to work on. To do that, we need to understand the good and bad stuff we’re leaving behind, as much as the huge potential of the new technology we embrace.

Disclosure: I write this post having made it up to page 99 of Here Comes Everybody. It’s a great, thought-provoking book and I fully expect to revise my opinion by the time I reach the end. Please consider this a review in perpetual beta :)