And yet it moves! Digital and self-organising teams with a little help from Galileo


This summer, after a lovely two-week holiday in Tuscany, I returned to Leeds and straight into a classroom full of government senior leaders discussing agile and user-centred design. Their challenges set me thinking once more about the relationship between technology and social relations in the world of work. One well-known story from the Italy of 400 years ago is helping me make sense of it all.

Galileo’s sketches of the moon

1. Magnification

Galileo Galilei did not invent the telescope but he greatly improved it, reaching more than 20x magnification and pointing it for the first time at the seemingly smooth celestial bodies of the night sky. In March 1610, he published drawings of the universe as never seen before. What seemed to the naked eye a handful of constellations appeared through Galileo’s telescope as thousands of teeming stars. He showed the moon pocked with craters, mountain ranges and plains. He used his observations and calculations of the planets to support a long-held but never proven conjecture: that the earth and the other planets travel around the sun.

With its twin, the microscope, the telescope was a transformative technology of Galileo’s age, affording new ways of seeing things that people thought they already knew well. Our tools are the smartphone and the web. They too change how we see the world in many ways. Most of all they shed new light upon, and throw into relief, the detail of the social. Minutiae of conversations and interactions that used to occur fleetingly in private before disappearing into thin air can now be shared, stored and searched in previously unimaginable ways.

So let’s focus our gaze upon the world of work. (I am not the first to draw this parallel. Steve Denning writes eloquently about what he calls the “Copernican Revolution In Management”.) In a pre-digital era, organisations appeared to be made of smooth reporting lines, opaque meeting agendas and crisp minutes. Now the wrinkles and pits of communication and interaction are exposed in detail for all to see – every email, every message, every line of code.

Digital communications facilitate, magnify and expose people’s timeless habits of co-operation. These social phenomena are not new. It’s just that, until recently, indicators of productive informality were hidden from view. In the absence of evidence, we focused more attention, and founded our theories of management, on things that were immediately obvious: explicit hierarchies and formal plans.


Now by observing the details, we can confirm a long-held theory: that self-organisation is rife in the workplace. The new communications tools reveal…

  • the human voices of individuals and interactions in Slack groups, wikis and code repositories
  • the depth of customer collaboration in Twitter replies and support forums
  • the endless resourcefulness of teams responding to change in Trello boards and live product roadmaps.


We should be careful not to over-claim for this shift. As a student of history and the social sciences, I am instinctively suspicious of any narrative in which human nature suddenly changes its spots. I come to bury mumbo-jumbo, not to praise it. I reject the teal-coloured fantasy of Frederic Laloux’s “next stage of human consciousness.” More likely the behaviours Laloux identifies have always been with us, only hidden from view. Future generations may judge that we are living through a paradigm shift, but such things can only be confirmed after the fact.

2. Empiricism

The day after Galileo’s publication, the stars and planets carried on doing their thing, much as they had for the billions of days before. After all, heliocentrism was not even an original idea. Aristarchus of Samos had proposed it in the 3rd Century BC; Islamic scholars discussed it on and off throughout the middle ages; and Nicolaus Copernicus himself had revived it more than 20 years before Galileo was born. In one way, nothing had changed. In another, everything had changed. As with another famous experiment – dropping different objects from the Leaning Tower of Pisa to test the speed of falling bodies – Galileo was all about empiricism. He did not ask whether a proposition was more elegant to the mind’s eye or more convenient to the powerful. He designed tests to see whether it was true.

The Manifesto for Agile Software Development is itself an empirical text, founded in the real-world experiences of its authors. It begins (my emphasis): “We are uncovering better ways of developing software by doing it and helping others do it.” The authors set out four pairs of value statements in the form “this over that“, stressing “that while there is value in the items on the right, we value the items on the left more”.

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

These were the values of 17 balding and bearded early Noughties software professionals who gathered at the Snowbird ski resort in Utah. It would be easy to mistake the manifesto for a creed – a set of assertions that true followers must accept as gospel. But they’re not that at all. This is not a religion. Empiricism says we have the power to see for ourselves.

In scores of learning and development sessions over the past couple of years, my associates and I have conducted a little experiment of our own. This is the method:

  • Without sharing the text of the manifesto, we hand out eight randomly ordered cards each showing a different value statement – “contract negotiation”, “working software”, “following a plan” and so on.
  • Then we ask participants to rank them in the order that they would value when delivering a service.
  • There are no right or wrong answers. We just want to understand what they value.

The result: 90% of the time the items on the left bubble to the top of the list – regardless of participants’ roles and experiences. Of course many project managers say they value “following a plan”, but most of them value “responding to change” more highly. I had a couple of contract managers on one course. They ranked the “contract negotiation” card pretty high up their list. But they put “customer collaboration” at the top.
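(For the curious, here is a minimal sketch of how results like that could be tallied. The card names come straight from the manifesto quoted above; the function name, the ranking data and the scoring rule are hypothetical illustrations, not part of our actual workshop method.)

```python
# Minimal sketch (hypothetical): score one participant's card sort by counting
# how many "left-hand" manifesto values they ranked above their paired
# "right-hand" values.

AGILE_PAIRS = [
    ("individuals and interactions", "processes and tools"),
    ("working software", "comprehensive documentation"),
    ("customer collaboration", "contract negotiation"),
    ("responding to change", "following a plan"),
]

def left_preferred(ranking):
    """Count pairs where the left-hand card appears higher in the ranking."""
    position = {card: i for i, card in enumerate(ranking)}  # 0 = most valued
    return sum(position[left] < position[right] for left, right in AGILE_PAIRS)

# One illustrative participant: a contract manager who ranks "contract
# negotiation" high, but still puts "customer collaboration" above it.
example_ranking = [
    "customer collaboration", "responding to change", "working software",
    "individuals and interactions", "contract negotiation", "following a plan",
    "processes and tools", "comprehensive documentation",
]
print(left_preferred(example_ranking))  # -> 4 (all four left-hand values win)
```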

When people recall their best experiences at work, the things they describe are invariably the things on the left. Those who have been around big organisations for 20 years or more often speak in terms of “that’s how we used to do things” – before the so-called professionalisation of “information technology” tried to replace trust and teamwork with contracts and stage gates. For others there are more recent stories of emergencies and turnarounds when everyone pulled together around a common cause and just got stuff done in an amazingly productive, naturally iterative rhythm.

3. Reaction

From the time of Copernicus in the 1540s until Galileo’s work in the 1610s, Catholic Church leaders were mostly comfortable with heliocentrism. While Copernicus’ propositions remained “just a theory” they were interesting but unthreatening. But Galileo’s evidence, his assertion of empiricism over the authority of Aristotelian ideas, provoked a backlash. The Church accused him of heresy and threatened him with torture until he solemnly recanted his view that the earth moved round the sun. This he did, though he allegedly muttered under his breath, “And yet it moves.”

That’s the thing about this set of propositions we call “agile”, or “lean”, or “post-agile” or whatever. Often we contrast these with something called “waterfall” as if these were equally valid, alternative ways of getting things done. I think that’s a mistake. They’re not things we pick and choose, any more than Galileo chose to make the earth travel round the sun. Agile and waterfall are alternative theories of how things get done – how things have always got done.

Digging a little into the history, it turns out that “waterfall” was never meant to be taken literally:

“Dr Winston Royce, the man who is often but mistakenly called the “father of waterfall” and the author of the seminal 1970 paper Managing the Development of Large Software Systems, apparently never intended for the waterfall caricature of his model to be anything but part of his paper’s academic discussion leading to another, more iterative version.” – Parallel Worlds: Agile and Waterfall Differences and Similarities

But when people feel threatened by new ideas, there’s a risk, as happened with astronomy, that they back further into their corner and end up espousing more extreme views than they would have held if left unchallenged.

Some who attribute their successes to top-down command-and-control management may fear they have a lot to lose from the growing evidence base for self-organisation. We need to find unthreatening ways to talk to the small group of people – in my experience less than 10% – for whom the values of the left-hand side do not spring naturally to the top of the list.

Coexistence is possible. Equivalence is not. Many religious believers, for example, manage to square their faith in a divine creator with the iterative circle of Darwinian evolution. What’s not credible, though, is a like-for-like, pick-and-mix approach to agile and waterfall. Nobody argues for evolution of the flea and creation of the elephant, because one of these accounts is based on empiricism, the other on an appeal to authority.

4. Conclusion

It took more than a century for the Catholic Church to overcome its aversion to heliocentrism. Meanwhile scientists in the Protestant world continued to circulate and build on Galileo’s findings. Remember Isaac Newton: “If I have seen further, it is by standing on the shoulders of giants.” The last books by Copernicus and Galileo were finally removed from the Church’s banned list in 1835.

If the last few years of domestic and international affairs have taught us anything, it should be that the arrow of progress can go backwards as well as forwards. Rightness and rationality can easily lose out to conflicting interests. If we believe there’s a better way, then it’s down to every one of us to model that better way, in how we work, and how we talk about our work. We can do this by:

  • working out loud to make our collaboration visible and legible
  • collecting and sharing evidence of self-organisation in action
  • resisting mumbo jumbo with simple, factual accounts of how we get stuff done
  • accepting coexistence with other theories but never false equivalence.

In 1992, Pope John Paul II expressed regret for how the Galileo affair was handled. But plans to put a statue of the astronomer in the grounds of the Vatican proved controversial, and were scrapped in 2009.

On the way to dConstruct: a social constructionist thought for the day

A desire to put some theoretical Acrow props under my vague unease with the determinist narrative of so much of our technology discourse has led me to the writing of the French anthropologist Bruno Latour. His work on the social construction of science, an ethnography of the R&D lab, has a special resonance for me, a humanities graduate who finds himself colleague to a legion of French engineers.

I’m stumbling intermittently through Catherine Porter’s translation of Latour’s 1991 work “We Have Never Been Modern”, as a prelude to David Edgerton’s “The Shock of the Old”. At times it feels a bit like eating up the broccoli before allowing myself dessert, but rich, buttery morsels like the following make it all worthwhile.

The story so far.

Latour argues that modernity, from Civil War England onwards, managed its contradictions by placing boundaries between nature and society. Thomas Hobbes, writer of the Leviathan, was taken up as a founder of political philosophy while Robert Boyle, he of the air pumps, was channelled as a natural philosopher and pioneer of scientific method. In truth both men speculated on both politics and science, but this inconsistency was whitewashed by their modern successors seeking only the pure narrative of one or the other.

And so we are today in a world still riven by CP Snow’s two cultures, where right-wing bloggers can grab acres of media coverage against climate scientists by finding just the tiniest trace of political “contamination” on the lab’s email servers.

But I wonder if the disconnection and reconnection of nature and society is also a useful way to understand some of the ideas I’m expecting to hear today at dConstruct, a conference at the cutting edge of technology and media convergence.

The 19 years since Latour published “Nous n’avons jamais été moderne” roughly spans my working life so far. I’ve witnessed the amazing things that can happen when you expose the humanities-soaked world of newspapers, books and TV to the attentions of software engineers and computer scientists. The results have been delightful and depressing, often both at the same time. Who knew back then that floaty copywriters would have to cohabit – for better or for worse – with the number-crunchers of search engine optimisation?

This fusing of the worlds of media and technology is only just beginning, and the next step is evident in the hand-held, touch-sensitive, context-aware marvel of creation that is the latest smartphone.

Hitherto we have seen the world of human-created information, the texts of the ancients and the tussles of our own times, through the pure window of the newspaper, the book, the TV, the PC screen. But the smartphone is a game-changer, like Robert Boyle’s air pump. With its bundle of sensors – of location, of proximity, and in the future no doubt heat, light, pressure and humidity – it becomes a mini-lab through which we measure our world as we interact with it.

All manner of things could be possible once these facts of nature start to mix with the artifacts of society. My Foursquare checkins form a pattern of places created by me, joined with those of my friends to co-create something bigger and more valuable. My view of reality through the camera of the phone can be augmented with information. We will all be the scientists, as well as the political commentators, of our own lives. This is the role of naturalism in my “Mobile Gothic” meander.

To recycle Latour on Robert Boyle’s account of his air pump experiments:

“Here in Boyle’s text we witness the intervention of a new actor recognised by the new [modern] Constitution: inert bodies, incapable of will and bias but capable of showing, signing, writing and scribbling on laboratory instruments before trustworthy witnesses. These nonhumans, lacking souls but endowed with meaning, are even more reliable than ordinary mortals, to whom will is attributed but who lack the capacity to indicate phenomena in a reliable way. According to the Constitution, in case of doubt, humans are better off appealing to nonhumans. Endowed with their new semiotic powers, the latter contribute to a new form of text, the experimental science article, a hybrid of the age-old style of biblical exegesis – which had previously been applied only to the Scriptures and classical texts – and the new instrument that produces new inscriptions. From this point on, witnesses will pursue their discussions in its enclosed space, discussions about the meaningful behaviour of nonhumans. The old hermeneutics will persist, but it will add to its parchments the shaky signature of scientific instruments.”

I don’t yet know where I stand in this picture. Am I the experimenter, his audience, or the chick in the jar?

An Experiment on a Bird in an Air Pump by Joseph Wright of Derby, 1768

When too much perspective can be a bad thing

An article by my former colleague and TEDx Leeds speaker Norman Lewis reminds me of an ingenious device imagined by Douglas Adams in the Restaurant at the End of the Universe. Yes, I know you all like a good Douglas Adams quote.

First, though, listen to Norman, writing about ‘Millennials’ and Enterprise 2.0 on his Futures Diagnosis blog:

The Millennial issue in the workplace has become symptomatic of the uncertainty of the ‘information age’ which exaggerates the novelty of the present at the expense of the past. This generational shift is regarded as unprecedented and a unique feature of our times. The workplace (and indeed, the world) is now divided into two periods: the past where everything remained the same with little change and the current moment with its constant change where change and disruption are incessant.

This rhetoric of unprecedented change is precisely that, rhetoric. What about the generational shift that occurred in the 1960s? The rise of the teenager in the post-War period was indeed unprecedented and had a huge impact on Western society. But did this result in the end of the enterprise as we know it? No, the exact opposite. It helped to forge the enterprise as we know it.

This is spot on. As I’ve argued before, what has changed in the last decade is the enterprise’s level of awareness of stuff that has previously gone on behind its back.

Throughout the so-called “mass media” era, managers were encouraged to delude themselves that they had the attention of their employees and customers, who were in reality talking amongst themselves all along.

The web puts an end to the delusion. It acts like Douglas Adams’ Total Perspective Vortex:

… allegedly the most horrible torture device to which a sentient being can be subjected.

When you are put into the Vortex you are given just one momentary glimpse of the entire unimaginable infinity of creation, and somewhere in it a tiny little mark, a microscopic dot on a microscopic dot, which says, “You are here.”

Why is the web like this? Because of the convergence of communications, entertainment and commerce into a single seamless mass.

Once upon a time, television appeared to be an uncontested safe harbour for entertainment and commerce, the corporate-networked desktop PC a clearly bounded productivity tool. Sociability and communication happened out of sight and out of mind.

Now those things are collapsing in on each other. When commercial messages have to compete with pictures of your kids, cute kittens and plans for nights out, there is no contest. When employees openly use the same tools to converse with their peers as to conduct business it becomes clear at once that bonds of friendship are stronger than those of salaried fealty. When even the biggest brand is reduced to a fraction of one percent of searches on the web, it becomes just another microscopic dot on a microscopic dot.

These truths are not new, but the tools to discover them are.

Executives stepping out of the Vortex for the first time are understandably mind-blown. Realising quite how insignificant their businesses and products are in the lives of their consumers, they become easy prey to social media’s snake-oil salesforce, who promise to swell the ranks of their Twitter followers and guarantee instant Google gratification.

Maybe they’d do better to remember that they were young once, and that, as Adams wrote: “In an infinite universe, the one thing sentient life cannot afford to have is a sense of proportion.”

Ten years on, can we stop worrying now?

Ten years ago this month the Sunday Times published an article by Douglas Adams called “How to Stop Worrying and Learn to Love the Internet”. You can read it here.

Some starting observations:

  1. It’s a tragedy that Adams died, aged 49, in 2001, depriving us of more great literature in the vein of the Hitchhiker’s Guide, of genuinely innovative new media projects such as H2G2, and of the witty, insightful commentary we find in the Sunday Times column.
  2. Adams’ insights have stood the test of time.  Everything he wrote at the end of the Nineties stands true as we near the start of the Tens.
  3. We still haven’t stopped worrying.

Adams from 1999:

… there’s the peculiar way in which certain BBC presenters and journalists (yes, Humphrys Snr., I’m looking at you) pronounce internet addresses. It goes ‘wwwDOT … bbc DOT… co DOT… uk SLASH… today SLASH…’ etc., and carries the implication that they have no idea what any of this new-fangled stuff is about, but that you lot out there will probably know what it means.

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on…

2009: John Humphrys is still huffing and puffing [Update 3/9/09 – further proof provided!], and…

you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

The moral panic continues, now transferred to social networking and camera phones.

And Douglas Adams hit the nail on the head in his taking to task of the term “interactive”:

the reason we suddenly need such a word is that during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

The same fallacy persists, now transferred from the term “interactive” to “social“.

Ten years ago, Douglas Adams identified a few problems.

  • “Only a minute proportion of the world’s population is so far connected” – this one’s well on the way to being fixed, as much by the spread of internet-capable mobile devices as by desktop or laptop PCs.
  • It was still “technology”, defined as “stuff that doesn’t work yet. We no longer think of chairs as technology, we just think of them as chairs.” – has the internet in 2009 reached the same level of everyday acceptance as chairs? Almost, I think, though the legs still fall off with disappointing regularity.

The biggest problem, wrote Adams, is that “we are still the first generation of users, and for all that we may have invented the net, we still don’t really get it”. Invoking Steven Pinker’s The Language Instinct (read this too, if you haven’t already), he argued that it would take the next generation of children born into the world of the web to become really fluent. And for me that’s been the most amazing part. Reflecting the other day on Tom Armitage’s augmented reality post to the Schulze and Webb blog, I realised that I see that development in my own children’s engagement with technology.

  • At birth a child may assume that anything is possible: a handheld projector holds no special amazement for my three-year-old.
  • Through childhood we are trained, with toys among other things, to limit our expectations about how objects should behave. My six-year-old, who has been trained by the Wii, waves other remote controls about in a vain attempt to use gestures.
  • My nine-year-old, more worldly-wise, mocks him for it.

We arrive in the world internet-enabled and AR-ready; it’s just that present-day technology beats it out of us. I work for the day when this is no longer the case.

Last words to Douglas Adams, as true today as in 1999:

Interactivity. Many-to-many communications. Pervasive networking. These are cumbersome new terms for elements in our lives so fundamental that, before we lost them, we didn’t even know to have names for them.

Update 3/9/09: Debate about Twitter on the Today programme, and Kevin Anderson takes up the theme.

Here Comes Everybody bigger (and smaller) than ever before

Back in May I blogged about Clay Shirky‘s book “Here Comes Everybody”. I was torn: I wanted to believe that social media could indeed make the world a better place, yet my inner history graduate protested that people are people, and have communicated and interacted for good and ill since time immemorial.

In “Television may be the gin of the information age, but that doesn’t mean the web is pure water” I questioned Clay’s contention that the web unlocks a cognitive surplus previously wasted on one-way television viewing.

In “Erm, excuse me, but I think Everybody was here all along” I challenged the title of Clay’s book: surely it was the media that was late to everybody’s party, rather than the other way round.

So when David Cushman on Faster Future gave his readers the chance to ask questions for a video interview of Clay I put in my twopenceworth (as with everything here, in a personal capacity that does not necessarily represent the opinions, strategies or positions of my employer):

I’d like to know why Clay chose the title “Here Comes Everybody”? I rather thought that everybody was here all along, in that communicating and self-organising have been characteristics of human society for thousands of years. Is technology really changing people’s behaviour, or simply making existing behaviour more visible in the online space?

Thanks to David for asking my question, and to Clay for answering it so eloquently. Here’s the video…

(and David’s accompanying blog post)

Do I buy the answer? I’m not sure, though I’m pleased to see Clay’s focus on people, not technology, as the driving factor. “I’m not a technical determinist… it’s the novelty of scale,” he says.

Make sure to watch right to the end for another gem. I love the idea that things can now happen globally on a much smaller scale than ever before, as well as at large scale in the mighty networked crowd.

Other episodes of David Cushman’s Clay Shirky interview here, here, here and here.

Second verse, same as the first, a little bit louder and a little bit worse

Two recent news stories continue my theme that social media doesn’t so much change people’s behaviour, as expose pre-existing behaviours for all to see, often with unexpected consequences.

Exhibit 1: ‘Dumbest criminal’ records crimes

A Leeds man has been dubbed the city’s “dumbest criminal” by a councillor for posting videos of anti-social behaviour on the YouTube website.

Andrew Kellett, 23, from Stanks Drive, Swarcliffe, published 80 videos and was given an interim anti-social behaviour order (Asbo) by Leeds magistrates.

Kellett has been previously convicted of various offences but the Asbo stops him from boasting of his activities.

BBC News, 21 May 2008

This one’s fairly straightforward: people have been speeding, racing, dodging taxi fares and stealing petrol since the advent of the automobile. But even as some wring their hands over the spread of CCTV and enforcement cameras, others now bravely disintermediate the authorities altogether. Why wait for your crimes to be exposed when you can post them on the internet yourself?

Our legal system’s response? Stop, you’re making it too easy! Shooting fish in a barrel is one thing, but fish who voluntarily jump into the barrel and bob up to the surface with targets tattooed on their bellies – where’s the fun in that? So he gets an ASBO to stop him putting any more of his crimes on YouTube.

Exhibit 2: Students ‘had hints’ before exam

An exam board is investigating suggestions that some teachers gave students hints about what questions would be in an A-level biology exam.

Discussions in an online student forum ahead of OCR’s A2 biology practical identified key areas for revision.

OCR said it would watch the results to see if anyone had gained an advantage.

BBC News, 28 May 2008

Now, I reckon teachers with an inside track on the practical exam have always discreetly “advised” pupils what to revise. Not to do so when you’ve shepherded a bunch of teenagers through the course material for the best part of two years would be almost inhuman, even without the pressure to perform in league tables. Exam boards must have long realised this conflict of interest.

It takes a bunch of students chatting in an online forum to force them to admit the situation and “monitor” results. The Facebook generation may be adept at negotiating the social intricacies of poking, but it seems some of them have totally failed to grasp the point of a nod and wink. And it only takes a few to spoil it for everyone.

People being people, much as they’ve always been: loving, creating, cheating and scheming in the same proportions as they always did. The new variable is visibility, and that changes everything.

Erm, excuse me, but I think Everybody was here all along

It’s taken me a while (and 83 more pages of Here Comes Everybody) to understand my unease with the “technology changes everything” discourse around social media, and now to reach an alternative hypothesis. In my last post I questioned whether the advent of the internet in the place of television could, as Clay Shirky suggests, awaken some kind of latent creativity and collaboration. Could the web really turn the tables on the mass media, humble big corporations and bring about revolutions?

Here Comes Everybody contains a number of such vignettes to back up the case for the technology-led societal shift: the phenomenal accumulation of quality volunteer-contributed content in Wikipedia, British students’ Facebook revolt against changes to their HSBC bank charges, Belarus “flash mob” protests, and so on. Nothing like these things could happen, the story goes, without new tools built on top of mobile phones and the internet.

Except that they could, and did. Because for every story of 21st Century people getting together to achieve something amazing using new technology, there’s a story from history of people who did much the same without the benefit of the world wide web. One of these even gets into Shirky’s book: the 1989 fall of the Berlin Wall and all that it stood for. But to that we might add any number of 20th Century educational movements such as the Workers’ Education Association, student boycotts of Barclays and Nestle in the 1980s, the demonstrations of May 1968 (the same year, by the way, that a contract was awarded to build something called the Arpanet).

These big things, of course, are just the tip of the iceberg. To these we must add countless more localised acts of collaboration and creativity: the village antiques society of which my grandmother was treasurer, the baby-sitting circle where my mum and dad traded nights out with other parents using curtain-rings as currency, countless fanzines photocopied and posted. Sure, it was a little harder to shift ideas around the world, but from what I can recall we mostly managed OK. After all, making and sharing stuff are two of the most defining characteristics of being human.

So how come it still feels like the internet is changing everything? I have a suggestion.

When Clay Shirky talks in his blog post about a massive television-related bender spanning the whole second half of the 20th Century, he’s half right. But it wasn’t the mass of the population that was rendered senseless by the broadcast media – no they kept on creating and collaborating much as people always have. Rather, the intoxication induced by television was mainly in the minds of big business and mass media. Broadcasters and brands became so drunk with the power of pushing content one-way into people’s living rooms that they forgot that their “audience” might be busy doing other things.

It was a wise executive who admitted “I know half my advertising doesn’t work, I just don’t know which half” because the mythical housewife never was waiting patiently for the television to tell her which brand of soap powder to buy. She was too busy chatting to her next-door neighbour while they scrubbed their doorsteps, or making bunting to string along the street on carnival day. But business, the media and government didn’t get that. It was their tragedy that there was no return path. Information flowed in only one direction – away from them – leaving them to revel in their own self-importance.

It’s my contention that the amount of collaboration and creativity in the world is not changing greatly as a result of new communications technologies. There may be a little incremental creation, but mostly it substitutes for activities that have gone on in some shape or other for thousands of years. What has changed is that new technologies make those old activities more visible. All those conversations used to happen in draughty village halls, through the post and over the phone. Now they are on the web for all to search and to see. It’s no longer possible for the mass media and big businesses, or even governments, to imagine that they have it all their own way, because the curtain has been drawn back to reveal just how irrelevant some of them have become.

It’s not so much a case of “Here Comes Everybody”, as of “Everybody Was Here All Along”. People aren’t late to this party, technology and business are. Only by understanding that can traditional organisations have a chance of being welcomed into the conversation. If they come at this change from a technology point of view – thinking that they’re going to instantly enable incremental communications for an amazed and grateful populace – then they’ll likely fail to make the grade. But if they understand that it’s mainly substitutional then they’ll see why their customers set the bar so high.

People have been communicating and interacting for thousands of years without the help of mobile phones and computers. They have developed sophisticated ways of doing so. Social niceties and nuances make their collaborations highly efficient. If you or your business want to be a part of that, you’d better first watch and learn. See how natural the conversations are, and how easily people negotiate complex issues of coordination and collaboration. Then try to design tools and talk in a language that matches that quality. Or to put it another way: Here Comes Technology, Late As Usual (but if you sit quietly at the back for a bit, Everybody might let you join in).

Update 2 October 2008: David Cushman interviewed Clay Shirky in London and is posting a series of videos at Faster Future, including an answer to my question. Worth a look.

Update 1 November 2008: Simon Collister is not alone. I still haven’t finished my copy either.

Update 11 October 2011: John Dodds on the (re)discovery of the second screen.