The first day of Christmas: back to the future

2016’s slogans pitted past against future: “Take back control”; “Make America great again”. Yet the past can be a platform for positive futures too: think of the optimistic “New Elizabethan” age of England’s early 1950s; or the Kennedys’ “Camelot” in 1960s USA. How might we rekindle those spirits in 2017?

What are you doing? 10 years of continuous partial attention

Today, for me, marks a decade of 140-character updates, 10 years of paying continuous partial attention to hundreds of wonderful people around the world. So I downloaded my Twitter archive and munged it in an Excel pivot table. Here’s what I learned…

2006-2008: what are you doing?


Following just a handful of people, I first experienced Twitter as a text messaging service. SMS, remember that? Every message was an answer to a single, simple question: “What are you doing?”

Within a week I think I understood the archival potential…

And before the year was out, I had an inkling of Twitter’s fragility…

“the places whose main or only selling point is unspoiltness – places we go to witness or take part in something special, but just by being there we destroy whatever that quality was. The perfect village? The perfect bar? Twitter?” – blog post: Polperro

Along the way, I picked up on the emerging conventions of the platform…

My first @ message…

2009: a tipping point


My first proper use of a hashtag…

… and some reckons about what made this platform so compelling…

“Maybe it’s this merging of monologue and dialogue in one service that makes microblogging (or whatever you call it) so powerful a communications tool? One for those of us who, most of the time, are not very good at listening?” – Twitter: where monologues collide

Up to that point, I’d used Twitter to keep up with a particular group of remote friends and colleagues. In 2009, as I recall, the number of users in Leeds hit some kind of critical mass – it became a useful place for conversation about my home city.

In November that year, after a chance Twitter exchange, I lured the author Steven Johnson to Leeds to talk about a personal hero of mine, Joseph Priestley. It rained, and not that many people made it to the talk, but even so…

In fact, almost every fun thing I’ve been involved in since around that time, from Leeds Walkshops to the Global GovJam, has been enabled and enriched by this platform…

2011-12: Re-de-centralisation

It’s far from perfect. Nervous of Twitter’s long-term future, quite a few of us tried to find more open alternatives. I managed 246 identi.ca updates before getting sucked back into the Twitter ecosystem. This single point of control in our communications infrastructure still makes me uneasy.

2012: Unfollowing all the brands and bots

“A few days ago I ran a critical index finger down my Twitter “friends” list, unfollowing a few dozen accounts that did not belong to real people… I’m delighted with the results: my Twitter feed suddenly feels so much more human.” – All brands must die (after a long and happy life)

Since then I’ve kept this rule. Sorry, brands and bots, if you have something interesting to say, I’m sure one of my real friends will pass it on.

In 2012, I left a well-paid, permanent job to freelance in the world of digital service design. I’m pretty sure those people, the people I follow, and who follow me back, made that possible.

Being myself, most of the time

Part of the privilege that comes with playing life on the lowest difficulty setting is being able to be myself on social media, without the need to compartmentalise or anonymise for fear of context collapse. I’ve never felt the need to separate personal and professional identities or to create a closed account for family and friends. I am painfully aware that many others do not enjoy this freedom.

At the same time, Twitter’s liberal approach to multiple accounts and usernames has allowed me to play with the medium. I once spent a few months impersonating the revolutionary journalist Camille Desmoulins, mainly to improve my French language skills…

And I created a mute account so I could share my wonder at living in the future with someone who I knew would appreciate it: @my7yearoldself …

2016: Meet my awesome filter bubble

Over the years the number of people I follow has grown. There are just so many interesting people in the world. But analysing my tweets I found a core of about 30 people whose words I retweet time and again.

I made them into a list, and for the past few days I’ve been consulting this instead of checking my timeline. So far I’m liking the result. By sticking just to this list, I can have a sense of completion, without getting drawn into the endless duration of the infinite scroll.

There’s been a lot of talk over the past weeks and months about whether filter bubbles are a Bad Thing, the cause of mutual mistrust across seemingly unbridgeable divides. My take: everyone needs a filter bubble. How awful would life be without like-minded people to share and reinforce beliefs and interests? Twitter’s asymmetrical follower model and untampered timeline have afforded the possibility of curating my filter bubble in a more controlled and transparent way than other social media platforms allow. I hope they keep those features. The risk arises when we mistake that bubble for the whole world, with everyone outside it as the Other.

This is my personal filter bubble. Sometimes I need to step outside it, but it’s an awesome bubble to be in. Thank you all…

Many others, not on this list, have also contributed to making Twitter great for me – thank you too.

In fact, in 10 years of following fairly liberally, only twice have I unfollowed someone because their ragey tweets were polluting my timeline. Again, I am aware that others have far worse online experiences. Some of the people whose tweets I have most enjoyed are no longer on Twitter. That’s a terrible shame. The platform’s owners and users must work harder to make it a safe place.

Some facts and figures

  • 9188 tweets
  • 6871 with an @
  • 3247 retweets
  • 2661 replies
  • 2570 accounts mentioned
  • 2313 with a #
  • 1210 uses of the pronoun “I”
  • 93 accounts mentioned more than 20 times each
  • 23 clients and connected apps used to tweet
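
These tallies came out of an Excel pivot table, but a few lines of code would do the same job. Here’s a minimal sketch, not my original workings, assuming the column names of the 2016-era tweets.csv archive export (text, source, in_reply_to_status_id, retweeted_status_id):

    import pandas as pd

    # Assumes the 2016-era Twitter archive layout; column names may differ
    # in other exports.
    tweets = pd.read_csv("tweets.csv", dtype=str)

    print("tweets:", len(tweets))
    print("with an @:", tweets["text"].str.contains("@", na=False).sum())
    print("retweets:", tweets["retweeted_status_id"].notna().sum())
    print("replies:", tweets["in_reply_to_status_id"].notna().sum())
    print("with a #:", tweets["text"].str.contains("#", na=False).sum())
    print("clients used:", tweets["source"].nunique())

    # Accounts mentioned, and how many crop up more than 20 times
    mentions = tweets["text"].str.findall(r"@(\w+)").explode().dropna()
    counts = mentions.str.lower().value_counts()
    print("accounts mentioned:", counts.size)
    print("mentioned more than 20 times:", (counts > 20).sum())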

My tweets

Here’s a word cloud of my tweets – 2006-2016…

[Image: word cloud of my tweets, 2006-2016]
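
For anyone who wants to roll their own, here is a minimal sketch along the same lines, again assuming the archive’s tweets.csv text column (the original image may well have come from a different tool):

    import pandas as pd
    from wordcloud import WordCloud

    tweets = pd.read_csv("tweets.csv", dtype=str)
    text = " ".join(tweets["text"].dropna())

    # The wordcloud library strips common English stopwords by default.
    cloud = WordCloud(width=1200, height=600, background_color="white").generate(text)
    cloud.to_file("tweet-cloud.png")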

… and finally the 14 times I tweeted just a single word…

  • Annotating
  • Walking
  • Docked
  • roflysst
  • Haircut!
  • CDG3
  • LBIA
  • Snow!
  • Asleep
  • Awake
  • Wrapped
  • Wrapping
  • North
  • Baclava!

The experiment continues.

An Experiment on a Bird in an Air Pump by Joseph Wright of Derby, 1768

In shared light: why making things visible makes things better

“In Elizabethan amphitheatres, like the 1599 Globe Theatre, performances took place in ‘shared light’. Under such conditions, actors and audiences would be able to see each other… This attention to a key original playing condition of Shakespeare’s theatre enables the actors to play ‘with’ rather than ‘to’ or ‘at’ audiences. Actors therefore develop their ability to give and take focus using voice, gesture and movement.” — Emma Rice to Step Down From London’s Shakespeare’s Globe, Playbill, Oct 25, 2016

Phil Glockner, the Original Starbucks (some rights reserved)

Early, too early, one morning I blunder into a railway station Starbucks for a coffee and croissant to take onto the train. I’m the only customer. I place my order and shuffle along to the end of the counter where the barista will hand down my drink.

What happens next in the customer experience is critically important. We know that Starbucks knows this too, because of a leaked 2007 memo from chairman Howard Schultz, in which he bemoaned the commoditisation of his brand:

“For example, when we went to automatic espresso machines, we solved a major problem in terms of speed of service and efficiency. At the same time, we overlooked the fact that we would remove much of the romance and theatre that was in play with the use of the La Marzocca machines. This specific decision became even more damaging when the height of the machines, which are now in thousands of stores, blocked the visual sight line the customer previously had to watch the drink being made, and for the intimate experience with the barista.”

As I said, it was early, much too early for an intimate experience with a barista. And in any case, the barista was still learning the ropes. I guess first thing on a shift, when there’s one customer and no queue, is a great time for some coaching from the supervisor. This is what I heard him say:

“You have 23 seconds for the milk… Oh, and relax. You can’t concentrate when you’re stressed.”

23 seconds! That’s what removed the romance from my coffee.

“You must either make a tool of the creature, or a man of him. You cannot make both. Men were not intended to work with the accuracy of tools, to be precise and perfect in all their actions. If you will have that precision out of them, and make their fingers measure degrees like cog-wheels, and their arms strike curves like compasses, you must unhumanise them.” John Ruskin, The Nature of Gothic

Some things in this carefully commodified service experience were never meant to be seen by the customer. When they do burst into view, it feels wrong, uncanny.

In this post I want to explore the reasons for that uncanniness, and how we might play with it to develop new service opportunities. Is it really so obvious what should and should not be visible to the user? What’s the impact on users when a component slips out of sight? And how might we make service better by keeping more things, more visible for longer?

The line of visibility

JP Swizzlespokes, Experience Design wk05 #whiteboard (some rights reserved)

The line of visibility is a well-known concept in the fields of customer experience management and service design. To use, like Howard Schultz, a theatrical metaphor, it divides the service blueprint into front-stage activities seen by the customer, and back-stage ones unseen by the customer but nonetheless essential to the delivery of the service.

In the coffee shop:

  • Front-stage: the theatre and romance of taking the order, writing the customer’s name on a cup, grinding the beans, making the coffee, presenting the coffee to the customer
  • Back-stage: the operational efficiency of managing rosters, training staff, timing operations, replenishing stock, and so on.

At first glance, the allocation of activities to front or back-stage appears uncontroversial. In reality, it is much murkier, and deserves more critical attention:

  • A restaurant might make a show of fresh food preparation with an open kitchen on full view to the diners, but still have a room behind the scenes for the freezers and dishwashers.
  • Recently, after returning a hire car, I was given a lift by a new member of staff. The conversation we had about the rental company’s graduate scheme made me warm to the company and more likely to return.
  • Much has been written about the 8 simple words on the underside of the machine on which I’m typing this now: ‘Designed by Apple in California. Assembled in China’.

Visibility and the value chain

I’ve been thinking about visibility in the context of whole value chain maps. In his mapping technique, Simon Wardley arranges components from the most visible user needs at the top to the unseen at the bottom:

A value chain – wardleymaps.com

In this interpretation, visibility is said to recede as we traverse the network – the more “hops” away from the customer, the less a component needs to concern them. But is that really true?

Invisible things can have very visible effects. Amazon’s recommendation engine is deeply buried in the company’s infrastructure, yet customers experience its insights and biases every time they use the site.

Visible things may get up to all sorts of unseen activities. What if that camera or video recorder in the corner is participating in a distributed denial of service attack right now?

Invisibility and commodification

barcoded wood

Why is it that some things naturally seem to merit visibility while others have to hide themselves from view?

I think it has to do with commodification. To turn something into a commodity is to take it out of its context, to make it fungible so that it can be substituted, traded and transferred. In an example by the philosopher Andrew Feenberg:

a tree is cut down and stripped of its branches and bark to be cut into lumber. All its connections to other elements of nature except those relevant to its place in construction are eliminated.

This is what people are doing when they commit metaphorical sleights of hand such as “data is the new oil”. They take something that has deep meaning to an individual and, by aggregation, transform it into something that can be traded without further challenge or debate.

The logic of commodification prohibits the end user from interest in, or influence over, anything but the surface-level components. Before we know it, any breach of the line of visibility feels illegitimate. From Fairtrade foodstuffs to the employment rights of Uber drivers, demands to deepen visibility into the supply chain come to be seen as “political” incursions in the supposedly rational domains of technological production and economics.

Consider the much-maligned EU cookie directive.

Unregulated, the behemoths of the attention economy would place all their tracking of users below the line of visibility. “Users don’t need to know about that stuff,” they’d say. “It’s technical detail. Nothing to worry about. Move along now.” The Jobsterbedunners might even hold up web users’ continued browsing of sites in such compromised circumstances as some kind of “revealed preference” for covert tracking.

But people who care about privacy have a different opinion on where the line should be drawn. Their only option is a “political” intervention to drag the publicity-shy cookie blinking over the line of visibility. Now Europe’s internautes can take back control, every time they visit a website. Say what you like about the implementation, but we Brits will miss those privacy protections when they’re gone.

Shared light

What if there were another way to realise value? One that didn’t depend on enclosing the value chain by making it opaque to end users?

To Feenberg, decontextualisation is “primary instrumentalisation”, the first part of a two-step process:

The primary instrumentalisation initiates the process of world making by de-worlding its objects in order to reveal affordances. It tears them out of their original contexts and exposes them to analysis and manipulation while positioning the technical subject for distanced control…

But the story doesn’t end there. There’s a crucial, secondary step where visibility has to be re-established:

At the secondary level, technical objects are integrated with each other as the basis of a way of life. The primary level simplifies objects for incorporation into a device, while the secondary level integrates the simplified objects to a social environment.

Through this secondary instrumentalisation, this resource integration, users tell us what they want technology to be. Think, for example, of the camera-phone as a concept worn smooth by countless buying and use decisions over the course of a decade. This part of the value creation process cannot happen in strategy and planning; it can only happen in use.

Premature commodification would close down such possibilities just when we ought to be keeping our options open. Co-creation, on the other hand, places the service user, the service designer, and the service provider on the same side – and all of us play in all those positions at one time or another.

We maximise value when the interests of all the actors are aligned, when asymmetries of knowledge between them are reduced. To borrow another controversial theatrical analogy, co-creation flourishes in “shared light” when actors and audiences can see each other equally.

  • Not only do we see the coffee being made, we see the staff being trained.
  • We are no longer passive recipients of the recommendation algorithm; we can understand why and how it behaves.

Some service design patterns

Here are just some of the patterns that play with the line of visibility. By making things visible, they make things better.

Seeing over the next hill: We meet much of the most valuable service when facing a change or challenge for the first time. But unless we know what to expect, it’s hard for us to make decisions in our best interests, or to trust others seeking to support us. Deliver service so that people can always see over the next hill, so they know what to expect, what good looks like, and who they can trust to help them along the journey.

Provenance: People can take reflective pride in where their things come from – and be repulsed by a supply chain’s dirty secrets. Design like they’re watching. Document the journey and make it part of the service. My Fairphone may have been a little pricier than an equivalent smartphone, but it comes with a story of fair materials, good working conditions, reuse and recycling.

Individualisation: Service is intrinsically full of variation. When we treat its delivery like factory mass production, we make it inflexible, unresponsive, and ultimately destructive of value. Anticipate variation, embrace it and celebrate it. This will likely mean fewer targets and processes, and more self-organising, empowered teams. Be like homecare organisation Buurtzorg, which prioritises “humanity over bureaucracy” and “maximises patients’ independence through training in self-care and creation of networks of neighbourhood resources.”

A last word from actor-network theorist Michel Callon in his afterword to Feenberg’s ‘Between Reason and Experience’:

“Keeping the future open by refraining from making irrevocable decisions that one could eventually regret, requires vigilance, reflection, and sagacity at all times. Politics, as the art of preserving the possibility of choices and debate on those choices, is therefore at the heart of technological dynamics.”

“Evolution. What’s it like?” The three lives of the front-facing camera

“Evolution. What’s it like? So one day you’re a single-celled amoeba and then, whoosh! A fish, a frog, a lizard, a monkey, and, before you know it, an actress.
[On-screen caption: “Service limitations apply. See three.co.uk”]
I mean, look at phones. One, you had your wires. Two, mobile phones. And three, Three video mobile.
Now I can see who I’m talking to. I can now be where I want, when I want, even when I’m not. I can laugh, I can cry, I can look at life in a completely different way.
I don’t want to be a frog again. Do you?”

— Anna Friel, 3 UK launch advert, 2003

Today, in 2016, that ad feels so right, and yet so wrong. Of course phones have changed massively in the intervening decade-and-a-bit — just not how the telecoms marketeers of the early Noughties fantasised. In this post I want to trace what evolution of technology might really be like. I’ll do it by following the unstable twists and turns around one small element of the construct we now call a smartphone.

Something was missing from the Anna Friel commercial. All the way through, the director was at pains to avoid even the tiniest glimpse of something the audience was eager to see. You know, a phone. At the time I worked for Three’s competitor Orange, whose brand rules also forbade the appearance of devices in marketing. The coyness was partly aesthetic: mobiles in those days were pig-ugly. Moreover, the operators had just paid £4 billion each for the right to run 3G networks in the UK. They wanted consumers to think of the phone as a means to an end, a mere conduit for telecommunications service, delivered over licensed spectrum.

To see a device in all its glory, we must turn to the manufacturer’s literature. Observe the product manual of the NEC e606, one of three models offered by Three at its launch on 3 March 2003:

NEC e606 product manual

Notice where a little starburst has been Photoshopped onto the otherwise strictly functional product shot? That’s the only tangible hint of the phone’s central feature, the thing that makes it worth buying despite being pricier and weightier than all the other matte grey clamshells on the market. By this point, loads of phones have digital cameras built in, but they are always on the back, facing away so the holder can use the tiny colour screen as a viewfinder. This is something different: a front-facing camera. It exists so that Anna Friel can see who she is talking to.

Let’s map* this network.

[Image: maps3.001.png]

Loosely, the vertical axis answers the question “how much do users care about this thing?” The nearer the top, the more salient the concept. The horizontal axis concerns the stability of the concept – the further to the right, the less controversial. But at this point the choice of nodes and the connections between them matters more to me than their precise placement. This forms an actor-network – a set of concepts that belong together, in at least one contested interpretation. (There’s a rough code sketch of this network after the list below.)

  • Phone calls are over on the top right, a very stable concept. Users understand what phone calls are for, know how to access them, and accept that they cost money.
  • If the operators can persuade users to add pictures, to see who they’re talking to, they have a reason to sell not just plain old telephony service but 3G, that thing they’ve just committed billions of pounds to building. Cue the front-facing camera.
  • Video calling and 3G cellular networks rely on each other, but both are challenged. Do users really need them? Will they work reliably enough to be a main selling point for the device? Whisper it softly, “service limitations apply”.
  • Because of this weakness, the assemblage is bolstered by a less glamorous but more stable concept – asynchronous video messaging. This at least can be delivered by the more reliable and widespread 2.5G cellular. Users don’t care much about this, but it’s an important distinction to our network.
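
A map like this is really just a handful of nodes and links with rough positions on those two axes. Here’s a hedged sketch, not the tooling behind the image above, with the coordinates and edges as my own illustrative guesses:

    import matplotlib.pyplot as plt

    # Each concept gets a rough (stability, salience) position on a 0-1 scale.
    nodes = {
        "phone calls": (0.9, 0.9),
        "front-facing camera": (0.4, 0.75),
        "video calling": (0.35, 0.6),
        "video messaging": (0.6, 0.3),
        "3G cellular": (0.3, 0.2),
        "2.5G cellular": (0.8, 0.15),
    }

    # Which concepts rely on which
    edges = [
        ("phone calls", "video calling"),
        ("video calling", "front-facing camera"),
        ("video calling", "3G cellular"),
        ("video messaging", "front-facing camera"),
        ("video messaging", "2.5G cellular"),
    ]

    fig, ax = plt.subplots()
    for a, b in edges:
        (x1, y1), (x2, y2) = nodes[a], nodes[b]
        ax.plot([x1, x2], [y1, y2], color="grey", linewidth=1)
    for name, (x, y) in nodes.items():
        ax.scatter(x, y, color="black")
        ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5))
    ax.set_xlabel("stability of the concept (less contested to the right)")
    ax.set_ylabel("salience to users")
    plt.show()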

What then remains for the telco executive of 2003 to do? Maybe just wait for the technology to “evolve”?

  • More 3G base stations will be built and the bandwidth will increase
  • Cameras and screens will improve in resolution
  • People will take to the idea of seeing who they’re talking to, if not on every call, then at least on ones that really matter.

All these things have come to pass. But could I draw the same network 10 years later with everything just a bit further over to the right? No, because networks come apart.

Nokia’s first 3G phone, the 6630, had no front-facing camera. Operators used their market muscle and subsidies to push phones capable of video calling. Yet many of the hit devices of the next few years didn’t bother with them. The first two versions of the Apple iPhone likewise. Even the iPhone 3G was missing a front-facing camera. Finally, in 2010, the operators had to swallow their pride and market an iPhone 4 with Apple’s exclusive FaceTime video calling service that ran only over unlicensed-spectrum wifi.

This is the social construction of technology in action. Maybe evolution is a helpful metaphor, maybe not. Whatever we call it, this is the story of how, over the course of a decade, by their choices of what to buy and what to do, users taught the technology sector what phones were for. Hint: it wasn’t video calling.

Just when we think the front-facing camera is out of the frame, it makes a surprising comeback. This time it’s not shackled to either video calls or mobile messaging. Instead it emerges as a tool of self-presentation in social media.

Some rights reserved – Ashraf Siddiqui

“Are you sick of reading about selfies?” asks an article in The Atlantic, announcing that selfies are now boring and thus finally interesting. “Are you tired of hearing about how those pictures you took of yourself on vacation last month are evidence of narcissism, but also maybe of empowerment, but also probably of the click-by-click erosion of Culture at Large?” Indeed, for all its usage, the term — and more so the practice(s) — remain fundamentally ambiguous, fraught, and caught in a stubborn and morally loaded hype cycle.

‘What Does the Selfie Say? Investigating a Global Phenomenon’, Theresa M. Senft and Nancy K. Baym

Time for another map.

[Image: maps3.002.png]

  • By 2013, 3G (now also 4G) cellular mobile is no longer in doubt, but its salience to users is diminished. It is a bearer of last resort when wifi is not an option for accessing the Internet.
  • The lynchpin at the top right is not the phone call but social media, with its appetite for videos and photos. In their service, we find the front-facing camera, though now rarely used for calling.
  • Only a fraction of selfies even leave the phone. Many of them are shared in person, in the moment, on the bright, HD screen. They are accumulated and enhanced with storage and processing powers that barely figured on the phones of 2003.

Call it evolution if you like, this total dissolution and reassembly of concepts.

We’re not done yet. Here’s another commercial for your consideration. One for the Samsung Galaxy S4 mapped above. Can you spot the third incarnation of the front-facing camera?

Man 1: “Hey, sorry I was just checking out your phone. That’s the Galaxy S4, right?”
Man 2: “Yeah, I just got it.”
Woman: “Did your video just pause on its own?”
Man 2: “Yeah it does it every time you look away from the screen.”
Man 1: “And that’s a big screen too.”
Man 2: “Yeah, HD.”
Man 3: “Is that the phone you answer by waving your hand over it?”
Man 2: “Yeah.”
Man 1: [waves hand over Man 2’s phone] “Am I doing it right?”
Man 2: “Someone has to call you first…”

Samsung Galaxy S4 TV advert, 2013

See how far a once-secure concept has fallen? The guy needs reminding (in jest at least) how phone calls work! Compared to the 3G launch video, this scene is more quotidian; the phone itself is present as an actor.

And what is the front-facing camera up to now? Playing stooge in the S4’s new party trick: the one where the processor decides for itself when to pause videos and answer calls. If the user never makes another video call or takes another selfie, it’ll still be there as the enabler of gesture control. Better add that to my map:

[Image: maps3.003.png]

We used to think the phone had a front-facing camera so we could see each other. Then it became a mirror in which we could see ourselves. Now, it turns out, our phones will use it so they can observe us.

Maybe that’s what evolution is like.


* These maps are not Wardley value chain maps though I see much value in that technique. More on that in a later post.

The quick and the dead, or 6 things that change when your service goes live

Some of the organisations I work with are just starting out on this digital transformation thing. More and more of them, however, have been at it for quite some time. After 2, 3, even 4 years, a delivery process of discovery, alpha and beta is well embedded, in parts of the organisation at least.

Now I’m seeing more of the next struggle. I think it feels hard because, while alpha and beta can be treated as phases of service development, being live affects the whole organisation. This post is a first go at answering an existential challenge for digital specialists: what does it mean to go live?

1. It’s… alive!

No metaphor is wholly adequate. But it’s fair to say that accounts of organisational life have shifted over the last decades from the mechanical analogies of Taylorism to natural and biological ones. There’s less talk of levers and gears, more of evolution and growth.

What these analogies capture, and the machine-age ones miss, is the sense of aliveness. “Going live”, like Frankenstein’s monster, means crossing a threshold from being a well-assembled collection of parts to a sensing, thinking, adapting being in its environment.

There’s a quiet focus that comes from seeing serious numbers of people accessing your service right now. Digital teams make user activity visible. They fight hard to stay together for their service after crossing into live.

As Kit Collingwood-Richardson puts it, going live is like having a baby, with a whole future of parenting stretching ahead. “Go live is the start”.

2. Time in reverse

In big organisations, agile development teams and service operations teams can sometimes feel like they’re on different planets. But I reckon they have something important in common: a healthy focus on the here-and-now. As Mat Wall says, agile is basically: “What do you want by Friday? And how can we make it better than last week?” Both those questions would be familiar in any high-performing front line service team. In a live organisation this common focus for development and operations becomes a powerful unity of purpose.

Agile development and operations both occupy what the sociologist Anthony Giddens describes as the temporal existence of “durée” – performing routinely, but with the possibility of change in every repetition. To Giddens, this is only the first of three sorts of time:

  • durée of day-to-day experience
  • life span of the individual
  • longue durée of institutions

The middle layer – the life span of the individual – is “irreversible time”. Its arrow goes in only one direction, and I have the grey hairs to prove it. This is where we find dedicated change management, the top-left to bottom-right sweep of the Gantt chart.

In contrast, day-to-day time and institutional time – the latter a commitment stretching out indefinitely – are “reversible”. They always have the possibility to do over, to do differently, to do better.

To go live is to adopt a different attitude to time. We’re no longer burning down towards a deadline. We have embraced changefulness as a daily habit, supported by a long-term structure. We are committed to be here every day, as long as it takes, as long as there are people to be served. Any service that lacks this habit and structure isn’t live, it’s dead already.

3. Discovery is a culture, not a phase

Service lifecycle – Government Service Design Manual

Despite the recycling symbols, the service lifecycle drawn left-to-right can look a bit, well, waterfall-y. Discovery leads to alpha, as alpha leads to beta, and beta leads to live in resolutely irreversible time.

In particular, the distance between discovery and live seems to me wholly misleading. After all, live, running service affords all sorts of discovery possibilities that wouldn’t otherwise exist.

I know I’m not the only one who has tried redrawing this picture. I’ve tried drawing it as a stacked bar chart, as a circle, as a Möbius strip, and I’ve ended up with this…

[Image: dabl dna.png]

In a sufficiently advanced organisation, discovery is a culture, not a phase. Intertwined live service and discovery continually fulfil and refine the purpose of the organisation.

  • Curious about something that’s showing up in the web analytics? Go and do some user research!
  • Hearing something new from customers in research? Go and see for yourself what is happening on the front line!

The odd ones out in this picture are alphas and betas – the phases where early-stage digital transformation organisations probably spend most of their time and attention.

Don’t think of (capital “D”) Discovery as something we do to prepare for the (definite article) Alpha Phase; think of an alpha version as one potential response to a new discovery. Alphas and betas are just tactical things we can make to bridge discoveries back into live service.

4. The strategy is (continuous) delivery

The discovery|live duality respects no boundaries between strategy, “change” and operations. Instead of clumsily executing planned but discontinuous change, the live organisation is always sensing and responding, making work visible, and reflecting frequently on how to do better.

Going live demands, in the words of David Marquet, that we move the authority to where the information is:

  • Policy advisors, strategists and designers can accomplish their work more effectively and at greater pace because they have very frequent contact with the realities of everyday service delivery
  • Everyone who delivers service has the power to make better decisions, multiple times per day; they must be trusted to take decisions that are aligned to the organisation’s purpose and priorities
  • Process changes are no longer pushed to workers on the front line; instead they frequently pull in change based on the demands they experience in contact with customers
  • Colleagues, suppliers and customers work together in a spirit of productive informality
  • Everyone becomes – to a greater or lesser degree – a service designer.

This is what I meant when I said learning by doing was the last target operating model you’d ever need.

5. All users become co-creators

Going live is often seen as only affecting the people inside an organisation. After all, users shouldn’t have to care what labels we use internally. Nevertheless, it should feel different to be a user of a live organisation.

Sure, your digital team may be highly user-centric already. You frequently engage users in defined, intentional activities – research visits, usability studies, private beta versions and so on. But to the majority of users, big organisations still appear unfeeling, inert and unresponsive.

When service is genuinely live, every interaction with users is an opportunity for new learning. Because the organisation is alive, it can sense people’s needs and adapt itself to meet them. Users become everyday co-creators of service. They learn to be more demanding, and to expect frequent change.

That’s when the fun really starts: when users realise that service can adapt to fit them. They begin to bring more than just their needs. They bring their unique capabilities to be combined with emergent competencies of the organisation. The “so that” line in the user story template comes into its own. We lift our sights from a deficit-based view of user needs to an asset-based vision of human potential. In live service, customer relationships are an endless source of ideas and innovation.

6. The new high score

Arbitrary double standards between capital and operational spending can easily bend organisational priorities out of shape. Agile abhors upfront spending divorced in time from actual customer value. Yet this is precisely what common accounting conventions reward. We need to change the high score.

Ironically the knowledge organisation’s most valuable assets are often its least visible. The conscious competence learning model presents “unconscious competence” as the apex of a pyramid of skill. Having begun in blissful ignorance, learners first become conscious of their own incompetence. They must go through a stage of consciously improving their competence until it comes so naturally that they can do it without noticing. But if we don’t notice the stuff we’re best at, there’s a risk we’ll undervalue it.

So the live organisation needs a new kind of balance sheet, one that deprecates unnecessary inventory and investments. Instead it recognises its most valuable asset: the growing skills, knowledge, networks and confidence of customers, workers and suppliers alike.

Live is when real digital transformation begins. It marks a radically different way of managing everyday work, and a new culture of continuous discovery. It will flatten decision-making structures, and transform passive users into active co-creators. The ways we measure and account for success will be different. But the potential payoffs are huge.

Go live. I dare you.

 

And yet it moves! Digital and self-organising teams with a little help from Galileo


This summer, after a lovely 2-week holiday in Tuscany, I returned to Leeds and straight into a classroom full of government senior leaders discussing agile and user-centred design. Their challenges set me thinking once more about the relationship between technology and social relations in the world of work. One well-known story from the Italy of 400 years ago is helping me make sense of it all.

Galileo’s sketches of the moon

1. Magnification

Galileo Galilei did not invent the telescope but he greatly improved it, reaching more than 20x magnification and pointing it for the first time at the seemingly smooth celestial bodies of the night sky. In March 1610, he published drawings of the universe as never seen before. What seemed to the naked eye a handful of constellations appeared through Galileo’s telescope as thousands of teeming stars. He showed the moon pocked with craters, mountain ranges and plains. He used his observations and calculations of the planets to support a long-held but never proven conjecture: that the earth and the other planets travel around the sun.

With its twin, the microscope, the telescope was a transformative technology of Galileo’s age, affording new ways of seeing things that people thought they already knew well. Our tools are the smartphone and the web. They too change how we see the world in many ways. Most of all they shed new light upon, and throw into relief, the detail of the social. Minutiae of conversations and interactions that used to occur fleetingly in private before disappearing into thin air can now be shared, stored and searched in previously unimaginable ways.

So let’s focus our gaze upon the world of work. (I am not the first to draw this parallel. Steve Denning writes eloquently about what he calls the “Copernican Revolution In Management”.) In a pre-digital era, organisations appeared to be made of smooth reporting lines, opaque meeting agendas and crisp minutes. Now the wrinkles and pits of communication and interaction are exposed in detail for all to see – every email, every message, every line of code.

Digital communications facilitate, magnify and expose people’s timeless habits of co-operation. These social phenomena are not new. It’s just that, until recently, indicators of productive informality were hidden from view. In the absence of evidence, we focused more attention, and founded our theories of management, on things that were immediately obvious: explicit hierarchies and formal plans.


Now by observing the details, we can confirm a long-held theory: that self-organisation is rife in the workplace. The new communications tools reveal…

  • the human voices of individuals and interactions in Slack groups, wikis and code repositories
  • the depth of customer collaboration in Twitter replies and support forums
  • the endless resourcefulness of teams responding to change in Trello boards and live product roadmaps.


We should be careful not to over-claim for this shift. As a student of history and the social sciences, I am instinctively suspicious of any narrative which has human nature suddenly change its spots. I come to bury mumbo-jumbo, not to praise it. I reject the teal-coloured fantasy of Frederick Laloux’s “next stage of human consciousness.” More likely the behaviours Laloux identifies have always been with us, only hidden from view. Future generations may judge that we are living through a paradigm shift, but such things can only be confirmed after the fact.

2. Empiricism

The day after Galileo’s publication, the stars and planets carried on doing their thing, much as they had for the billions of days before. After all, heliocentrism was not even an original idea. Aristarchus of Samos had proposed it in the 3rd Century BC; Islamic scholars discussed it on and off throughout the middle ages; and Nicolaus Copernicus himself had revived it more than 20 years before Galileo was born. In one way, nothing had changed. In another, everything had changed. As with another famous experiment – dropping different objects from the Leaning Tower of Pisa to test the speed of falling bodies – Galileo was all about empiricism. He did not ask whether a proposition was more elegant to the mind’s eye or more convenient to the powerful. He designed tests to see whether it was true.

The Manifesto for Agile Software Development is itself an empirical text, founded in the real-world experiences of its authors. It begins (my emphasis): “We are uncovering better ways of developing software by doing it and helping others do it.” The authors set out four pairs of value statements in the form “this over that”, stressing “that while there is value in the items on the right, we value the items on the left more”.

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

These were the values of 17 balding and bearded early Noughties software professionals who gathered at the Snowbird ski resort in Utah. It would be easy to mistake the manifesto for a creed – a set of assertions that true followers must accept as gospel. But they’re not that at all. This is not a religion. Empiricism says we have the power to see for ourselves.

In scores of learning and development sessions over the past couple of years, my associates and I have conducted a little experiment of our own. This is the method:

  • Without sharing the text of the manifesto, we hand out eight randomly ordered cards each showing a different value statement – “contract negotiation”, “working software”, “following a plan” and so on.
  • Then we ask participants to rank them in the order that they would value when delivering a service.
  • There are no right or wrong answers. We just want to understand what they value.

The result: 90% of the time the items on the left bubble to the top of the list – regardless of participants’ roles and experiences. Of course many project managers say they value “following a plan”, but most of them value “responding to change” more highly. I had a couple of contract managers on one course. They ranked the “contract negotiation” card pretty high up their list. But they put “customer collaboration” at the top.

When people recall their best experiences at work, the things they describe are invariably the things on the left. For the ones who have been around big organisations for 20 years or more, they often speak in terms of “that’s how we used to do things” – before the so-called professionalisation of “information technology” tried to replace trust and teamwork with contracts and stage gates. For others there are more recent stories of emergencies and turnarounds when everyone pulled together around a common cause and just got stuff done in an amazingly productive, naturally iterative rhythm.

3. Reaction

From the time of Copernicus in the 1540s until Galileo’s work in the 1610s, Catholic Church leaders were mostly comfortable with heliocentricity. While Copernicus’ propositions remained “just a theory”, they were interesting but unthreatening. But Galileo’s evidence, his assertion of empiricism over the authority of Aristotelian ideas, provoked a backlash. They accused him of heresy and threatened him with torture until he solemnly recanted his view that the earth moved round the sun. This he did, though allegedly muttering under his breath, “And yet it moves.”

That’s the thing about this set of propositions we call “agile”, or “lean”, or “post-agile” or whatever. Often we contrast these with something called “waterfall” as if they were equally valid, alternative ways of getting things done. I think that’s a mistake. They’re not things we pick and choose, any more than Galileo chose to make the earth travel round the sun. Agile and waterfall are alternative theories of how things get done – how things have always got done.

Digging a little into the history, it turns out that “waterfall” was never meant to be taken literally:

“Dr Winston Royce, the man who is often but mistakenly called the “father of waterfall” and the author of the seminal 1970 paper Managing the Development of Large Software Systems, apparently never intended for the waterfall caricature of his model to be anything but part of his paper’s academic discussion leading to another, more iterative version.” – Parallel Worlds: Agile and Waterfall Differences and Similarities

But when people feel threatened by new ideas, there’s a risk, as happened with astronomy, that they back further into their corner and end up espousing more extreme views than they would have held if left unchallenged.

Some who attribute their successes to top-down command-and-control management may fear they have a lot to lose from the growing evidence base for self-organisation. We need to find unthreatening ways to talk to the small group of people – in my experience less than 10% – for whom the values of the left-hand side do not spring naturally to the top of the list.

Coexistence is possible. Equivalence is not. Many religious believers, for example, manage to square their faith in a divine creator with the iterative circle of Darwinian evolution. What’s not credible, though, is a like-for-like, pick-and-mix approach to agile and waterfall. Nobody argues for evolution of the flea and creation of the elephant, because one of these is an account based on empiricism, the other an appeal to authority.

4. Conclusion

It took more than a century for the Catholic Church to overcome its aversion to heliocentrism. Meanwhile scientists in the Protestant world continued to circulate and build on Galileo’s findings. Remember Isaac Newton: “If I have seen further, it is by standing on the shoulders of giants.” The books of Copernicus and Galileo were finally removed from the Church’s list of banned books in 1835.

If the last few years of domestic and international affairs have taught us anything, it should be that the arrow of progress can go backwards as well as forwards. Rightness and rationality can easily lose out to conflicting interests. If we believe there’s a better way, then it’s down to every one of us to model that better way, in how we work, and how we talk about our work. We can do this by:

  • working out loud to make our collaboration visible and legible
  • collecting and sharing evidence of self-organisation in action
  • resisting mumbo jumbo with simple, factual accounts of how we get stuff done
  • accepting coexistence with other theories but never false equivalence.

In 1992, Pope John Paul II expressed regret for how the Galileo affair was handled. But plans to put a statue of the astronomer in the grounds of the Vatican proved controversial, and were scrapped in 2009.

#DearestEngland, some signals so far


For the last three days I’ve been running an experiment – a minimal, digital-only intervention, just to test the waters and see if it’s worth investing further time and effort.

It was planned whichever way the Referendum went, but the fact that the call for contributions went live around the time England awoke to the results has given Dearest England extra poignancy.

Having seen an underwhelming response to last year’s appeal for Dearest England letters, I wanted to test some hypotheses:

  1. That this time around, people would be ready to talk about the future of England
  2. That, as it has in Scotland, this open, independent initiative could provide a valuable space for personal reflections and collaborative creations.

It’s early days, but these are my reflections so far…

We have plenty to say.

Regardless of the #DearestEngland hashtag, my Twitter feed would have been full of conversation and reaction to the Referendum result. Richard Pope’s exceptional “Dear England” post stands out as the kind of call to action I hope to see more of:

a shared patriotic, progressive mission building something new, to redesign how we run our democracy for the 21st century.

To build a better nation, we’ll need more than 140 characters.

Twitter today is stuffed full of screenshots – images of Facebook posts, blog comment threads, newspaper articles, phone screens. Accessibility nightmare they may be, but they speak of a need to say more than can fit in a single tweet. Usually I love the character limit, the Orwellian clarity and brevity it enforces. But clarity and brevity are built on shared norms and frames of reference, both of which suddenly seem to be in shorter supply than we thought just a few days ago. The format of the letter seems apt: long enough to say what you mean, short enough to be digestible at scale.

We need to talk about the future – but maybe we’re not ready for that just yet.

People are still digesting what just happened, adjusting to the realisation of a deeply divided country. Dearest England needs to be there for the next phase, when more of us are ready to write explicitly to the future. That phase needs to come soon. We don’t have much time to shape what happens next.

We need to talk about England – but that won’t be easy either.

Dearest Scotland and Dearest India both proudly go by the tagline “Nation | Vision | Voice”. That didn’t feel right for an entity as elusive as England. It is surely telling that, mid-Euro 2016, “Dearest England” could be so easily construed as addressing a football team. But it feels to me that England is a conversation we have to have, and that having it will not in any way diminish other, overlapping identities.

 We need to talk about our place in the World.

There’s a strong feeling that conversations about England’s future should not be insular. Physical geography may make Great Britain an island, but the social construct of England exists only in relation to other peoples and states. We have to draw in the voices of the many other individuals and nations with whom we interact. Indeed it is those with a dual identity, one foot in England, another in Europe and the World, who feel the present threat most keenly.

We can start the conversation online – but it has to spread beyond the cyber bubble.

Much of the soul-searching in my Twitter feed related to how few opposing voices we heard on social media channels. How could it be that our Facebook and Twitter friends were so unanimous when the country as a whole was so split down the middle? There is something here about algorithms and filter bubbles. We need to make more transparent the relationship between divergent opinions, spam control and monetisation strategies in the social networks we use. That won’t be straightforward.

But maybe digital is one big bubble? This weekend, out of urgency and convenience, I’ve made Dearest England an online only affair. But Dearest Scotland and Dearest India have been much more tangible than that. A truly open and inclusive conversation needs to take place in physical spaces as well as cyber ones. Let there be workshops, pens and paper, physical prototypes, acting-it-out! Partly that’s because not everyone is online. But, more than that, even for those of us who seem glued to our devices, not everything we are is online.

The experiment continues. I would love to see your letters.

  • Start your letter “Dearest England,”
  • Take a photo of it
  • Post it with the hashtag #DearestEngland
  • There are no other rules