Who’d have thought it?
Since the late 18th Century, moral panics have centred on the propensity for industrialisation and financialisation to turn people into machines.
‘You must either make a tool of the creature, or a man of him. You cannot make both. If you will have that precision out of them, and make their fingers measure degrees like cog-wheels, and their arms strike curves like compasses, you must unhumanise them.’ — John Ruskin
But now the Volkswagen emissions scandal lays bare that we stand on the cusp of the opposite peril.
We have trusted machines to perform repeat operations without fear or favour. We could count on them to do the same thing over and over and always render the same results.
Volkswagen’s cars didn’t have a fault in their diesel motors — they were designed to lie to regulators, and that matters, because regulation is based on the idea that people lie, but things tell the truth. — Cory Doctorow
Likewise, corporate brands have been defined by their superhuman consistency and unnatural longevity. Large teams of people with weighty manuals have been devoted to the maintenance of corporate “values”, “tone of voice” and “brand personality”.
That must be why it comes as such a shock to discover that both can exhibit the hitherto exclusively human characteristics of inconsistency, fallibility and mendacity. They think nothing of behaving one way on the emissions test treadmill and another on the open road.
At the corporate level, some of the debate about Volkswagen has focused on governance, on the relationship between innovation and regulation. Can those acting in the public interest ever enforce moral behaviour on private enterprise with systems of numerical targets and controlled-environment tests? No. Given big enough financial incentive and small enough risk of detection, corporations will game those systems. Every last one of them, all of the time.
The novel feature exposed by Volkswagen is that corporations are increasingly playing the game through the medium of software.
VW’s “defeat device” is not a physical device but a programme in the engine software that lets the car perceive if it is being driven under test conditions – and only then pull out all the anti-pollution stops. — The Guardian
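To make the idea concrete, here is a hypothetical sketch of that kind of logic. Volkswagen’s real code has never been published, so every function name, sensor signal and threshold below is invented for illustration; the actual detection reportedly used cues such as steering-wheel position and wheel-speed patterns.

```python
# Hypothetical illustration only: all names and thresholds are invented.

def looks_like_test_cycle(front_wheel_speed_kmh: float,
                          rear_wheel_speed_kmh: float,
                          steering_angle_deg: float) -> bool:
    """On a dynamometer the driven wheels spin while the undriven wheels
    stand still and the steering wheel stays centred."""
    return (front_wheel_speed_kmh > 10.0
            and rear_wheel_speed_kmh < 1.0
            and abs(steering_angle_deg) < 1.0)

def emissions_mode(front_wheel_speed_kmh: float,
                   rear_wheel_speed_kmh: float,
                   steering_angle_deg: float) -> str:
    """Enable full anti-pollution measures only under apparent test
    conditions; otherwise run a reduced, dirtier calibration."""
    if looks_like_test_cycle(front_wheel_speed_kmh,
                             rear_wheel_speed_kmh,
                             steering_angle_deg):
        return "full-controls"
    return "reduced-controls"
```

The unsettling point is how little code this takes: a handful of sensor comparisons is enough to give a machine two faces.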
How will we respond when such potentially malign ingenuity is embedded in every vehicle and household object? An arms race of more targets, tests and regulations will be futile.
Opening the data will get us somewhat further. The wonder of connecting all the things to the internet is that we can see how they are performing day in, day out, in the wild, not just under test conditions. How long before a continuous stream of emissions and mechanical safety data replaces the annual MOT test?
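Such continuous, in-the-wild reporting could be as simple as each vehicle periodically publishing a tamper-evident telemetry record. A minimal sketch, assuming an invented record format and a shared key held by the regulator (nothing here reflects any real scheme):

```python
import hashlib
import hmac
import json
import time

def telemetry_record(vehicle_id: str,
                     nox_mg_per_km: float,
                     odometer_km: int,
                     secret_key: bytes) -> dict:
    """Build one emissions telemetry record, with a keyed hash so a
    regulator can detect after-the-fact tampering with the readings."""
    payload = {
        "vehicle_id": vehicle_id,
        "timestamp": int(time.time()),
        "nox_mg_per_km": nox_mg_per_km,
        "odometer_km": odometer_km,
    }
    body = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(secret_key, body, hashlib.sha256).hexdigest()
    return payload
```

A stream of records like this, day in, day out, would make the test treadmill far less special a place to perform.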
But that will not be enough. The quantitative data will only ever capture an incomplete picture of the many externalities that such machines belch out.
Publishing the code will allow sufficiently skilled and committed auditors to understand behaviour at an atomic level. Surely we must legislate all black-box software out of safety-critical systems.
But can software really encode the most difficult trade-offs that people make in the moment? Trolley problems have been a stock-in-trade for philosophers since the 1960s. They won’t be settled by a C++ function.
The parallel social constructs of corporation and computer have long sheltered behind a convenient fiction: of being neutral technologies, forces of nature, a domain apart from the messy, contested world of human politics. Volkswagen reveals this to be a false division.
Meanwhile we humans have been trained for two centuries in machine-age educational institutions for careers in command-and-control organisations. But we urgently need to make a break for it, back to the exposed flank of behaving like people at work. And we need to get there before our unaccountable creations form an unholy alliance to pull it off better than we do.
We’ll need new kinds of transparency and accountability, ones that recognise the responsibilities of directors, designers and developers as indivisible from those of the companies and code they create.
To the people who run big organisations: we expect you to be explicit about your business models and service blueprints. We need to know whether you see us as your customer, or as just another product in a multi-sided marketplace. We need to be sure that you treat the society in which you operate with respect, not contempt.
To the people exploring new opportunities and refining service through customer development or lean start-up methodologies: a reminder not merely to optimise for the easy metrics. Your success depends on your ability to design for the full diversity of people in this most complicated of worlds.
Think about that Spotify “how to build a Minimum Viable Product” graphic that did the rounds on Twitter a while back…
Basically it’s lean saying screw the externalities. How else could you end up with a road-hogging, obesity-making, fossil-fuel-guzzling car as superior to a bicycle?
And to the people who develop the software: you bear a special responsibility. With each line of code you commit, you pour a little of your soul inside the machine. You cannot do this anonymously; you can never wholly shuffle off this responsibility, even when someone else writes the specification or pays the bill.
Increasing storage capacity will soon make it feasible for even the most deeply embedded system to carry a copy of its entire version history. Maybe the Volkswagen engineers would have paused for thought if they’d known that their names would be openly entwined for eternity in the defeat device’s mitochondrial DNA.
Updated 3/10/2015 to add that brilliant Cory Doctorow quote on regulation