For the third time in the past few months I’m assailed by a survey so shockingly poor that I wonder why the service provider in question has bothered at all.
First it was East Coast trains with a lengthy paper questionnaire about my journey, conducted entirely in mind-boggling forced-choice price/quality trade-offs.
Then came a letter from an Ofsted inspector slipped into my child’s book bag at primary school. “Your views about the school are important to us,” said the letter. The less-than-24-hours’ notice to go online and complete a survey suggested otherwise.
This time, as I log out of my online account, my bank butts in with an entreaty to help them develop new features. Like this one…
Let’s leave aside the dubious value of any question in user research starting “imagine if…” We’ll also charitably disregard the fact that all the bright ideas my bank is asking about have been standard features of their competitors since the days when the Internet sounded like a fax machine.
What really winds me up about this – and the examples before it – is the complete absence of a space to explain or qualify my choices in free text.
The East Coast one went on for 14 A4 pages without so much as a simple text box for me to have my say.
And when Ofsted states…
By sharing your views, you’ll be helping your child’s school to improve. You will also be able to see what other parents have said about your child’s school.
… they don’t actually mean said the way you or I, or a child in Key Stage 1, would understand the word. What they mean is clicked. Only strengths of agreement/disagreement and yes/no answers are permitted.
I’m not suggesting that large-scale, structured surveys are bad in themselves. But I do believe that asking any question without listening properly to the rich, human voice of the respondent does a disservice to surveyor and surveyed alike.
At the organisational level, asking only closed questions runs risks in two directions – gaining false reassurance or prematurely discounting profitable opportunities. In the bank example above, I do indeed value searching and sorting through my transactions, but much prefer to do so in an Excel spreadsheet or separate online personal finance service rather than on my bank’s own website. How am I meant to convey this subtlety in the survey? And how are the bank’s service managers to know this is what I want?
Maybe you think I’m only seeing half of the picture. Perhaps these three organisations also have sophisticated qualitative programmes wide open to unstructured feedback. But statistically speaking, I’m much more likely to be tapped up for ten minutes doing a quick online survey than for participation in an in-depth interview or ethnographic study.
Actually, this makes things worse, not better.
Consider the disempowering message sent to the thousands of travellers, parents and bank account holders on the blunt end of closed choice questionnaires. In signing off those questions, managers have assumed the sole right to structure the terms of conversation with the customers who are surveyed. “We want to know what you think,” they say, “but only so long as it fits within the narrow confines of our pre-existing plans and prejudices.” It’s as if they’ve rolled out the welcome mat to invite you into the conversation, only to snatch it away from under your feet.
Service dominant logic demands a dialogue, a collaborative learning effort between customers and service providers. In their essay ‘Co-creating the voice of the customer’, Bernie Jaworski and Ajay K. Kohli list the following features of a co-creating dialogue:
- Is the conversation end point clear or unclear?
- Do the comments build on those that came before them?
- Is there a willingness to explore assumptions that underlie the dialogue?
- Is the conversation exploratory: no topic is “off-limits”?
- Is there an eagerness for new ideas?
- Do the firm and the customer each shape the structure and content of the conversation?
It’s hard to do any of these things in a smash-and-grab raid to snatch a few data points on a five-point scale.
In 2014, organisations have no excuse for behaving so oafishly.
- If you really need to ask closed choice questions, add an optional space where people can explain or clarify their answers. It shows you’re genuinely listening, not just engaged in a box-ticking exercise.
- Worried you’ll be overwhelmed with more answers than you can read? What a great problem to have. Throw all the answers into a tool like Wordle so you can at least see common terms that crop up time and again.
- Instead of a big survey upfront, try to gather user input a little and often. Ask for micro-feedback at relevant points in the user journey. That way you can adapt your questioning to context and find precisely the users who are grappling with the issues you want to know more about.
- Spread the conversation out through your service design process. Think of every survey as a chance to recruit and screen users for deeper collaboration at the next stage. You may be surprised how many are prepared to give contact details for follow-up discussion on interesting findings.
- Above all, keep an open mind – which is much easier to do when you ask an open question.
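The frequent-terms idea above needn’t even involve a third-party tool like Wordle. A minimal sketch, assuming your free-text answers are already available as plain strings (the example answers and stopword list here are hypothetical, purely for illustration), using only Python’s standard library:

```python
import re
from collections import Counter

# Hypothetical free-text survey answers (illustrative, not real responses)
answers = [
    "I'd rather export my transactions to a spreadsheet",
    "Let me export transactions to my own finance tool",
    "The survey gave me no space to explain my answer",
]

# Small stopword list so the recurring substantive terms stand out
stopwords = {"i", "i'd", "my", "me", "to", "the", "a", "no", "let", "own", "gave"}

def common_terms(texts, top_n=5):
    """Count word frequencies across all answers, skipping stopwords."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z']+", text.lower())
                     if w not in stopwords)
    return Counter(words).most_common(top_n)

print(common_terms(answers))
```

Even a crude count like this surfaces terms such as “export” and “transactions” cropping up across respondents, which is often enough to tell you where a follow-up conversation is worth having.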