User testing is always illuminating. The mirrored glass, the dimmed lights, the unreal relay of sound from one room to the next: these things become familiar. But the users, no matter how carefully screened and segmented, are all different, and they make every session both humbling and surprising.
Last week I dropped in on a test of one of our flagship products, running in prototype on a touch screen phone. The sessions I saw went well: no problems using the phone, some encouraging stuff on our product, a few issues, no showstoppers.
But then this…
- The thumb deployed to tap links, to hunt and peck at letters in text input
- The forefinger to slide and drag
- Sometimes even the middle finger to scroll
And since then I’ve been watching how people treat their touch screens – some lovingly, some harshly. And the more I watch, the more I wonder if “touch” is even the right word. More like…
- A stroke screen
- A press screen
- A smear screen
- A stab screen
This amazing, visceral dexterity at once reveals the inadequacy of the previous great user interface breakthrough: that fistful of plastic, the mouse, and its faux-precise on-screen avatar, the pixel-pointed arrow. The four-year-old child who was looking for the mouse behind the TV is now a six-year-old jabbing impatiently at the screen.
Microsoft Word tells me this post has a Flesch-Kincaid Grade Level of 5.1, so to all you 10-year-old mobile designers out there, this pearl of wisdom is for you.
The way we design for these screens needs to change: we must consider not just the size of the screen but the hands with which people hold and control it.
- Are they big hands or small hands?
- Does it work as well with the left as with the right?
- Does this component suggest fingers or thumbs?
In such choices lies the difference between user frustration and user delight.
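One way to put these questions to work on a web prototype is to audit tap targets against a comfortable minimum size. The sketch below is mine rather than anything from the testing sessions: `auditTouchTargets` is a hypothetical helper, and the 44px threshold is an assumption loosely based on commonly cited platform guidance (Apple suggests roughly 44pt, Android roughly 48dp).

```typescript
// Minimal sketch, assuming a browser environment and a 44px comfortable
// minimum for a thumb tap. The threshold and the selector list are
// assumptions, not rules from the post.

const MIN_TARGET_PX = 44; // assumed comfortable minimum for a thumb

interface TargetReport {
  element: Element;
  width: number;
  height: number;
  thumbFriendly: boolean;
}

function auditTouchTargets(root: Document | Element = document): TargetReport[] {
  // Gather the obvious interactive elements on the page.
  const candidates = root.querySelectorAll<HTMLElement>(
    "a, button, input, select, textarea, [role='button']"
  );

  return Array.from(candidates).map((element) => {
    const { width, height } = element.getBoundingClientRect();
    return {
      element,
      width,
      height,
      // Small targets invite the precise forefinger; larger ones suit the thumb.
      thumbFriendly: width >= MIN_TARGET_PX && height >= MIN_TARGET_PX,
    };
  });
}

// Usage: flag anything that probably demands fingertip precision.
auditTouchTargets().forEach((report) => {
  if (!report.thumbFriendly) {
    console.warn("Likely too small for a thumb:", report.element, report.width, report.height);
  }
});
```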
Update 22/08/2010: Nice observations from Dan Saffer of Kicker Studios on Finger Positions for Touchscreens