How I learned to stop wasting my time in usability tests
James leaned back in his chair and rubbed his eyes. He had sat through five usability test sessions today and, frankly, it had been hard going. Duplicate problems had started appearing with the second participant of the day, and James's attention had wandered. By participant 3 he was checking email and by participant 4 he was on Facebook. He knew he should pay more attention during the sessions, but he had convinced himself that he worked best when he reviewed the screen recordings of each session the next day.
It was when James played back the first recording that he knew something was wrong. The screen video had recorded fine, and he could see the participant's face in the picture-in-picture recording. But there was no audio. He checked the mute button on his computer but the volume was up to maximum. He jumped to the next recording, then the next. They were all the same. For some reason the microphone hadn't worked.
He had no idea what his five participants had been saying.
James looked back at his notes and sighed. There wasn't much on that half-page of doodling that was going to help. He could recall a couple of stand-out problems from the first few participants but he knew that wasn't going to be enough for the design team.
What would he put in his report?
The traditional approach to note-taking
If you've ever observed a usability test, you'll know that it's often hard to keep up with the tempo of what's going on. Nothing seems to be happening — and then suddenly a handful of usability problems appear at once. It seems impossible to get them all down: you write down one usability issue, but that prevents you from observing the next problem. You look at the participant — who is now struggling with a different problem — and you wonder how the participant got here and what you've missed. If you have the job of moderating the session and taking notes, it's even more difficult: how can you focus on the participant and take notes?
So it's no surprise that many people think the easiest solution to this problem is to use James's approach: skip the note-taking and just review the recordings, or simply jot down the key issues after each participant has finished.
Although this sounds like the easy solution, it's not. Passively observing multiple usability sessions is tedious, and it gets increasingly difficult to stay focused as you watch subsequent participants. I'm not sure if this is due to being cooped up in a room for a long time, or because subsequent participants run into many of the same problems you've seen earlier, or because some participants simply aren't very interesting. Whatever the cause, you need some way of helping yourself pay attention, and there's only so much coffee you can drink.
Another reason why this isn't a great solution is that your notes end up as a jumble of observations. Review your notes, and you'll find participant opinions ("I prefer a larger search box") mixed up with behavioural observations ("The participant doesn't understand the 'contact' label") and both of those laced with demographic notes ("The participant spends about £30/month on their mobile phone bill").
What is datalogging?
Here's another solution.
As you watch the test, you should note down your observations of the participant's behaviour as single letter codes: an approach known as "datalogging". Datalogging is a kind of shorthand developed by students of animal and human behaviour (if you'd like more background on the method, consult the ethologist's bible, "Measuring Behaviour").
The advantages of datalogging
Datalogging has several advantages:
- When lots of observations come at once, you need only note the observation code — you can then review this part of the session later, in the video recording.
- When scanning your notes, the observation codes make it easy to distinguish one class of observation (e.g. the usability issues) from other observations.
- Datalogging ensures you note all behaviours, not just the ones that stand out (this helps reduce bias in your observations caused by the reconstructive nature of human memory).
- Lightweight documentation like this ensures that rapid, iterative design projects have a record of what drove the current design without the need for formal reporting.
- Datalogging is one of those things you'll be glad you did when, like James found, there are problems with the video recording (e.g. when the sound is poor or when the recording is corrupted).
What to write down
As a rule of thumb, you should average about one observation per minute. But remember this is an average: observations are a bit like buses (none for ages, then three come along at once). For each observation, you should write down:
- The time.
- The class of observation.
- A short description.
An excerpt from a session might look like this:
| Time | Code | Observation |
|------|------|-------------|
| 4:35 | M | Scans nav options but doesn't choose |
| 5:10 | C | "I'm looking for a search box now" |
| 6:07 | X | Doesn't seem to spot the link to the 'search' page |
Where "M" is the short code for "Miscellaneous observation", "C" is the short code for "General comment" and "X" is the short code for "Usability problem".
Here's a list of marker definitions that I use in my tests. This list may seem a bit imposing if you've never done datalogging before; if that's the case, just use 2-3 items in the list until you feel more confident.
| Code | Meaning |
|------|---------|
| X | Usability problem observed |
| D | Duplicate usability problem (one described earlier) |
| V | Video highlight (an "Ah-ha!" moment) |
| C | Comment (general comment by participant) |
| P | Positive opinion expressed by participant |
| N | Negative opinion expressed by participant |
| F | Facial reaction (e.g. surprise) |
| H | Help or documentation accessed |
| A | Assist from moderator |
| G | Gives up or wrongly thinks finished |
| I | Design idea (design insight by logger) |
| M | Misc (general observation by logger) |
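The workflow is simple enough to sketch in a few lines of code: note the time, a one-letter code and a short description for each observation. This minimal Python logger is my own illustration of the approach, not a tool the article describes; the code list mirrors the marker definitions above.

```python
from datetime import datetime

# Single-letter observation codes, taken from the marker list above
CODES = {
    "X": "Usability problem",
    "D": "Duplicate usability problem",
    "V": "Video highlight",
    "C": "Comment",
    "P": "Positive opinion",
    "N": "Negative opinion",
    "F": "Facial reaction",
    "H": "Help or documentation accessed",
    "A": "Assist from moderator",
    "G": "Gives up or wrongly thinks finished",
    "I": "Design idea",
    "M": "Misc",
}

def log_observation(code, note, log=None, now=None):
    """Timestamp one observation and optionally append it to a session log."""
    if code not in CODES:
        raise ValueError(f"Unknown code: {code}")
    # Use the supplied time if given (handy for testing), else the wall clock
    stamp = (now or datetime.now()).strftime("%H:%M:%S")
    entry = (stamp, code, note)
    if log is not None:
        log.append(entry)
    return entry
```

Each session then becomes a list of (time, code, note) tuples, which is exactly the structure you need for the spreadsheet-based reporting described later.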
Technologies to help
All you really need to do datalogging is a digital watch (or an analogue watch with a second hand), a pen and a pad. You'll probably want to print a crib sheet showing the codes you're using, just in case you forget.
But there's some technology to help you too. Todd Zazelenchuk has written a great article describing several datalogging tools. In this section, I'll describe three tools to give you a flavour of how technology can help.
The Livescribe Pulse is part pen, part digital recorder and part scanner. As you write on the special dotted paper, an infrared camera at the tip of the smartpen tracks everything you write, and a built-in digital recorder captures the audio in the room. When you review your notes, you can play back the session by pointing with the pen at the appropriate part of your notes. For example, if you point at the "X" usability observation you made, you'll hear (through mini-speakers in the pen) what was going on in the room as you made that observation. You can also import your notes and recordings into your computer. It sounds like amazing technology, and it is. Unfortunately, the audio recording quality isn't great and the pen itself is fat and slightly awkward to hold. (The consequence for me is that my writing looks messy, and I then find I take fewer notes.)
Morae is the software platform of choice for usability testers and it includes a built-in datalogging tool that timestamps and synchronises your observations with the participant screen recordings. You can search across participant recordings for different observation codes and export the observation codes and any associated notes to Excel. Techsmith has a good video on datalogging with Morae. If you already own Morae, this is the route I'd suggest you take. But if you don't own Morae, you'll find this an expensive route to datalogging.
Most people have a copy of Excel on their computer, and its table-based layout makes it a natural choice for taking notes during a usability test session. It's easy to set up Excel with a column for the time, a column for the class of observation and a final column for your notes. What's less easy in Excel is to timestamp your observations. You can find various macros that will do this, but you may find your company insists you disable macros in Microsoft Office to reduce the risk of viruses. And the latest version of Excel for the Mac does not support macros.
Fortunately, there's a workaround. You simply need to tell Excel to allow circular references and then create a circular formula. Here's an Excel spreadsheet I prepared that you can use to timestamp your observations:
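A sketch of how the circular-formula trick works, assuming you have enabled iterative calculation in Excel's options (the menu path and cell references below are illustrative and may differ between Excel versions):

```
' First enable: File > Options > Formulas > "Enable iterative calculation"
' Column A holds the timestamp; column B holds your observation note.
' The formula in A2 refers to itself, so the timestamp is captured the
' first time you type something into B2 and then stays frozen on recalc:
A2:  =IF(B2="", "", IF(A2="", TEXT(NOW(),"h:mm:ss"), A2))
```

Copy the formula down column A, and each row stamps itself the moment you enter a note next to it.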
How to use your log to write a report in minutes
If you code observations with a program like Excel, you can sort and edit your observations easily. Add a couple of columns to the spreadsheet (like "Possible fix" and "Owner") and it becomes the basis of a bug list for the development team. You'll need to think about the findings, properly interpret the usability issues and develop fixes that will work, but once you've done this you can email this to the development team and you've just created the world's fastest usability report.
Overachievers might want to format the Excel observations so they look polished using Microsoft Word's mail merge feature. Once you've set the template up, this literally takes just a few minutes to produce, and everyone will think you've spent hours on it.
While your report is being read by the development team, old-school usability practitioners like James are just beginning to review the recordings of the session.
You'll find that rapid reporting like this makes you very popular with management and the development team.
About the author
Dr. David Travis (@userfocus on Twitter) holds a BSc and a PhD in Psychology and he is a Chartered Psychologist. He has worked in the fields of human factors, usability and user experience since 1989 and has published two books on usability. David helps both large firms and start ups connect with their customers and bring business ideas to market. If you like his articles, you'll love his online user experience training course.