UX Certification
Get hands-on practice in all the key areas of UX and prepare for the BCS Foundation Certificate.
Observing a usability test seems simple but it's easy to lose focus during a session and record only the dramatic or obvious usability problems. As you watch the test, you should make minute-by-minute observations of the participant's behaviour as single letter codes. Datalogging ensures you note all behaviours, not just the ones that stand out, and provides all you need to quickly create a list of usability issues you can pass to the design team.
James leaned back in his chair and rubbed his eyes. He had sat through 5 usability test sessions today and frankly, it had been hard going. Duplicate problems had started appearing with the second participant of the day, and James's attention had wandered a bit. By participant 3 he was checking email and by participant 4 he was on Facebook. He knew he should pay more attention during the sessions, but he had convinced himself that he works best when he reviews the screen recordings of each session the next day.
It was when James played back the first recording that he knew something was wrong. The screen video had recorded fine, and he could see the participant's face in the picture-in-picture recording. But there was no audio. He checked the mute button on his computer but the volume was up to maximum. He jumped to the next recording, then the next. They were all the same. For some reason the microphone hadn't worked.
He had no idea what his 5 participants were saying.
James looked back at his notes and sighed. There wasn't much on that half-page of doodling that was going to help. He could recall a couple of stand-out problems from the first few participants but he knew that wasn't going to be enough for the design team.
What would he put in his report?
If you've ever observed a usability test, you'll know that it's often hard to keep up with the tempo of what's going on. Nothing seems to be happening — and then suddenly a handful of usability problems appear at once. It seems impossible to get them all down: you write down one usability issue, but that prevents you from observing the next problem. You look at the participant — who is now struggling with a different problem — and you wonder how the participant got here and what you've missed. If you have the job of moderating the session and taking notes, it's even more difficult: how can you focus on the participant and take notes?
So it's no surprise that many people think the easiest solution to this problem is to use James's approach: skip the note-taking and just review the recordings, or alternatively note down the key issues after each participant has finished.
Although this sounds like the easy solution, it's not. Passively observing multiple usability sessions is tedious. It gets increasingly difficult to stay focused as you watch subsequent participants. I'm not sure if this is due to being cooped up in a room for a long time, or because subsequent participants run into many of the same problems you've seen earlier, or because some participants aren't very interesting. But whatever the cause, you need some way of helping you to pay attention, and there's only so much coffee you can drink.
Another reason why this isn't a great solution is that your notes end up as a jumble of observations. Review your notes, and you'll find participant opinions ("I prefer a larger search box") mixed up with behavioural observations ("The participant doesn't understand the 'contact' label") and both of those laced with demographic notes ("The participant spends about £30/month on their mobile phone bill").
Here's another solution.
As you watch the test, you should note down your observations of the participant's behaviour as single letter codes: an approach known as "datalogging". Datalogging is a kind of shorthand developed by students of animal and human behaviour (if you'd like more background on the method, consult the ethologist's bible, "Measuring Behaviour").
Datalogging has several advantages: it forces you to pay attention throughout every session; it separates the different classes of observation (problems, comments, opinions) rather than leaving you with a jumble of notes; and because each observation is timestamped, you can quickly find the corresponding moment in the recording.
As a rule of thumb, you should average about one observation per minute. But remember this is an average: observations are a bit like buses (none for ages, then three come along at once). For each observation, you should write down the time, the class of observation (as a single-letter code) and a brief description of what happened.
An excerpt from a session might look like this:
| Time | Class | Description |
| --- | --- | --- |
| 4:35 | M | Scans nav options but doesn't choose |
| 5:10 | C | "I'm looking for a search box now" |
| 6:07 | X | Doesn't seem to spot the link to the 'search' page |
Where "M" is the short code for "Miscellaneous observation", "C" is the short code for "General comment" and "X" is the short code for "Usability problem".
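If you prefer logging on a laptop, the same idea takes only a few lines of code. Here's a minimal, hypothetical Python sketch (the class name and code letters are just for illustration) that timestamps each coded observation relative to the start of the session:

```python
import time

class DataLogger:
    """Record coded observations with a minutes:seconds timestamp."""

    def __init__(self):
        self.start = time.monotonic()  # session start
        self.observations = []         # list of (time, code, note) tuples

    def log(self, code, note):
        """Store one observation, stamped with elapsed session time."""
        elapsed = int(time.monotonic() - self.start)
        stamp = f"{elapsed // 60}:{elapsed % 60:02d}"
        self.observations.append((stamp, code, note))
        return stamp

logger = DataLogger()
logger.log("M", "Scans nav options but doesn't choose")
logger.log("C", "I'm looking for a search box now")
logger.log("X", "Doesn't seem to spot the link to the 'search' page")
```

Because each entry is a simple (time, code, note) tuple, the session log can later be sorted, filtered by code, or exported for the design team.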
Here's a list of marker definitions that I use in my tests. This list may seem a bit imposing if you've never done datalogging before; if that's the case, just use 2-3 codes from the list until you feel more confident.
| Code | Definition |
| --- | --- |
| X | Usability problem |
| D | Duplicate usability problem (described earlier) |
| V | Video highlight — an "Ah-ha!" moment |
| C | Comment (general comment by participant) |
| P | Positive opinion expressed by participant |
| N | Negative opinion expressed by participant |
| B | Bug |
| F | Facial reaction (e.g. surprise) |
| H | Help or documentation accessed |
| A | Assist from moderator |
| G | Gives up or wrongly thinks finished |
| I | Design idea (design insight by logger) |
| M | Misc (general observation by logger) |
All you really need to do datalogging is a digital watch (or an analogue watch with a second hand), a pen and a pad. You'll probably want to print a crib sheet showing the codes you're using, just in case you forget.
But there's some technology to help you too. Todd Zazelenchuk has written a great article describing several datalogging tools. In this section, I'll describe just 3 different tools to give you a flavour of how technology can help you.
The Livescribe Pulse is part pen, part digital recorder and part scanner. As you write on the special dotted paper, an infrared camera at the tip of the smartpen tracks everything you write down and a built-in digital recorder records the audio in the room. When you review your notes, you can play back the session by pointing with the pen at the appropriate part of your notes. For example, if you point at the "X" usability observation you made, you'll actually hear (through mini-speakers in the pen) what was going on in the room as you made that observation. You can also import your notes and recordings into your computer. It sounds like amazing technology — and it is. Unfortunately, the audio recording quality isn't great and the pen itself is fat and slightly awkward to hold. (For me, the consequence is that my writing looks messy, and messy writing means I take fewer notes.)
Morae is the software platform of choice for usability testers and it includes a built-in datalogging tool that timestamps and synchronises your observations with the participant screen recordings. You can search across participant recordings for different observation codes and export the observation codes and any associated notes to Excel. TechSmith has a good video on datalogging with Morae. If you already own Morae, this is the route I'd suggest you take. But if you don't own Morae, you'll find this an expensive route to datalogging.
Most people have a copy of Excel on their computer, and its table-based layout makes it a natural choice for taking notes during a usability test session. It's easy to set up Excel with a column for the time, a column for the class of observation and a final column for your notes. What's less easy is timestamping your observations. You can find various macros that will do this, but you may find your company insists you disable macros in Microsoft Office to reduce the risk of viruses. And the latest version of Excel for the Mac does not support macros.
Fortunately, there's a workaround. You simply need to tell Excel to allow circular references and then create a circular formula. Here's an Excel spreadsheet I prepared that you can use to timestamp your observations:
Macro-free Excel spreadsheet to timestamp observations (xls format)
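For reference, here's the kind of circular formula this trick relies on (the cell references are illustrative, and the downloadable spreadsheet may differ in detail). With circular references allowed, entering this formula in C2 stamps the current time the moment you type something into B2; the self-reference to C2 then stops the stamp from updating on later recalculations:

```
=IF(B2="", "", IF(C2="", NOW(), C2))
```

Format the column as a time and copy the formula down as many rows as you need.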
If you code observations with a program like Excel, you can sort and edit your observations easily. Add a couple of columns to the spreadsheet (like "Possible fix" and "Owner") and it becomes the basis of a bug list for the development team. You'll need to think about the findings, properly interpret the usability issues and develop fixes that will work, but once you've done this you can email this to the development team and you've just created the world's fastest usability report.
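As a sketch of that last step, assuming your observations are stored as (time, code, note) rows as in the excerpt above, a few lines of Python can filter out the usability problems and write a bug list with the extra triage columns (the column names are just an example):

```python
import csv
import io

# Example observations: (time, class code, description)
observations = [
    ("4:35", "M", "Scans nav options but doesn't choose"),
    ("5:10", "C", "I'm looking for a search box now"),
    ("6:07", "X", "Doesn't seem to spot the link to the 'search' page"),
]

# Keep only usability problems (code "X") and add empty triage columns
bug_list = [(t, note, "", "") for t, code, note in observations if code == "X"]

# Write the bug list as CSV (an in-memory buffer here; use a file in practice)
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["Time", "Issue", "Possible fix", "Owner"])
writer.writerows(bug_list)
print(buffer.getvalue())
```

Open the resulting CSV in Excel, fill in the fixes and owners, and it's ready to send to the development team.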
Overachievers might want to format the Excel observations so they look polished using Microsoft Word's mail merge feature. Once you've set the template up, this takes just a few minutes to produce and everyone will think you've spent hours on it.
While your report is being read by the development team, old-school usability practitioners like James are just beginning to review the recordings of the session.
You'll find that rapid reporting like this makes you very popular with management and the development team.
Dr. David Travis (@userfocus) has been carrying out ethnographic field research and running product usability tests since 1989. He has published three books on user experience including Think Like a UX Researcher. If you like his articles, you might enjoy his free online user experience course.
copyright © Userfocus 2021.