As with most things in life, there is more than one way to run an interview. Most people are familiar with out-of-context interviews, such as those we've all experienced in offices and government buildings and seen on television chat shows. These can certainly provide some understanding for user research, but to gain significant insights you need to get into your users' context.
What’s so special about context?
A good customer interview during the discovery phase is not aiming to find out what people want — it's aiming to find out why people want it. You are trying to get answers to three basic questions:
- What is the user trying to get done?
- How does the user do it at the moment?
- What are the happy moments / pain points with the current process?
The problem with out-of-context interviews is that you can never be sure that you’re getting at the truth. People often do not understand why they do things a certain way and therefore can’t tell you. Even when they do understand why they are doing things, they may not want to tell you. And when they do tell you, they often don’t tell you the truth, or the whole truth: instead, they may think they are being helpful by describing a ‘simplified’ view of the way they work. And, of course, people can lie.
The most effective way to deal with this is to ask people to show you how they achieve their goals at the moment, and then observe them. A demonstration gets you much closer to authentic behaviour because it's hard to fake.
As a trivial example, imagine I asked you to describe how you make instant coffee. You might describe the various steps, such as boiling the kettle, grabbing a teaspoon, adding coffee and hot water to the cup, then adding milk and sugar to taste. But if instead I observed you in context, I might notice that you do other things while the kettle is boiling and sometimes you need to boil the water again. Other times you may not have a teaspoon to hand, so instead you tip a teaspoon-sized amount of coffee directly from the jar into the cup. These behaviours aren’t unusual, but users won’t describe them because they want to tell you a good story. And these behaviours lead to design insights such as a kettle that whistles when it boils, or a new instant coffee jar with a kind of Pez-dispenser in the lid to deliver a teaspoon of coffee.
Let’s review what you need for a contextual interview. To begin with the obvious, you should have some participants to visit. But how many is "some"?
Ask an experienced user researcher, "How many participants?" and you'll probably get more than one answer. There are a few reasons for this. First, any interviews are better than none, so even one participant will teach you something. Second, if the people you want to interview divide into clear types, start with 4-6 of each type. Third, if you don't know what different types there are, start with 8 people (the types will emerge from the patterns in their different experiences). In practice, I tend to start with around 20 users when designing a typical system.
As well as your participants, you should have an outline discussion guide to act as a framework for eliciting stories. This discussion guide will contain the key assumptions you need to validate. Make sure your discussion guide is brief: don’t see it as a Q&A but as a kind of scaffolding to elicit and structure stories.
There's one more thing you need, and that’s a member of the design team who will join you on the field visit. You need some help taking notes, and this is the role you will give to your colleague (the AEIOU framework of Activities, Environments, Interactions, Objects and Users is a useful notetaking structure for ethnographic interviews). But the main reason you want a member of the design team with you is that user research is a team sport: you want to ensure everyone gets their exposure hours.
But even though you want the team to observe, there's a limit to how many people should come along on any one visit. A two-person research team is ideal: with three or more, the dynamics change, and it becomes hard to stop observers interrupting the flow or changing the direction of the session. It can feel like people trampling over a crime scene and getting in the way. To manage this, and to ensure that everyone gets their exposure hours, swap the note-taker for someone else on the design team after the first couple of participants. The note-taker could be a developer, designer, product owner, scrum master, or domain expert (the latter is especially helpful when there is domain-specific jargon with which you’re not familiar).
On the field visit, your role will be to develop rapport with the user; conduct the interview; and apprentice with the user. Your colleague will be the note-taker. His or her role is to take photographs of the user, the environment and any artefacts; audio record the interview; make written observations; and ask clarifying and follow-up questions.
A good field visit tends to have five stages:
- Build rapport with the user.
- Transition from a traditional interview to a master-apprentice model.
- Observe the participant as they go about their work.
- Verify your assumptions and interpretations with the participant.
- Summarise the session.
Let’s review these in turn.
Build rapport with the user
In this phase, which should take only 5 minutes or so, you will introduce yourself and describe your separate roles (user researcher / note-taker). Then you should explain the purpose of the study so the user knows what you care about. A good way to start is to get an overview of the user’s background, so use questions like, “Maybe you could give me a tour of your place?”, "Tell me about the first time you started doing this activity", or "What got you interested in this kind of thing in the first place?" These are all good, safe, natural questions that should give you clues about what to look for once the participant starts to demonstrate their work. Then move on to the opening question on your discussion guide.
As part of building rapport, you should also review your ‘shot list’ and get permission to take photographs. Photographs provide an incredible amount of additional insight and you don’t want to be the researcher who’s too shy to take photographs. But taking photographs can often appear intrusive and rude. The way to deal with this is to put control of the situation in the hands of your participant. For example, show them a list of things you want to photograph, such as:
- Your desk.
- Your computer.
- Your computer screen.
- The technology you use.
- Papers you use for your job.
- The wider environment.
Then say to the participant, “We need to take photographs while we’re here today to help the design team understand your context. Here’s a list of the things we’d like to photograph. If there are any things on here you want to keep private, just put a line through them and we won’t photograph them”.
This puts control of what you photograph in the hands of the user but at the same time it means you won’t need to continually ask for permission before every photograph. Another good question to ask is, “What things do you think we should photograph to understand the way you do this?”
Now is also the time to ask for permission to record the session to a digital voice recorder. The participant’s verbal protocol is central to your analysis, so you want to make a recording of the session. You should also consider transcribing your recordings (typically transcription firms charge around £1 per minute of audio).
If this is moving you outside your comfort zone, you could always prime the participant about photographs and recordings before the visit. When you initially recruit people for your research, along with the background to the study you could send them your shot list and mention the audio recording. Then it shouldn’t be an obstacle when you first meet the participant.
Transition to the master-apprentice model
In this phase, which should take only a minute or so, the interaction moves from a traditional interview to a master-apprentice model. Tell the user that you want to learn by watching and asking questions, as if you were an apprentice learning how to do their job.
As you become more experienced, you’ll realise that good interviewing isn’t a set of questioning techniques: it’s more a way of being. Hugh Beyer and Karen Holtzblatt, who invented a particular approach to customer interviews known as contextual inquiry, explain why this is important:
"Running a good interview is less about following specific rules than it is about being a certain kind of person for the duration of the interview. The apprentice model is a good starting point for how to behave."
The master-apprentice model is a really useful way of engaging participants in your discovery process because everyone has experienced "teaching" somebody something. It also gives you, the researcher, licence to ask naïve questions to check your understanding.
Observe the participant
Sometimes the best thing you can do is sit back and watch the way your user is behaving. Don’t think that you need to continually ask questions. Especially if you have asked for a demonstration of something, it’s OK to just watch and simply ask the odd question to clarify your understanding. In fact, to someone looking from the outside, this may not look like an interview at all. That’s because few people have experience of running these kinds of interview. A good contextual interview should help you validate your riskiest assumptions, give you insight into the problems your users have and help you understand what matters in your users’ lives.
Keep observing. Any time something tweaks your antenna, drill down with follow-up questions. Point to things in the participant’s environment and ask what they are for. Get copies or pictures of artefacts, samples, forms, and documents. Use the discussion guide to remind you what you want to cover, but don’t worry about covering every topic in every interview.
You’ll discover that it’s much easier to run a contextual interview than a pop-up interview because you don’t need to keep firing questions at your participant. Most of the time, just remembering two question stems will keep the session moving along nicely:
- “Tell me a story about the last time you…”
- “Can you show me how you…?”
The observation phase is where you should spend the bulk of your time during the session.
Verify your assumptions
In this phase, you verify your assumptions and conclusions with the participant. Skim back over your notes and review what you learnt. Suggest reasons why the participant performed an action, and reflect back some of the concepts they are using: the participant will correct you if your interpretation is wrong.
Summarise the session
Immediately after each session, grab an unlined 6” x 4” index card. You will use one index card for each participant in your study. The purpose of these index cards is to summarise your immediate impressions: they won’t replace the transcripts or your more considered analysis, but this step is vital to stop your different participants blending into one another. Arrange the index card in portrait orientation and write your participant’s first name at the top. Print out a passport-sized picture of your participant (or draw a sketch if you were too shy to take photographs) and stick that to the card. Then write down some concise, bullet-point descriptions of this participant. Aim for five or so: these should be the things that really stood out to you. The best descriptions capture the participant’s behaviours, needs and goals.
Running your first contextual interview
Let me remind you of Steve Blank’s exhortation to founders of start-ups: “Get out of the building”. I would only add: “…And into your users’ context”. Good luck and let me know how it went in the comments.
Thanks to Philip Hodgson and Todd Zazelenchuk for comments on an earlier draft of this article.
About the author
Dr. David Travis (@userfocus on Twitter) holds a BSc and a PhD in Psychology and he is a Chartered Psychologist. He has worked in the fields of human factors, usability and user experience since 1989 and has published two books on usability. David helps both large firms and start ups connect with their customers and bring business ideas to market. If you like his articles, you'll love his online user experience training course.