The checkpoints
These checkpoints are based on the work of Julian Meltzoff (1997).
- Is the research question clearly stated?
- Do the introduction, statement of the problem, and overview of any literature or previous reports adequately set the background for the reader, and is this material consistent with the research question?
- Is it clear why the study was conducted?
- Given the research question and the background material, are the research hypotheses and objectives appropriate and clearly stated?
- Are key terms well defined?
- Is the independent variable appropriate given the question of the study? Are the levels of the independent variable appropriate?
- Is the dependent variable appropriate for the study?
- Are the criterion measures of the dependent variable appropriate, valid, and reliable?
- Are the scoring, rating, and judging procedures valid and reliable?
- Is the measuring apparatus (if any) accurate and reliable?
- Are the controls appropriate? Can the results be affected by variables that have not been controlled? Are the controls or control groups (if used) properly selected?
- Is the research design suitable to meet the objectives of the study? Is the research design appropriate to test the hypotheses and answer the research question?
- Are the methods and procedures clearly described in sufficient detail to be understood and replicated?
- Is the presentation sequence of test stimuli (including any randomisation or counterbalancing) appropriate? (A minimal randomisation sketch appears after this list.)
- Are the test participants properly oriented and motivated? What is their understanding of the task? Are the instructions sufficiently clear and precise?
- Are there any signs of experimenter bias in the design, data collection, assessment, analysis, or reporting?
- Are the participants properly selected? Is the sample representative and unbiased? Do the procedures adhere to the guidelines for the protection and well-being of participants?
- Is the sample size appropriate? Are the appropriate procedures used to assign participants to groups, treatments, or conditions? Are suitable techniques used to establish group equivalence, such as matching, equating, or randomising?
- Does participant attrition occur, and if so, does it bias the sample?
- Are bad data properly identified and set aside (not included in the final test data set), and are instances of bad data reported and explained as such?
- Have the data been appropriately analysed, sorted, categorised, grouped, prioritised, etc.?
- Are descriptive statistics used? Are these accurate?
- Are the inferential statistical tests appropriate? Are the assumptions for their use met? Are there any errors in the calculation or presentation of statistical results? (See the assumption-checking sketch after this list.)
- Are all graphs correctly labelled (both the X and Y axes)? Are data elements on graphs properly coded and identified?
- Are tables and figures clearly labelled and accurately presented and referenced in the text? Are results and findings correctly interpreted, properly reported, given meaning and placed in context?
- Are recommendations unambiguous? Do recommendations follow clear usability, human factors, or ergonomics guidelines?
- Are recommendations supported by references to prior literature or to industry standards?
- Is the discussion section of the report reasonable in view of the data?
- Are the conclusions valid and justified by the data?
- Are the generalisations valid?
- Do references (if used) match the citations in the text?
- Have ethical standards been adhered to in all phases of the research?
- What can be done to improve or re-design the study?
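Two of the checkpoints above lend themselves to a concrete illustration. First, randomised presentation order and assignment of participants to groups: the following is a minimal sketch (not Meltzoff's method), in which the task names, participant IDs, and group labels are hypothetical placeholders, and only Python's standard library is used to produce a reproducible assignment and a per-participant task order.

```python
import random

# Minimal sketch of randomised group assignment and task-order
# counterbalancing for a usability test. Task names and participant
# IDs are hypothetical placeholders.
TASKS = ["search", "checkout", "account_setup", "find_help"]
PARTICIPANTS = [f"P{i:02d}" for i in range(1, 13)]

random.seed(42)  # fixed seed so the schedule can be regenerated and audited

# Randomly assign participants to two equally sized conditions.
shuffled = PARTICIPANTS[:]
random.shuffle(shuffled)
half = len(shuffled) // 2
condition = {p: ("A" if i < half else "B") for i, p in enumerate(shuffled)}

# Give each participant an independently shuffled task order, so that
# order effects are spread across the sample rather than confounded
# with any single condition.
schedule = {p: random.sample(TASKS, k=len(TASKS)) for p in PARTICIPANTS}

for p in PARTICIPANTS:
    print(p, condition[p], schedule[p])
```

The fixed seed is used here only so the schedule can be reproduced for audit; in a real study the generated schedule would normally be logged as part of the test plan.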
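Second, the checkpoint on inferential tests and their assumptions. The sketch below uses placeholder data standing in for task-completion times from two groups (not results from any study), and the specific tests shown (Shapiro-Wilk, Levene, Student/Welch t-test, Mann-Whitney U) are one common way, among several, of checking assumptions before comparing two groups.

```python
import numpy as np
from scipy import stats

# Placeholder data standing in for task-completion times (seconds)
# from two hypothetical groups; not results from any real study.
rng = np.random.default_rng(1)
group_a = rng.normal(loc=45, scale=10, size=20)
group_b = rng.normal(loc=52, scale=12, size=20)

# Normality of each group (Shapiro-Wilk).
_, p_norm_a = stats.shapiro(group_a)
_, p_norm_b = stats.shapiro(group_b)

# Homogeneity of variance (Levene's test).
_, p_var = stats.levene(group_a, group_b)

if p_norm_a > 0.05 and p_norm_b > 0.05:
    # Parametric comparison; Welch's correction when variances differ.
    result = stats.ttest_ind(group_a, group_b, equal_var=(p_var > 0.05))
else:
    # Non-parametric fallback when normality looks doubtful.
    result = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(result)
```

When reviewing a report against this checkpoint, the question is simply whether the authors show evidence of this kind of assumption checking, or justify their choice of test when the assumptions do not hold.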
Reference
Meltzoff, J. (1997). Critical Thinking About Research: Psychology and Related Fields. Washington, DC: American Psychological Association.