Human-Computer Interaction – INTERACT 2015, pp. 159–176

An Empirical Study of the Effects of Three Think-Aloud Protocols on Identification of Usability Problems

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9297)

Abstract

Think-aloud is the de facto standard in user-based usability evaluation for verbalizing what a user is experiencing. Despite its qualities, it has been argued that thinking aloud affects the task-solving process. This paper reports on an empirical study of the effect of three think-aloud protocols on the usability problems identified. The three protocols were Traditional, Active Listening and Coaching. The study involved 43 test subjects distributed across the three think-aloud conditions and a silent control condition in a between-subjects design. The results show that the three think-aloud protocols facilitated identification of roughly twice as many usability problems as the silent condition, while the sets of problems identified under the three think-aloud protocols were comparable. Our results do not support the common emphasis on the Coaching protocol, whereas the Traditional protocol performed surprisingly well.
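As a side note on how such a comparison can be operationalised, the minimal Python sketch below counts the distinct usability problems identified per condition and computes pairwise overlap between conditions. The condition names follow the abstract, but the problem IDs, the example data, and the choice of Jaccard similarity are illustrative assumptions and do not reproduce the paper's actual data or analysis.

```python
from itertools import combinations

# Hypothetical per-condition data: each condition maps to the set of
# usability problem IDs identified across its test subjects.
# The IDs and values are illustrative only, not taken from the paper.
problems_by_condition = {
    "Traditional":      {"P01", "P02", "P04", "P07", "P09"},
    "Active listening": {"P01", "P03", "P04", "P07", "P10"},
    "Coaching":         {"P02", "P03", "P04", "P08", "P09"},
    "Silent control":   {"P04", "P07"},
}

# Number of distinct problems identified under each condition.
for condition, problems in problems_by_condition.items():
    print(f"{condition}: {len(problems)} unique problems")

# Pairwise overlap (Jaccard similarity) between conditions: one simple
# way to ask whether the protocols surface comparable problem sets.
for (name_a, set_a), (name_b, set_b) in combinations(problems_by_condition.items(), 2):
    jaccard = len(set_a & set_b) / len(set_a | set_b)
    print(f"{name_a} vs {name_b}: Jaccard = {jaccard:.2f}")
```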

Keywords

Usability evaluation · Thinking aloud · Verbalization · Think-aloud protocols · Empirical study


Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

1. Department of Computer Science, Aalborg University, Aalborg East, Denmark
