Added Value of Eye Tracking in Usability Studies: Expert and Non-expert Participants

  • Marco C. Pretorius
  • Judy van Biljon
  • Estelle de Kock
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 332)

Abstract

This paper investigates the value of eye tracking in evaluating the usability of a Learning Management System at an open distance learning university where the users’ computer and Web skills vary significantly. Eye tracking utilizes the users’ eye movements while they perform a task to provide information about the nature, sequence and timing of the cognitive operations that took place. This information supplements, but does not replace, standard usability testing with observations. This raises the question of when the added value of eye tracking justifies the added cost and resources. Existing research has indicated significant differences in the usability experienced by experts and non-experts on the same system. The aim of this paper is to go one step further and shed light on the type and severity of the usability problems experienced by non-expert users. Usability testing with eye tracking is a resource-intensive method, but our findings indicate that eye tracking adds concise, summarised evidence of usability problems that justifies the cost when testing special groups such as users deficient in Web and computer skills. The contribution of this paper is to highlight the added value of eye tracking as a usability evaluation method in working with non-expert Web users. Furthermore, the findings improve our understanding of the knowledge differences between expert and non-expert Web users and the practical challenges involved in working with non-expert users.
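To illustrate the kind of summarised evidence referred to above, the following is a minimal, hypothetical Python sketch; it is not part of the paper and not the authors’ analysis pipeline. It assumes fixations exported from an eye tracker as (x, y, duration) records and rectangular areas of interest (AOIs) defined over the tested interface, and condenses the gaze data into fixation counts and dwell time per AOI.

```python
# Illustrative sketch only (hypothetical data format and AOIs, not the paper's method):
# aggregate fixation counts and total dwell time per rectangular area of interest.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class AOI:
    """A rectangular area of interest on the screen, in pixel coordinates."""
    name: str
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def summarise_fixations(
    fixations: List[Tuple[float, float, float]],  # (x, y, duration_ms)
    aois: List[AOI],
) -> Dict[str, Dict[str, float]]:
    """Count fixations and sum dwell time (ms) falling inside each AOI."""
    summary = {aoi.name: {"fixations": 0, "dwell_ms": 0.0} for aoi in aois}
    for x, y, duration in fixations:
        for aoi in aois:
            if aoi.contains(x, y):
                summary[aoi.name]["fixations"] += 1
                summary[aoi.name]["dwell_ms"] += duration
                break  # assumes non-overlapping AOIs
    return summary


if __name__ == "__main__":
    # Hypothetical AOIs for a learning-management-system page.
    aois = [
        AOI("navigation_menu", 0, 0, 200, 768),
        AOI("assignment_link", 220, 100, 600, 140),
    ]
    # Hypothetical fixation data for one participant.
    fixations = [(50, 300, 220.0), (400, 120, 480.0), (60, 500, 150.0)]
    for name, stats in summarise_fixations(fixations, aois).items():
        print(f"{name}: {stats['fixations']} fixations, {stats['dwell_ms']:.0f} ms dwell")
```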

Keywords

Usability · Eye tracking · Expert · Non-expert


Copyright information

© IFIP 2010

Authors and Affiliations

  • Marco C. Pretorius (1)
  • Judy van Biljon (1)
  • Estelle de Kock (1)

  1. School of Computing, University of South Africa, Pretoria, South Africa
