A Concurrent Think Aloud Study of Engagement and Usability in a Serious Game

  • Geoffrey Hookham
  • Bridgette Bewick
  • Frances Kay-Lambkin
  • Keith Nesbitt
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9894)

Abstract

This research presents a think-aloud study examining issues of engagement and usability in relation to a serious game and a more traditional online program. Results from twenty concurrent think-aloud sessions involving a serious game called Shadow and its more traditional counterpart called SHADE are reported. Both programs are designed to help counsel young adults with depression and alcohol or other drug issues. An analysis of the think-aloud results reveals issues related to both usability and engagement, with users' concerns cycling between the content and the operation of the interface. The main themes emerging from the study provide an alternative lens for designers.

Keywords

Engagement · Serious games · Think aloud · Usability · Design


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Geoffrey Hookham (1)
  • Bridgette Bewick (2)
  • Frances Kay-Lambkin (3)
  • Keith Nesbitt (1)
  1. University of Newcastle, Newcastle, Australia
  2. University of Leeds, Leeds, UK
  3. University of New South Wales, Sydney, Australia