Short and Long Term Benefits of Enjoyment and Learning within a Serious Game

  • G. Tanner Jackson
  • Kyle B. Dempsey
  • Danielle S. McNamara
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6738)

Abstract

Intelligent Tutoring Systems (ITSs) have been used for decades to teach students domain content and strategies. However, ITSs often struggle to maintain students’ interest and sustain a productive practice environment over time. ITS designers have begun integrating game components in an attempt to engage learners and maintain motivation during prolonged interactions. Two studies were conducted to investigate enjoyment and performance at short-term (90 minutes) and long-term (3 weeks) timescales. The short-term study (n=34) found that students in a non-game practice condition performed significantly better and wrote more than students in the game-based practice condition. However, the long-term study (n=9) found that when students were in the game-based environment, they produced longer contributions than when in the non-game version. Both studies revealed trends suggesting that the game-based system was slightly more enjoyable, though the differences were not significant. The contrasting trends across studies indicate that game elements may contribute to an initial decrease in performance, but that students are able to close this gap over time.

Keywords

Serious Games · Intelligent Tutoring Systems · Game-based Learning

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • G. Tanner Jackson (1)
  • Kyle B. Dempsey (1)
  • Danielle S. McNamara (1)
  1. Psychology Department, University of Memphis, Memphis, USA