Timing Game-Based Practice in a Reading Comprehension Strategy Tutor

  • Matthew E. Jacovina
  • G. Tanner Jackson
  • Erica L. Snow
  • Danielle S. McNamara
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9684)

Abstract

Game-based practice within Intelligent Tutoring Systems (ITSs) can be optimized by examining how properties of practice activities influence learning outcomes and motivation. In the current study, we manipulated when game-based practice was available to students. All students (n = 149) first completed lesson videos in iSTART-2, an ITS focused on reading comprehension strategies, and then practiced with iSTART-2 for two 2-hour sessions. Students completed their first session in either a game-based or a nongame practice environment; in the second session, they either switched to the alternate environment or remained in the same one. Comprehension was measured at pretest and posttest, and motivational measures were collected. Overall, students’ comprehension increased from pretest to posttest. Effect sizes for the pretest-to-posttest gains suggested that switching from the game-based to the nongame environment was least effective, whereas switching from the nongame to the game-based environment or remaining in the game-based environment was more effective. However, these differences between the practice conditions were not statistically significant on either the comprehension or the motivation measures, suggesting that, for iSTART-2, the timing of game-based practice availability does not substantially affect students’ experience in the system.
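To make the analysis concrete, the sketch below shows one conventional way to quantify a pretest-to-posttest gain for a single practice condition: Cohen's d computed from gain scores, alongside a dependent-samples t-test. The score values, the gain-score formulation of d, and the helper name paired_gain_effect_size are illustrative assumptions for this sketch; they are not the paper's actual data or its reported analysis.

```python
import numpy as np
from scipy import stats

def paired_gain_effect_size(pretest, posttest):
    """Cohen's d for a pretest-to-posttest gain, using the SD of the gain scores."""
    gains = np.asarray(posttest, dtype=float) - np.asarray(pretest, dtype=float)
    return gains.mean() / gains.std(ddof=1)

# Hypothetical comprehension scores for one condition (e.g., game-based -> nongame).
pretest = [0.52, 0.61, 0.48, 0.70, 0.55, 0.63]
posttest = [0.58, 0.66, 0.50, 0.74, 0.60, 0.69]

d = paired_gain_effect_size(pretest, posttest)
t, p = stats.ttest_rel(posttest, pretest)  # paired-samples t-test on the same gains
print(f"Cohen's d = {d:.2f}, t = {t:.2f}, p = {p:.3f}")
```

Repeating this per condition yields the kind of effect-size comparison the abstract describes; a nonsignificant t (or a between-condition test on the gain scores) corresponds to the reported lack of statistically reliable differences.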

Keywords

Game-based learning · Intelligent Tutoring Systems · Comprehension · Motivation

Acknowledgments

This research was supported in part by the Institute of Education Sciences (IES R305A130124). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the IES. We thank the many colleagues and students who have contributed to this work, and extend a special thanks to Tricia Guerrero for her help in coding data for this project.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Matthew E. Jacovina (1)
  • G. Tanner Jackson (2)
  • Erica L. Snow (3)
  • Danielle S. McNamara (1)

  1. Institute for the Science of Teaching and Learning, Arizona State University, Tempe, USA
  2. Cognitive Science, Educational Testing Service, Princeton, USA
  3. SRI International, Menlo Park, USA