Serious Games Analytics pp 381-399

Part of the Advances in Game-Based Learning book series (AGBL)

Replay Analysis in Open-Ended Educational Games

  • Erik Harpstead
  • Christopher J. MacLellan
  • Vincent Aleven
  • Brad A. Myers

Abstract

Designers of serious games have an interest in understanding whether their games are well-aligned, i.e., whether in-game rewards incentivize behaviors that will lead to learning. Few serious games analytics solutions exist to serve this need. Open-ended games in particular run into issues of alignment because they afford players wide freedom. In this chapter, we first define open-ended games as games that have complex functional solution spaces. Next, we describe our method for exploring alignment issues in an open-ended educational game using replay analysis. The method uses multiple data mining techniques to extract features from replays of player behavior. Focusing on replays rather than logging play-time metrics allows designers and researchers to run additional metric calculations and data transformations in a post hoc manner. We describe how we have applied this replay analysis methodology to explore and evaluate the design of the open-ended educational game RumbleBlocks. Using our approach, we were able to map out the solution space of the game and highlight some potential issues that the game’s designers might consider in future iterations. Finally, we discuss some of the limitations of the replay approach.
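The core idea of the abstract, re-simulating logged play traces so that new metrics can be computed after the fact, can be illustrated with a minimal sketch. The event schema, the `replay` step, and the `symmetry_score` feature below are illustrative assumptions, not RumbleBlocks’ actual logging format or the authors’ feature set.

```python
# A minimal, hypothetical sketch of post hoc feature extraction from
# replay logs. Event fields and the "symmetry" feature are assumptions
# for illustration only.

from dataclasses import dataclass

@dataclass
class PlaceEvent:
    """A logged block placement: horizontal/vertical position of the block."""
    x: float
    y: float

def replay(events):
    """Re-simulate a play trace into the final structure it produces.

    Here a 'structure' is just the list of placed block positions; a real
    replay engine would re-run game physics over the event stream.
    """
    structure = []
    for e in events:
        structure.append((e.x, e.y))
    return structure

def symmetry_score(structure):
    """Post hoc feature: mean horizontal distance of blocks from the
    structure's own center of mass (0.0 = perfectly balanced).

    Because it runs on replayed structures, this metric can be invented
    and computed long after the original data were logged.
    """
    if not structure:
        return 0.0
    cx = sum(x for x, _ in structure) / len(structure)
    return sum(abs(x - cx) for x, _ in structure) / len(structure)

# Example trace: two blocks mirrored about x=0 plus one centered on top.
trace = [PlaceEvent(-1.0, 0.0), PlaceEvent(1.0, 0.0), PlaceEvent(0.0, 1.0)]
print(symmetry_score(replay(trace)))
```

The design point is that `symmetry_score` never needs to exist at logging time: any feature computable from the reconstructed structure can be added post hoc, which is the advantage the chapter claims for replays over fixed play-time metrics.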

Keywords

Alignment · Replay analysis · Open-ended games


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Erik Harpstead (1)
  • Christopher J. MacLellan (1)
  • Vincent Aleven (1)
  • Brad A. Myers (1)

  1. Carnegie Mellon University, Pittsburgh, USA
