Tracking Students’ Activities in Serious Games
A Serious Game (SG) is a virtual process designed for the purpose of solving real-world problems. In SG analytics studies, learning processes are tracked using diverse techniques to support the personalization of instruction. However, it is challenging to determine what each parameter in the tracking logs means and to define appropriate indicators of a user’s behavior: game tracking logs often provide only limited information when considered apart from the game context. Research combining game data analysis with visualization techniques is therefore needed to provide a holistic view of the gaming process and of player behaviors. This study focused on the learning analytics of students’ activities in a 3D immersive SG environment called Alien Rescue (AR, http://alienrescue.edb.utexas.edu), which is designed for middle school science learning. The goal of this study was to understand the relationship between students’ activities, as reflected in log data, and their performance in the environment. Students’ activity logs and their performance scores were analyzed using both statistical and visualization techniques. This paper reports findings on SG tracking variables, on learning paths across different performance groups, and on the most frequent learning path.
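The "most frequent learning path" analysis described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the log schema (student, timestamp, action) and the action names are hypothetical, not the Alien Rescue logging format.

```python
from collections import Counter

# Hypothetical log records as (student_id, timestamp, action) tuples.
# These field names and actions are illustrative only.
logs = [
    ("s1", 1, "view_problem"), ("s1", 2, "open_database"), ("s1", 3, "submit_solution"),
    ("s2", 1, "view_problem"), ("s2", 2, "open_database"), ("s2", 3, "submit_solution"),
    ("s3", 1, "open_database"), ("s3", 2, "view_problem"), ("s3", 3, "submit_solution"),
]

def learning_paths(records):
    """Group log events by student and order them by timestamp,
    yielding each student's action sequence (their learning path)."""
    by_student = {}
    for student, ts, action in records:
        by_student.setdefault(student, []).append((ts, action))
    return {
        student: tuple(action for _, action in sorted(events))
        for student, events in by_student.items()
    }

# Count identical paths across students and take the most frequent one.
paths = learning_paths(logs)
most_common_path, freq = Counter(paths.values()).most_common(1)[0]
print(most_common_path, freq)
# → ('view_problem', 'open_database', 'submit_solution') 2
```

In a real analysis the tuples would come from parsed server logs, and paths would typically be compared per performance group (e.g., high vs. low scorers) rather than over the whole sample.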
Keywords: Serious games analytics · Visualization · Learning path
We would like to acknowledge the help of Damilola Shonaike in creating the image in Fig. 10.5 as part of her research project.