Technology, Knowledge and Learning, Volume 21, Issue 1, pp 33–57

What We Can Learn from the Data: A Multiple-Case Study Examining Behavior Patterns by Students with Different Characteristics in Using a Serious Game

  • Min Liu
  • Jaejin Lee
  • Jina Kang
  • Sa Liu


Abstract

Using a multiple-case approach, we examined students’ behavior patterns while they interacted with a serious game environment, applying the emerging technologies of learning analytics and data visualization to understand how those patterns vary with students’ learning characteristics. The results confirmed some preliminary findings from our previous research but also revealed patterns that would not have been easily detected without data visualizations. These findings offer insights for designing effective learning scaffolds to support the development of problem-solving skills in young learners and will guide our next steps in this research.


Keywords: Learning analytics · Data visualization · Serious games · Problem solving · Middle school science · Fantasy · Game engagement


Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. Learning Technologies Program, The University of Texas at Austin, Austin, USA
  2. BK21Plus Institute of Future Education Design, Seoul National University, Seoul, South Korea
