Assessing Game-Based Mathematics Learning in Action

  • Fengfeng Ke
  • Biswas Parajuli
  • Danial Smith
Chapter
Part of the Advances in Game-Based Learning book series (AGBL)

Abstract

Digital learning environments emphasize learning in action. Because knowledge is present in what learners do, how they do it, what tools they use, and how they communicate in and about their doing, it is important to assess knowledge production in context and learning in action. Via a design-based research approach, we explored the feasibility and validity of using evidence-centered design and Bayesian networks to assess mathematical learning in action in a game-based learning environment. We iteratively tested the hypothesized assessment models and alternative approaches to exploiting game-based performance data, using longitudinal data sets collected over the course of 42 gaming sessions across 3 academic semesters. The investigation illustrated design and implementation heuristics for game-based learning-in-action assessment. The emergent learning-in-action assessment revolves around four major operational practices: (a) modeling domain competencies alongside conceptualizing core game mechanics; (b) developing task models and the Q-matrix; (c) developing a game log that encompasses performance-data capture, pattern recognition, and observables extraction; and (d) training, substantiating, and comparing statistical models for data processing and assessment implementation.
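
To make the pipeline concrete, the sketch below illustrates (in simplified form) how a Q-matrix and a Bayesian update over game-log observables can work together. It is a minimal toy, not the chapter's actual model: the task names, skill names, and the slip/guess probabilities are hypothetical, and a single independent Bayes update per skill stands in for the full Bayesian network.

```python
# Toy Q-matrix: each game task is mapped to the skills it exercises
# (1 = the skill is required by the task). Task and skill names are
# illustrative placeholders, not the chapter's real task models.
Q_MATRIX = {
    "build_ramp":  {"ratio": 1, "proportion": 0, "scaling": 1},
    "scale_model": {"ratio": 0, "proportion": 1, "scaling": 1},
}

# Illustrative conditional probabilities of a correct task outcome
# given the learner's (un)mastery of a required skill.
P_CORRECT_GIVEN_MASTERY = 0.85     # i.e., a "slip" rate of 0.15
P_CORRECT_GIVEN_NO_MASTERY = 0.20  # i.e., a "guess" rate of 0.20


def update_mastery(prior: float, correct: bool) -> float:
    """One Bayesian update of P(mastery) from a single observed outcome."""
    if correct:
        num = P_CORRECT_GIVEN_MASTERY * prior
        den = num + P_CORRECT_GIVEN_NO_MASTERY * (1 - prior)
    else:
        num = (1 - P_CORRECT_GIVEN_MASTERY) * prior
        den = num + (1 - P_CORRECT_GIVEN_NO_MASTERY) * (1 - prior)
    return num / den


def assess(game_log):
    """Fold a game log of (task, correct) observables into skill estimates."""
    mastery = {skill: 0.5 for skill in ("ratio", "proportion", "scaling")}
    for task, correct in game_log:
        for skill, required in Q_MATRIX[task].items():
            if required:
                mastery[skill] = update_mastery(mastery[skill], correct)
    return mastery


# A tiny fabricated log: two successes, then one failure.
estimates = assess([("build_ramp", True),
                    ("scale_model", True),
                    ("build_ramp", False)])
print(estimates)
```

In a full Bayesian-network treatment the skills would not be updated independently: the network encodes dependencies among competencies and among observables, which is precisely what the chapter's fourth practice (training and comparing statistical models) addresses.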

Keywords

Game-based learning · Mathematics · Real-time assessment · Bayesian network · Learning in action

Notes

Acknowledgements

The work reported in this chapter was supported by the National Science Foundation, grant no. 1318784. Any opinions, findings, and conclusions or recommendations expressed in these materials are those of the author and do not necessarily reflect the views of the National Science Foundation.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Educational Psychology and Learning Systems, Florida State University, Tallahassee, FL, USA