Assessment and Adaptation in Games

  • Valerie Shute
  • Fengfeng Ke
  • Lubin Wang
Part of the Advances in Game-Based Learning book series (AGBL)


Digital games are very popular in modern culture. We have been examining ways to leverage these engaging environments to assess and support important student competencies, especially those that are not well measured by traditional assessment formats. In this chapter, we describe a particular approach for assessing and supporting student learning in game environments—stealth assessment—which embeds assessments directly and invisibly into the gaming environment. Results of the assessment can be used for adaptation in the form of scaffolding, hints, and appropriately challenging levels. We delineate the main steps of game-based stealth assessment and illustrate their implementation via two cases. The first case focuses on developing stealth assessment for problem-solving skills in an existing game. The second case describes the integration of game and assessment design throughout game development, and the assessment and support of mathematical knowledge and skills. Both cases illustrate the applicability of data-driven, performance-based assessment in an interactive game as the basis for adaptation, for use in formal and informal contexts.
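To make the core idea concrete, here is a minimal, hypothetical sketch of the kind of inference that drives stealth assessment and adaptation: a belief over a learner's competency is updated Bayesian-style from observed gameplay evidence, and the posterior then selects the next challenge level. The competency levels, likelihood values, and adaptation rule below are illustrative assumptions, not taken from the chapter or from any particular game.

```python
# Illustrative sketch only: a one-variable Bayesian update of a learner's
# competency estimate from in-game evidence (solved / failed a puzzle),
# followed by a simple adaptation rule. All numbers are assumed.

PRIOR = {"low": 1 / 3, "medium": 1 / 3, "high": 1 / 3}

# Assumed P(solved the puzzle | competency level).
P_SOLVED = {"low": 0.2, "medium": 0.5, "high": 0.8}

def update(belief, solved):
    """One Bayes step: multiply prior by likelihood, then normalize."""
    likelihood = P_SOLVED if solved else {k: 1 - v for k, v in P_SOLVED.items()}
    unnorm = {k: belief[k] * likelihood[k] for k in belief}
    total = sum(unnorm.values())
    return {k: v / total for k, v in unnorm.items()}

def next_difficulty(belief):
    """Adapt: serve a level matching the most probable competency."""
    return max(belief, key=belief.get)

# Observed gameplay outcomes, logged invisibly during play.
belief = PRIOR
for solved in [True, True, False, True]:
    belief = update(belief, solved)

print(next_difficulty(belief))  # most probable competency after the evidence
```

In practice the chapter's approach uses full Bayesian networks over multiple competency and evidence variables rather than a single node, but the update-then-adapt loop is the same shape.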


Keywords: Stealth assessment · Adaptation · Bayesian networks



Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  1. Educational Psychology and Learning Systems Department, Florida State University, Tallahassee, USA
