Summative Game-Based Assessment

  • Andreas Oranje (Email author)
  • Bob Mislevy
  • Malcolm I. Bauer
  • G. Tanner Jackson
Chapter
Part of the Advances in Game-Based Learning book series (AGBL)

Abstract

The interplay between games, assessment, and learning has so far been considered primarily within a formative setting and with particular focus on how design frameworks from these fields are connected and disconnected. The purpose of this chapter is to extend what has been developed and learned about formative game-based assessments (GBAs) into summative assessment practices. The objective is to answer questions about design principles, good or best practices, and application opportunities for summative game-based assessments, including a critical analysis of how they can or cannot improve summative assessment practices more generally. After introducing some foundations, we will discuss motivations for GBA, provide design trade-offs for various use cases, and develop considerations for designing summative GBAs. Careful review shows that this is a very challenging space with some limited, though worthwhile opportunities.

Keywords

Game-based assessment · Evidence-centered game design · Summative assessment

Acknowledgements

At last count, no fewer than seven anonymous reviewers at ETS and abroad have provided countless comments and edits on previous versions of this work. We are deeply indebted to their valuable and perspective-altering contributions, without which this work would never have seen the light of day.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Andreas Oranje (Email author)
  • Bob Mislevy
  • Malcolm I. Bauer
  • G. Tanner Jackson

  1. Educational Testing Service, Princeton, USA