
Technology, Knowledge and Learning, Volume 21, Issue 2, pp. 195–210

Got Game? A Choice-Based Learning Assessment of Data Literacy and Visualization Skills

  • Doris B. Chin
  • Kristen P. Blair
  • Daniel L. Schwartz

Abstract

In partnership with both formal and informal learning institutions, researchers have been building a suite of online games, called choicelets, to serve as interactive assessments of learning skills, e.g., critical thinking or seeking feedback. Unlike more traditional assessments, which take a retrospective, knowledge-based view of learning, choicelets take a prospective, process-based view and focus on students’ choices as they attempt to solve a challenge. The multi-level challenges are designed to allow players “free choice” as they explore and learn how to solve each challenge. The system provides various learning resources and tracks whether, what, how, and when players choose to learn. This paper briefly describes a partner’s curriculum focused on data literacy and visualization and the design of a choice-based assessment for that program, and it reports on an initial study of the curriculum and game with 10th-grade biology students. Results are presented in the context of two design research questions: Do student choices in the game predict their learning from the game? Does the curriculum teach students to choose more effectively with respect to data visualization? Future work on choice-based assessments is also discussed.

Keywords

Educational technology · Learning assessment · Educational assessment · Game-based assessment · Science education

Acknowledgments

This material is based upon work supported by the National Science Foundation under Grant Numbers 0904324 and 1228831, the John D. and Catherine T. MacArthur Foundation, and the Gordon and Betty Moore Foundation. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the granting agencies. The authors would like to thank Jacob Haigh and Neil Levine for their key contributions in the development of the assessment game, as well as Rochelle Urban and Megan Schufreider for their work on the pilot curriculum and study.


Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  • Doris B. Chin
  • Kristen P. Blair
  • Daniel L. Schwartz

  1. Stanford Graduate School of Education, Stanford, USA
