Visual Signaling in a High-Search Virtual World-based Assessment: A SAVE Science Design Study
Abstract
Education policy in the United States centers K-12 assessment efforts primarily on standardized tests. However, such tests may not provide an accurate and reliable picture of what students understand about the complexity of science. Research indicates that students often pass science tests even when they do not understand the concepts being assessed. On standardized tests, these concepts are typically assessed via multiple-choice questions, which may measure students' receptive understanding of science-related vocabulary rather than their ability to develop hypotheses and design experiments to test them. To address these assessment issues, our SAVE Science project has been exploring immersive virtual environments as platforms for both learning and assessment. SAVE Science (Situated Assessment in Virtual Environments for Science Content and Inquiry) is an NSF-funded study exploring the use of virtual world-based tests to assess the science knowledge and skills of middle school students. Its main goal is to explore the value of virtual world-based assessments as supplements or alternatives to more traditional forms of assessment. In pursuit of that goal, we are examining design frameworks intended to help students manage the high cognitive load they may experience while completing such tests. In this paper, we present results from a study of visual signaling techniques in virtual world-based assessments, with a particular focus on their use and impact in visually complex, high-visual-search environments. The study examined whether visual signaling can reduce students' perceived cognitive load while increasing the number of interactions they perform with assessment-relevant objects in a virtual world (assessment efficiency).
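To make the two outcome constructs concrete, the following is a minimal, hypothetical sketch (not the SAVE Science implementation): one simple way to operationalize "assessment efficiency" as the fraction of a student's object interactions that touch assessment-relevant objects, alongside a basic signaling rule that cues only relevant objects to narrow visual search. All names here (object labels, function names) are illustrative assumptions.

```python
def assessment_efficiency(interaction_log, relevant_objects):
    """Fraction of a student's interactions that touch assessment-relevant objects.

    interaction_log: sequence of object identifiers the student clicked, in order.
    relevant_objects: set of identifiers the assessment designer marked as relevant.
    """
    if not interaction_log:
        return 0.0
    hits = sum(1 for obj in interaction_log if obj in relevant_objects)
    return hits / len(interaction_log)


def signaled_objects(scene_objects, relevant_objects, signaling_on=True):
    """Return the subset of scene objects that should receive a visual cue
    (e.g., a glow outline). With signaling off, nothing is cued; with it on,
    only assessment-relevant objects are highlighted."""
    if not signaling_on:
        return set()
    return set(scene_objects) & set(relevant_objects)


# Illustrative interaction log from one (hypothetical) student session.
log = ["fence", "sick_sheep", "water_trough", "tree", "sick_sheep"]
relevant = {"sick_sheep", "water_trough"}

print(assessment_efficiency(log, relevant))               # 0.6
print(signaled_objects(["fence", "sick_sheep", "tree"], relevant))  # {'sick_sheep'}
```

Under this framing, the study's hypothesis corresponds to signaled conditions yielding higher efficiency ratios and lower self-reported cognitive load than unsignaled ones.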
Keywords
Virtual worlds, Assessment, STEM education, Educational games, Instructional design