Investigating the feasibility of using assessment and explanatory feedback in desktop virtual reality simulations
There is great potential in making assessment and learning complementary. In this study, we investigated the feasibility of developing a desktop virtual reality (VR) laboratory simulation on the topic of genetics, with integrated assessment using multiple-choice (MC) questions based on item response theory (IRT) and feedback based on the cognitive theory of multimedia learning. A pre-test post-test design was used to investigate three research questions: (1) students’ perceptions of assessment in the form of MC questions within the VR genetics simulation; (2) the fit of the MC questions to the assumptions of the partial credit model (PCM) within the framework of IRT; and (3) whether there was a significant increase in intrinsic motivation, self-efficacy, and transfer from pre- to post-test after using the VR genetics simulation as a classroom learning activity. The sample consisted of 208 undergraduate students taking a medical genetics course. The results showed that 97% of the students perceived the gamified MC assessment items as leading to higher levels of understanding, and only 8% thought the items made the simulation more boring. The items within the simulation were found to fit the PCM, and the sample showed a small but significant increase in intrinsic motivation and self-efficacy and a large significant increase in transfer following the genetics simulation. It was thus possible to develop assessments for online educational material that retain the relevance and connectedness of informal assessment while simultaneously serving the communicative and credibility-based functions of formal assessment, which is a great challenge facing education today.
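For readers unfamiliar with the measurement model named in the abstract, the standard formulation of the partial credit model (not reproduced from this article) gives the probability that a person with ability θ scores in category x on a polytomous item i with m_i + 1 ordered categories:

```latex
% Partial credit model: probability of scoring in category x on item i,
% where \delta_{ik} are the item's step (threshold) parameters and the
% empty sum for x = 0 is defined as zero.
P(X_i = x \mid \theta) \;=\;
\frac{\exp\!\left(\sum_{k=1}^{x} (\theta - \delta_{ik})\right)}
     {\sum_{h=0}^{m_i} \exp\!\left(\sum_{k=1}^{h} (\theta - \delta_{ik})\right)},
\qquad x = 0, 1, \ldots, m_i .
```

Checking item fit to this model amounts to asking whether observed response patterns across ability levels are consistent with these category probability curves.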
Keywords: Simulations · Desktop virtual reality · Assessment · Explanatory feedback · Item response theory · Cognitive theory of multimedia learning · Retrieval practice
This research was funded by Innovation Fund Denmark.
Compliance with ethical standards
Conflict of interest
Mads Bonde is a co-founder of the simulation development company Labster that provided the simulation that was used in this study. Ainara Lopez Cordoba works at Labster. The remaining authors declare that they have no conflict of interest.
Ethical consent was obtained from all participants in accordance with the ethical regulations of the Health Research Ethics Committee in Denmark.