Towards Observing and Assessing Collaborative Learning Activities in Immersive Environments

  • Samah Felemban
  • Michael Gardner
  • Victor Callaghan
  • Anasol Pena-Rios
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 725)

Abstract

Immersive environments such as 3-D virtual worlds have shown great potential for learning because they offer features that distinguish them from other learning systems. They help explain difficult phenomena by visualising objects and the relationships between them. They also enable collaborative learning by connecting students in real time, which increases engagement and exploration. However, assessing learning outcomes in these environments poses several challenges, such as dynamically capturing and analysing users’ actions and translating those actions into learning evidence. This paper focuses on two significant aspects of the learning process: observation and assessment. It presents a virtual observation model that maps the observation of learners in physical classrooms onto the observation and assessment of students in 3-D virtual environments. It also describes the implementation of the observation model and provides examples of its application. Overall, the paper aims to enhance the learning affordances of 3-D virtual worlds by recording all in-world learning evidence and visualising students’ assessments to improve learning.
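
As an illustrative sketch only (not taken from the paper's implementation), the following Python fragment shows one way in-world actions could be captured as timestamped events and translated into learning evidence by simple rules; the event schema, rule names, and evidence labels are hypothetical assumptions.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Callable, Dict, List, Optional

    @dataclass
    class InWorldEvent:
        """A single captured avatar action inside the virtual world (hypothetical schema)."""
        student: str
        action: str                                   # e.g. "moved_object", "sent_chat"
        timestamp: datetime = field(default_factory=datetime.utcnow)

    # A rule inspects one event and returns an evidence label, or None if it does not apply.
    EvidenceRule = Callable[[InWorldEvent], Optional[str]]

    def communication_rule(event: InWorldEvent) -> Optional[str]:
        """Chat messages are taken as evidence of communication within the group."""
        return "communication" if event.action == "sent_chat" else None

    def task_rule(event: InWorldEvent) -> Optional[str]:
        """Manipulating objects is taken as evidence of engagement with the task."""
        return "task_engagement" if event.action == "moved_object" else None

    def observe(events: List[InWorldEvent], rules: List[EvidenceRule]) -> Dict[str, List[str]]:
        """Apply every rule to every captured event and collect evidence per student."""
        evidence: Dict[str, List[str]] = {}
        for event in events:
            for rule in rules:
                label = rule(event)
                if label is not None:
                    evidence.setdefault(event.student, []).append(label)
        return evidence

    if __name__ == "__main__":
        log = [
            InWorldEvent("alice", "moved_object"),
            InWorldEvent("bob", "sent_chat"),
            InWorldEvent("alice", "sent_chat"),
        ]
        print(observe(log, [communication_rule, task_rule]))
        # -> {'alice': ['task_engagement', 'communication'], 'bob': ['communication']}

In such a sketch, the rule set plays the role of the observation criteria an instructor would use in a physical classroom, and the collected evidence could feed a later visualisation or assessment step.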

Keywords

E-learning · Multi-user virtual environments · 3D virtual worlds · Assessment · Virtual observation · Collaborative learning · Learning evidence

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Samah Felemban (1, 2)
  • Michael Gardner (1)
  • Victor Callaghan (1)
  • Anasol Pena-Rios (1)
  1. University of Essex, Colchester, UK
  2. Umm al-Qura University, Makkah, Saudi Arabia
