
Exploring Indicators for Collaboration Quality and Its Dimensions in Classroom Settings Using Multimodal Learning Analytics

  • Conference paper
Responsive and Sustainable Educational Futures (EC-TEL 2023)

Abstract

Multimodal Learning Analytics researchers have explored relationships between collaboration quality and multimodal data. However, state-of-the-art research has scarcely investigated authentic settings and has seldom used video data, which can offer rich behavioral information. In this paper, we present our findings on potential indicators of collaboration quality and its underlying dimensions, such as argumentation and mutual understanding. We collected multimodal data (namely, video and logs) from four Estonian classrooms during authentic computer-supported collaborative learning activities. Our results show that vertical head movement (looking up and down) and mouth-region features could serve as potential indicators of collaboration quality and its aforementioned dimensions. Our clustering results also indicate the potential of video data for identifying different levels of collaboration quality (e.g., high, medium, low). These findings have implications for building systems that monitor and guide collaboration quality in authentic classroom settings.
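
As an illustration of the kind of analysis sketched in the abstract (not the authors' actual pipeline), the minimal Python example below clusters hypothetical per-group video features, such as mean vertical head rotation and a mouth-region activity score, into three groups that could then be interpreted as levels of collaboration quality. The feature values and the choice of k-means with three clusters are assumptions for illustration only.

```python
# Minimal sketch: clustering video-derived group features into three
# candidate collaboration-quality levels. All numbers are hypothetical;
# the paper's actual features, preprocessing, and clustering setup differ.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per group: [mean vertical head rotation, mouth-region activity]
features = np.array([
    [0.12, 0.45],
    [0.30, 0.80],
    [0.05, 0.20],
    [0.28, 0.75],
    [0.10, 0.40],
    [0.02, 0.15],
])

X = StandardScaler().fit_transform(features)  # put features on a common scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster ids, to be interpreted post hoc as e.g. low/medium/high
```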

Notes

  1. https://www.cotrack.website.

  2. https://www.etherpad.org.

  3. Head rotation along the x-axis (i.e., moving the head up and down).

  4. Head rotation along the y-axis (i.e., moving the head left and right); a sketch of recovering both angles from a head rotation matrix follows these notes.
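
For illustration only: notes 3 and 4 refer to head pitch (rotation about the x-axis) and yaw (rotation about the y-axis). Assuming a face tracker reports head pose as a 3x3 rotation matrix composed as R = Rx(pitch)·Ry(yaw)·Rz(roll), the two angles could be recovered as in the sketch below; this is a generic conversion, not the paper's implementation.

```python
# Hypothetical sketch: recovering pitch (rotation about the x-axis, up/down)
# and yaw (rotation about the y-axis, left/right) from a 3x3 head rotation
# matrix, assuming the composition R = Rx(pitch) @ Ry(yaw) @ Rz(roll).
import numpy as np

def pitch_yaw_from_rotation(R: np.ndarray) -> tuple[float, float]:
    """Return (pitch, yaw) in degrees for the assumed XYZ composition."""
    pitch = np.degrees(np.arctan2(-R[1, 2], R[2, 2]))  # head up/down
    yaw = np.degrees(np.arcsin(R[0, 2]))               # head left/right
    return float(pitch), float(yaw)

# Identity matrix: head facing straight ahead, so both angles are zero.
print(pitch_yaw_from_rotation(np.eye(3)))  # (0.0, 0.0)
```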

Acknowledgements

The presented work has been partially funded by the Estonian Research Council’s Personal Research Grant (PRG) under grant number PRG1634. It has also been supported by grant RYC2021-032273-I, financed by MCIN/AEI/10.13039/501100011033 and the European Union’s “NextGenerationEU/PRTR”.

Author information

Corresponding author

Correspondence to Pankaj Chejara.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Chejara, P. et al. (2023). Exploring Indicators for Collaboration Quality and Its Dimensions in Classroom Settings Using Multimodal Learning Analytics. In: Viberg, O., Jivet, I., Muñoz-Merino, P., Perifanou, M., Papathoma, T. (eds) Responsive and Sustainable Educational Futures. EC-TEL 2023. Lecture Notes in Computer Science, vol 14200. Springer, Cham. https://doi.org/10.1007/978-3-031-42682-7_5

  • DOI: https://doi.org/10.1007/978-3-031-42682-7_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-42681-0

  • Online ISBN: 978-3-031-42682-7

  • eBook Packages: Computer Science, Computer Science (R0)
