Development and Experiment of Classroom Engagement Evaluation Mechanism During Real-Time Online Courses

  • Conference paper
  • First Online:
Artificial Intelligence in Education (AIED 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13916)

Abstract

Student engagement is an essential indicator of teaching quality. During real-time online classes, however, teachers must deliver course content and observe students’ reactions simultaneously. Here, we aim to develop a web-based classroom evaluation system for teachers that evaluates student participation automatically. In this study, we present a novel mechanism that evaluates student participation based on multi-reaction estimation. The system estimates students’ head poses and facial expressions through the camera and uses the estimation results as criteria for assessing participation. Additionally, we conducted two evaluation experiments to demonstrate the system’s effectiveness in a real-time online classroom environment with nine students. The instruction experiment covered head pose estimation, facial expression recognition, and multi-reaction estimation; the accuracy rates were 70.0%, 60.0%, and 80.0%, respectively. Although participants showed few head poses in the simulated classroom experiment, the system evaluated the classroom through expression recognition, and 70% of the questions yielded the same results as a questionnaire.
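As a rough illustration of the mechanism the abstract describes, the sketch below aggregates per-frame facial-expression probabilities and head-pose angles into a participation ratio. The thresholds, the positive-expression set, and the data shapes are assumptions made for illustration, not the authors’ actual parameters.

```javascript
// Illustrative sketch only: combine per-frame reaction estimates
// into a single participation ratio. The angle thresholds and the
// POSITIVE set below are assumptions, not the paper's parameters.
const POSITIVE = ["happy", "surprised", "neutral"];

// frames: [{ expressions: { happy: 0.8, ... }, headPose: { pitch, yaw } }]
// Returns the fraction of frames judged "engaged".
function participationScore(frames) {
  let engaged = 0;
  for (const f of frames) {
    // Facing the screen: only a small head rotation away from the camera.
    const facing =
      Math.abs(f.headPose.yaw) < 30 && Math.abs(f.headPose.pitch) < 20;
    // Dominant expression = the highest-probability label.
    const top = Object.entries(f.expressions).sort(
      (a, b) => b[1] - a[1]
    )[0][0];
    if (facing && POSITIVE.includes(top)) engaged += 1;
  }
  return frames.length ? engaged / frames.length : 0;
}
```

A per-student score like this could then be averaged over the class and over time windows to give the teacher a live engagement indicator.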


Notes

  1. https://github.com/justadudewhohacks/face-api.js/.
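The footnoted face-api.js library performs in-browser face detection and expression recognition of the kind the system relies on. The sketch below shows one plausible wiring using the library’s public API (`loadFromUri`, `detectAllFaces`, `withFaceExpressions`); the model path and the overall pipeline are assumptions, and head-pose estimation is handled separately.

```javascript
// Plausible browser-side wiring for face-api.js (footnote 1).
// `faceapi` is the library's browser global; "/models" is a
// placeholder path to the pretrained model weights.
async function estimateExpressions(videoEl) {
  await faceapi.nets.tinyFaceDetector.loadFromUri("/models");
  await faceapi.nets.faceExpressionNet.loadFromUri("/models");
  const detections = await faceapi
    .detectAllFaces(videoEl, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();
  // One dominant expression label per detected face.
  return detections.map((d) => dominantExpression(d.expressions));
}

// Pure helper: pick the highest-probability expression label from the
// seven categories face-api.js reports (neutral, happy, sad, angry,
// fearful, disgusted, surprised).
function dominantExpression(scores) {
  return Object.entries(scores).reduce((best, cur) =>
    cur[1] > best[1] ? cur : best
  )[0];
}
```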


Acknowledgements

This work was supported in part by JSPS KAKENHI Grant Numbers JP19K12266 and JP22K18006.

Author information

Corresponding author

Correspondence to Yanyi Peng.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Peng, Y., Kikuchi, M., Ozono, T. (2023). Development and Experiment of Classroom Engagement Evaluation Mechanism During Real-Time Online Courses. In: Wang, N., Rebolledo-Mendez, G., Matsuda, N., Santos, O.C., Dimitrova, V. (eds) Artificial Intelligence in Education. AIED 2023. Lecture Notes in Computer Science, vol 13916. Springer, Cham. https://doi.org/10.1007/978-3-031-36272-9_48

  • DOI: https://doi.org/10.1007/978-3-031-36272-9_48

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-36271-2

  • Online ISBN: 978-3-031-36272-9

  • eBook Packages: Computer Science (R0)
