
Temporal Generalizability of Face-Based Affect Detection in Noisy Classroom Environments

  • Conference paper
  • First Online:
Artificial Intelligence in Education (AIED 2015)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9112)

Included in the following conference series: International Conference on Artificial Intelligence in Education (AIED)

Abstract

The goal of this paper was to explore the possibility of generalizing face-based affect detectors across multiple days, a problem that plagues physiology-based affect detection. Videos of students playing an educational physics game were collected in a noisy computer-enabled classroom environment where students conversed with each other, moved around, and gestured. Trained observers provided real-time annotations of learning-centered affective states (e.g., boredom, confusion) as well as off-task behavior. Detectors were trained using data from one day and tested on data from different students on another day. These cross-day detectors demonstrated above-chance classification accuracy, with an average Area Under the ROC Curve (AUC; .500 is chance level) of .658, which was similar to the within-day AUC of .667 (training and testing on data collected on the same day). This work demonstrates the feasibility of generalizing face-based affect detectors across time in an ecologically valid computer-enabled classroom environment.
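
The cross-day evaluation protocol summarized above amounts to training a detector on instances from one class day and computing AUC on instances from a different day with different students. The Python sketch below is a hypothetical illustration of that protocol only, using scikit-learn and synthetic stand-in data; it is not the authors' features, classifier, or pipeline (in the study, features came from facial expression tracking and labels from real-time classroom observations).

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in data: per-instance feature vectors and binary
    # affect labels (e.g., confused vs. not confused) for two class days.
    rng = np.random.default_rng(0)
    X_day1, y_day1 = rng.normal(size=(400, 20)), rng.integers(0, 2, 400)
    X_day2, y_day2 = rng.normal(size=(350, 20)), rng.integers(0, 2, 350)

    # Cross-day protocol: fit on one day, score on the other day
    # (different students), and report AUC (.500 = chance level).
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_day1, y_day1)
    auc = roc_auc_score(y_day2, model.predict_proba(X_day2)[:, 1])
    print(f"Cross-day AUC: {auc:.3f}")

Within-day performance would be estimated the same way, but with training and test instances drawn from the same day (e.g., via cross-validation over students on that day).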



Author information

Correspondence to Nigel Bosch.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Bosch, N., D’Mello, S., Baker, R., Ocumpaugh, J., Shute, V. (2015). Temporal Generalizability of Face-Based Affect Detection in Noisy Classroom Environments. In: Conati, C., Heffernan, N., Mitrovic, A., Verdejo, M. (eds) Artificial Intelligence in Education. AIED 2015. Lecture Notes in Computer Science (LNAI), vol. 9112. Springer, Cham. https://doi.org/10.1007/978-3-319-19773-9_5


  • DOI: https://doi.org/10.1007/978-3-319-19773-9_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19772-2

  • Online ISBN: 978-3-319-19773-9

  • eBook Packages: Computer Science, Computer Science (R0)
