It’s Written on Your Face: Detecting Affective States from Facial Expressions while Learning Computer Programming
We built detectors capable of automatically recognizing affective states of novice computer programmers from student-annotated videos of their faces recorded during an introductory programming tutoring session. We used the Computer Expression Recognition Toolbox (CERT) to track facial features based on the Facial Action Coding System, and machine learning techniques to build classification models. Confusion/Uncertainty and Frustration were distinguished from all other affective states in a student-independent fashion at levels above chance (Cohen’s kappa = .22 and .23, respectively), but detection accuracies for Boredom, Flow/Engagement, and Neutral were lower (kappas = .04, .11, and .07). We discuss the differences between detection of spontaneous versus fixed (polled) judgments as well as the features used in the models.
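The detection accuracies above are reported as Cohen's kappa, which corrects raw agreement between the detector and the student annotations for agreement expected by chance. A minimal sketch of that computation (not the paper's evaluation code; the label names below are illustrative):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two label sequences of equal length."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items the two sources label identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if the two sources assigned labels independently,
    # each according to its own marginal label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy example: student self-annotations vs. detector output.
annotated = ["frustration", "flow", "flow", "confusion", "flow", "boredom"]
detected  = ["frustration", "flow", "boredom", "confusion", "flow", "flow"]
print(round(cohens_kappa(annotated, detected), 3))  # 0.5
```

A kappa of 0 means chance-level agreement and 1 means perfect agreement, so values like .22 and .23 reflect modest but above-chance detection, which is typical for spontaneous affect judgments.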
Keywords: Affective State · Facial Feature · Cognitive Science Society · Facial Action Coding System · Affect Judgment