Affect Detection from Human-Computer Dialogue with an Intelligent Tutoring System

  • Sidney D’Mello
  • Art Graesser
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4133)


We investigated the possibility of detecting affect from natural language dialogue in an effort to endow an intelligent tutoring system, AutoTutor, with the ability to incorporate the learner’s affect into its pedagogical strategies. Training and validation data were collected in a study in which college students completed a learning session with AutoTutor; the learners’ affective states were subsequently identified by the learner, a peer, and two trained judges. We analyzed each of these four sets of affect judgments together with several dialogue features mined from AutoTutor’s log files. Multiple regression analyses confirmed that the dialogue features significantly predicted particular affective states (boredom, confusion, flow, and frustration). A variety of standard classifiers were then applied to the dialogue features to assess the accuracy of discriminating each affective state from the baseline state of neutral.
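As a rough illustration of the two analyses summarized above, the sketch below pairs dialogue-level features with (1) a multiple regression predicting a continuous affect judgment and (2) a binary classifier separating one affective state from neutral. This is not the authors’ code: the feature names, the synthetic data, and the use of scikit-learn are illustrative assumptions only; the study itself used features mined from AutoTutor’s log files and standard off-the-shelf classifiers.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): regress an affect
# judgment on dialogue features, then classify one state against neutral.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical dialogue features of the kind mined from tutor log files
# (e.g., LSA-based answer quality, response time, turn length, tutor feedback).
X = rng.normal(size=(n, 4))

# (1) Multiple regression: predict a continuous affect judgment (e.g., a confusion rating).
confusion_rating = 0.6 * X[:, 0] - 0.4 * X[:, 1] + rng.normal(scale=0.5, size=n)
reg = LinearRegression().fit(X, confusion_rating)
print("R^2 of regression on confusion ratings:", round(reg.score(X, confusion_rating), 3))

# (2) Classification: discriminate 'confused' turns from 'neutral' turns.
labels = (confusion_rating > np.median(confusion_rating)).astype(int)  # 1 = confused, 0 = neutral
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, labels, cv=5, scoring="accuracy").mean()
print("Cross-validated accuracy (confused vs. neutral):", round(acc, 3))
```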


Keywords: Affective State · Latent Semantic Analysis · Intelligent Tutoring System · Facial Action Coding System · Emotion Judgment


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Sidney D’Mello (1)
  • Art Graesser (2)
  1. Department of Computer Science, The University of Memphis, Memphis, USA
  2. Department of Psychology, The University of Memphis, Memphis, USA
