Towards Knowledge-Based Affective Interaction: Situational Interpretation of Affect

  • Abdul Rehman Abbasi
  • Takeaki Uno
  • Matthew N. Dailey
  • Nitin V. Afzulpurkar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4738)

Abstract

Human-computer interaction in a variety of applications could benefit if systems could accurately analyze and respond to their users’ affect. Although a great deal of research has been conducted on affect recognition, very little of this work has considered what information is appropriate to extract in specific situations. Towards understanding how specific applications such as affective tutoring and affective entertainment could benefit, we present two experiments. In the first experiment, we found that students’ facial expressions, together with their body actions, gave little information about their internal emotion per se, but would be useful features for predicting their self-reported “true” mental state. In the second experiment, we found significant differences between the facial expressions and the self-reported affective states of viewers watching a movie sequence. Our results suggest that the noisy relationship between observable gestures and underlying affect must be accounted for when designing affective computing applications.

Keywords

Affective tutoring · affective entertainer · facial expression analysis · gesture analysis · situation-specific affect interpretation


References

  1. Pantic, M., Sebe, N., Cohn, J.F., Huang, T.: Affective multimodal human-computer interaction. In: Proc. ACM Int. Conf. on Multimedia, pp. 669–676. ACM Press, New York (2005)
  2. Picard, R.W.: Affective Computing. MIT Press, Cambridge (2000)
  3. Prendinger, H., Becker, C., Ishizuka, M.: A study in users’ physiological response to an empathic interface agent. Int. J. of Humanoid Robotics 3(3), 371–391 (2006)
  4. Kapoor, A., Burleson, W., Picard, R.W.: Automatic prediction of frustration. International Journal of Human-Computer Studies (in press, 2007)
  5. Darwin, C.: The Expression of the Emotions in Man and Animals. Oxford University Press, Oxford (1998)
  6. Landis, C.: Studies of emotional reactions: I. A preliminary study of facial expression. Journal of Experimental Psychology 8 (1924)
  7. Ekman, P., Friesen, W.: Constants across cultures in the face and emotion. Journal of Personality and Social Psychology 17(2), 124–129 (1971)
  8. Russell, J.A., Bullock, M.: Fuzzy concepts and the perception of emotion in facial expressions. Social Cognition 4, 309–341 (1986)
  9. Ekman, P., Friesen, W.: Facial Action Coding System (FACS) Manual. Consulting Psychologists Press, Palo Alto, CA (1978)
  10. Fasel, B., Luettin, J.: Automatic facial expression analysis: A survey. Pattern Recognition 36(1), 259–275 (2003)
  11. Pantic, M., Rothkrantz, L.J.M.: Automatic analysis of facial expressions: The state of the art. IEEE Trans. Pattern Analysis and Machine Intelligence 22(12), 1424–1445 (2000)
  12. Tian, Y., Kanade, T., Cohn, J.F.: Facial expression analysis. In: Li, S.Z., Jain, A.K. (eds.) Handbook of Face Recognition. Springer, Berlin (2005)
  13. Kim, K.K., Kwak, K.C., Chi, S.Y.: Gesture analysis for human-robot interaction. In: Proc. 8th International Conference on Advanced Communication Technology (ICACT), pp. 1824–1827 (2006)
  14. Dadgostar, F., Ryu, H., Sarrafzadeh, A., Overmyer, S.P.: Making sense of student use of non-verbal cues for intelligent tutoring systems. In: Proc. OZCHI (2005)
  15. Tao, J., Tan, T.: Affective computing: A review. In: Proc. 1st International Conference on Affective Computing and Intelligent Interaction, pp. 981–995 (2005)
  16.
  17. Ekman, P., Friesen, W.: Pictures of Facial Affect. Consulting Psychologists Press, Palo Alto, CA (1975)
  18. Matsumoto, D., Ekman, P.: American-Japanese cultural differences in intensity ratings of facial expressions of emotion. Motivation and Emotion 13(2), 143–157 (1989)
  19. Cohn-Kanade AU-Coded Facial Expression Database (CMU Database), http://www.cs.cmu.edu/~face/index2.htm
  20. Sun, Y., Sebe, N., Lew, M.S., Gevers, T.: Authentic emotion detection in real-time video. In: Sebe, N., Lew, M.S., Huang, T.S. (eds.) Computer Vision in Human-Computer Interaction. LNCS, vol. 3058, pp. 94–104. Springer, Heidelberg (2004)
  21.
  22. Douglas-Cowie, E., Cowie, R., Schröder, M.: A new emotion database: Considerations, sources and scope. In: Proc. ISCA ITRW on Speech and Emotion, pp. 39–44 (2000)
  23. Picard, R.W., Vyzas, E., Healey, J.: Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Trans. Pattern Analysis and Machine Intelligence 23(10), 1175–1191 (2001)
  24.
  25. Scherer, K.R.: What are emotions? And how can they be measured? Social Science Information 44(4), 695–729 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Abdul Rehman Abbasi (1)
  • Takeaki Uno (2)
  • Matthew N. Dailey (1)
  • Nitin V. Afzulpurkar (1)
  1. Asian Institute of Technology, Bangkok, Thailand
  2. National Institute of Informatics, Tokyo, Japan
