Towards Unsupervised Detection of Affective Body Posture Nuances

  • P. Ravindra De Silva
  • Andrea Kleinsmith
  • Nadia Bianchi-Berthouze
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3784)

Abstract

Recently, researchers have modeled between three and nine discrete emotions to build affective recognition systems. In everyday life, however, humans use a rich and powerful language to describe a large variety of affective states. One of the challenging issues in affective computing is therefore to give computers the ability to recognize a variety of affective states using unsupervised methods. To explore this possibility, we describe affective postures representing four emotion categories using low-level descriptors. We applied multivariate analysis to recognize and categorize these postures into nuances of these categories. The results show that low-level posture features may be used for this purpose, leaving the naming issue to interactive processes.
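The sketch below illustrates the general idea of the abstract, not the authors' actual pipeline: posture samples are represented as vectors of hypothetical low-level descriptors (e.g., joint angles), and an unsupervised Gaussian mixture model groups them into candidate nuances, with the number of groups chosen by BIC. The data, feature names, and model choice are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the paper's implementation): unsupervised grouping
# of affective postures described by low-level numeric descriptors.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder data: 200 postures x 24 hypothetical low-level descriptors.
postures = rng.normal(size=(200, 24))

# Standardize descriptors so no single feature dominates the cluster structure.
features = StandardScaler().fit_transform(postures)

# Fit mixtures with varying numbers of components and keep the best by BIC,
# letting the data suggest how many posture "nuances" are present.
best_model, best_bic = None, np.inf
for k in range(2, 10):
    gmm = GaussianMixture(n_components=k, covariance_type="full", random_state=0)
    gmm.fit(features)
    bic = gmm.bic(features)
    if bic < best_bic:
        best_model, best_bic = gmm, bic

labels = best_model.predict(features)
print(f"Selected {best_model.n_components} clusters (BIC={best_bic:.1f})")
```

With real posture descriptors, the resulting cluster labels would correspond to unnamed groupings of postures; assigning affective names to them is left to an interactive process, as the abstract notes.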



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • P. Ravindra De Silva (1)
  • Andrea Kleinsmith (1)
  • Nadia Bianchi-Berthouze (1)
  1. Database Systems Laboratory, University of Aizu, Aizu Wakamatsu, Japan
