Computing and Evaluating the Body Laughter Index

  • Maurizio Mancini
  • Giovanna Varni
  • Donald Glowinski
  • Gualtiero Volpe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7559)

Abstract

The EU-ICT FET Project ILHAIRE aims at endowing machines with automated detection, analysis, and synthesis of laughter. This paper describes the Body Laughter Index (BLI), an index for the automated detection of laughter from the analysis of body movement captured by a video source. The BLI algorithm is described, and the index is computed on a corpus of videos. An assessment of the algorithm by means of subjects' ratings is also presented. Results show that BLI can successfully distinguish between different videos of laughter, although improvements are needed with respect to subjects' perception, multimodal fusion, cultural aspects, and generalization to a broad range of social contexts.
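The abstract describes BLI as an index computed from body movement in a video source, but the feature set itself is only given in the full paper. Purely as an illustrative, hypothetical sketch (not the authors' method), a video-based movement index could combine a frame-differencing motion-energy estimate with a periodicity measure, here the fraction of spectral power falling in an assumed laughter-typical frequency band; the band limits and all function names below are assumptions for illustration only:

```python
import numpy as np

def motion_energy(frames):
    """Mean absolute frame difference: a crude proxy for the quantity of
    body movement. `frames` is a (T, H, W) array of grayscale frames."""
    diffs = np.abs(np.diff(frames, axis=0))      # (T-1, H, W) per-pixel change
    return diffs.mean(axis=(1, 2))               # one energy value per transition

def laughter_index(frames, fps=25.0, band=(2.0, 8.0)):
    """Hypothetical index: fraction of motion-energy spectral power inside
    an assumed laughter-typical frequency band (here 2-8 Hz, an assumption,
    not a value from the paper). Returns a value in [0, 1]."""
    e = motion_energy(frames)
    e = e - e.mean()                             # drop the DC component
    spectrum = np.abs(np.fft.rfft(e)) ** 2       # power spectrum of the energy signal
    freqs = np.fft.rfftfreq(len(e), d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum()
    return float(spectrum[in_band].sum() / total) if total > 0 else 0.0
```

On synthetic input, a sequence whose brightness oscillates at a few hertz scores higher than one with only random flicker, which is the qualitative behavior one would want from a periodicity-sensitive movement index.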


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Maurizio Mancini (1)
  • Giovanna Varni (1)
  • Donald Glowinski (1)
  • Gualtiero Volpe (1)
  1. InfoMus – University of Genoa, Italy