Emotion Recognition Using Physiological and Speech Signal in Short-Term Observation

  • Jonghwa Kim
  • Elisabeth André
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4021)

Abstract

Recently, there has been a significant amount of work on the recognition of emotions from visual, verbal, or physiological information. Most approaches to emotion recognition so far, however, concentrate on a single modality, while work on the integration of multimodal information, in particular on fusing physiological signals with verbal or visual data, is scarce. In this paper, we analyze various methods for fusing physiological and vocal information and compare the recognition results of the bimodal approach with those of the unimodal approaches.
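The abstract contrasts unimodal recognition with the fusion of physiological and vocal information. As a rough illustration only (the paper does not publish code; the feature dimensions, the SVM classifier, and the averaging rule below are assumptions, not the authors' pipeline), the following sketch contrasts feature-level fusion, which concatenates the physiological and speech feature vectors before training a single classifier, with decision-level fusion, which trains one classifier per modality and combines their class-posterior estimates.

```python
# Hypothetical sketch of feature-level vs. decision-level fusion of
# physiological and speech features; data, dimensions, and classifier
# choice are illustrative assumptions, not the authors' actual method.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy data: one row per short-term observation window.
n = 200
phys_features = rng.normal(size=(n, 12))    # e.g. HRV, EMG, skin-conductance statistics (assumed)
speech_features = rng.normal(size=(n, 20))  # e.g. pitch, energy, MFCC statistics (assumed)
labels = rng.integers(0, 4, size=n)         # four emotion classes (assumed)

split = int(0.8 * n)

# Feature-level fusion: concatenate both modalities, train one classifier.
fused = np.hstack([phys_features, speech_features])
clf_fused = SVC(probability=True).fit(fused[:split], labels[:split])
acc_feature_fusion = clf_fused.score(fused[split:], labels[split:])

# Decision-level fusion: one classifier per modality,
# combine class posteriors (here simply averaged).
clf_phys = SVC(probability=True).fit(phys_features[:split], labels[:split])
clf_speech = SVC(probability=True).fit(speech_features[:split], labels[:split])
posterior = (clf_phys.predict_proba(phys_features[split:])
             + clf_speech.predict_proba(speech_features[split:])) / 2
predicted = clf_phys.classes_[posterior.argmax(axis=1)]
acc_decision_fusion = np.mean(predicted == labels[split:])

print(acc_feature_fusion, acc_decision_fusion)
```

With real feature vectors in place of the random toy data, the two accuracy figures correspond to the kind of bimodal-versus-unimodal comparison the paper reports.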

Keywords

Heart Rate Variability · Speech Signal · Emotion Recognition · Recognition Accuracy · Spectral Entropy

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jonghwa Kim (1)
  • Elisabeth André (1)
  1. Institute of Computer Science, University of Augsburg, Germany