
Biosignal Based Emotion Analysis of Human-Agent Interactions

  • Conference paper
Cross-Modal Analysis of Speech, Gestures, Gaze and Facial Expressions

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5641)

Abstract

A two-phase procedure based on biosignal recordings is applied in an attempt to classify the emotional valence content of human-agent interactions. In the first phase, participants are exposed to a sample of pictures with known valence values (taken from the IAPS dataset), and classifiers are trained on selected features of the recorded biosignals. In the second phase, biosignals are recorded for each participant while they watch video clips of interactions with a female and a male ECA. The classifiers trained in the first phase are then applied, and the two interfaces are compared on the basis of the classified emotional responses to the video clips. The results obtained are promising and are discussed together with the problems encountered and suggestions for possible future improvements.
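To make the two-phase procedure concrete, the sketch below shows one hypothetical way such a pipeline could be wired up in Python with scikit-learn: a per-participant classifier is trained on biosignal features from the IAPS calibration phase and then applied to features computed from the ECA video-clip phase. The file names, the binary valence labels, and the choice of a linear SVM are illustrative assumptions, not the authors' implementation; any classifier exposing fit/predict could be substituted, the essential point being that the model is fitted only on the calibration data.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Phase 1: features extracted from the biosignals recorded while one
    # participant views the IAPS pictures; labels are the pictures' known
    # valence classes. File names and label coding are hypothetical.
    X_iaps = np.load("features_iaps.npy")      # shape: (n_pictures, n_features)
    y_valence = np.load("valence_labels.npy")  # e.g. 0 = negative, 1 = positive

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    clf.fit(X_iaps, y_valence)                 # per-participant training

    # Phase 2: apply the trained classifier to features computed from the
    # biosignals recorded while the same participant watches the clips of
    # interactions with the female and male ECAs.
    X_eca = np.load("features_eca.npy")        # one row per video clip
    print(clf.predict(X_eca))                  # predicted valence per clip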


References

  1. Bradley, M.: Emotion and motivation. In: Cacioppo, J.T., Tassinary, L.G., Berntson, G.G. (eds.) Handbook of Psychophysiology. Cambridge University Press, Cambridge (2000)

  2. Levenson, R.: Autonomic nervous system differences among emotions. Psychological Science 3(1) (1992)

  3. Mandryk, R., Atkins, M.: A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies. International Journal of Human-Computer Studies 65(4), 329–347 (2006)

  4. Benedek, J., Hazlett, R.: Incorporating facial EMG emotion measures as feedback in the software design process. In: Proc. Human Computer Interaction Consortium (2005)

  5. Ward, R., Marsden, P.: Physiological responses to different WEB page designs. International Journal of Human-Computer Studies 59, 199–212 (2003)

  6. Wilson, G., Sasse, M.: Do users always know what’s good for them? Utilizing physiological responses to assess media quality. In: McDonald, S., Waern, Y., Cockton, G. (eds.) People and Computers XIV – Usability or Else! Proceedings of HCI 2000, Sunderland, UK, September 5–8. Springer, Heidelberg (2000)

  7. Haag, A., Goronzy, S., Schaich, P., Williams, J.: Emotion recognition using bio-sensors: First steps towards an automatic system. In: André, E., Dybkjær, L., Minker, W., Heisterkamp, P. (eds.) ADS 2004. LNCS, vol. 3068, pp. 36–48. Springer, Heidelberg (2004)

  8. Kim, J., André, E.: Emotion recognition based on physiological changes in music listening. IEEE Transactions on Pattern Analysis and Machine Intelligence 30(12), 2067–2083 (2008)

  9. Wagner, J., Kim, J., André, E.: From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification. In: Proc. IEEE International Conference on Multimedia & Expo (2005)

  10. Nasoz, F., Alvarez, K., Lisetti, C., Finkelstein, N.: Emotion recognition from physiological signals for presence technologies. International Journal of Cognition, Technology, and Work – Special Issue on Presence 6(1) (2003)

  11. Krenn, B.: RASCALLI. Responsive Artificial Situated Cognitive Agents Living and Learning on the Internet. In: Proc. International Conference on Cognitive Systems, University of Karlsruhe, Karlsruhe, Germany, April 2–4 (2008)

  12. Picard, R.W., Vyzas, E., Healey, J.: Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence 23(10), 1175–1191 (2001)

  13. Christie, I.C., Friedman, B.H.: Autonomic specificity of discrete emotion and dimensions of affective space: A multivariate approach. International Journal of Psychophysiology 51, 143–153 (2004)

  14. Lang, P.J., Bradley, M.M., Cuthbert, B.N.: International Affective Picture System (IAPS): Digitized photographs, instruction manual and affective ratings. Technical Report A-6, University of Florida, Gainesville, FL (2005)

  15. Tassinary, L., Cacioppo, J., Geen, T.: A psychometric study of surface electrode placements for facial electromyographic recording: I. The brow and cheek muscle regions. Psychophysiology 26(1), 1–16 (1989)

  16. Wagner, J.: Augsburg Biosignal Toolbox (AuBT): User Guide (2005)

  17. Cunningham, P.: Dimension reduction. Technical Report UCD-CSI-2007-7, University College Dublin (2007)

  18. Fukunaga, K.: Introduction to Statistical Pattern Recognition. Academic Press, London (1990)

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hristova, E., Grinberg, M., Lalev, E. (2009). Biosignal Based Emotion Analysis of Human-Agent Interactions. In: Esposito, A., Vích, R. (eds.) Cross-Modal Analysis of Speech, Gestures, Gaze and Facial Expressions. Lecture Notes in Computer Science (LNAI), vol. 5641. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03320-9_7

  • DOI: https://doi.org/10.1007/978-3-642-03320-9_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-03319-3

  • Online ISBN: 978-3-642-03320-9

  • eBook Packages: Computer Science, Computer Science (R0)
