Visual-Based Emotion Detection for Natural Man-Machine Interaction

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5243)

Abstract

The demand for humanoid robots as service robots for everyday life has increased in recent years. The processing power of modern hardware and the development of complex software applications allow the realization of “natural” human-robot interaction. One important topic of natural interaction is the detection of emotions, which enables the robot to react appropriately to the emotional state of its communication partner. Humanoid robots designed for natural interaction require both a short response time and reliable detection. In this paper we introduce an emotion detection system that combines a Haar cascade classifier with a contrast filter. The detected feature points are then used to estimate the emotional state by means of so-called action units. Experiments with the humanoid robot ROMAN demonstrate the performance of the proposed approach.
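
As an illustration of the pipeline sketched in the abstract, the following minimal Python sketch (not the authors' implementation) runs an OpenCV Haar cascade face detector and then applies a simple gradient-based contrast threshold as a stand-in for the paper's contrast filter; the cascade file, the threshold value, and the image path are illustrative assumptions. The mapping of the resulting feature points onto action units is omitted.

# Minimal sketch: Haar cascade face detection followed by a simple
# contrast (gradient-magnitude) filter. Not the authors' implementation;
# the cascade file, threshold, and image path are illustrative assumptions.
import cv2
import numpy as np

# Frontal-face cascade shipped with the opencv-python package.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(gray):
    # Returns a list of (x, y, w, h) face rectangles.
    return face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def high_contrast_points(face_roi, threshold=40.0):
    # Crude contrast filter: keep pixels whose local gradient magnitude
    # exceeds a threshold and treat them as candidate feature points.
    grad_x = cv2.Sobel(face_roi, cv2.CV_32F, 1, 0)
    grad_y = cv2.Sobel(face_roi, cv2.CV_32F, 0, 1)
    magnitude = cv2.magnitude(grad_x, grad_y)
    ys, xs = np.where(magnitude > threshold)
    return list(zip(xs.tolist(), ys.tolist()))

if __name__ == "__main__":
    frame = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)  # placeholder image
    for (x, y, w, h) in detect_faces(frame):
        points = high_contrast_points(frame[y:y + h, x:x + w])
        print(len(points), "candidate feature points in face at", (x, y, w, h))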

Editor information

Andreas R. Dengel, Karsten Berns, Thomas M. Breuel, Frank Bomarius, Thomas R. Roth-Berghofer

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Strupp, S., Schmitz, N., Berns, K. (2008). Visual-Based Emotion Detection for Natural Man-Machine Interaction. In: Dengel, A.R., Berns, K., Breuel, T.M., Bomarius, F., Roth-Berghofer, T.R. (eds) KI 2008: Advances in Artificial Intelligence. KI 2008. Lecture Notes in Computer Science (LNAI), vol. 5243. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-85845-4_44

  • DOI: https://doi.org/10.1007/978-3-540-85845-4_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-85844-7

  • Online ISBN: 978-3-540-85845-4

  • eBook Packages: Computer Science (R0)
