Physiological Sensing for Affective Computing


Abstract

Sensing affective states is a challenging undertaking, and a variety of approaches exist. One of the oldest, yet often neglected, ways is to observe physiological processes related to sympathetic activity of the autonomic nervous system. One reason physiological measurements are not widely used in emotion-related HCI research or affective applications is the lack of appropriate sensing devices: existing systems often do not live up to the demands of the real-life settings in which such applications are to be used. This chapter starts with a short overview of the physiological processes that affective applications rely on today and of commonly used techniques to access them. Requirements that affective applications place on physiological sensors are then worked out. Finally, a design concept meeting these requirements is presented, and exemplary implementations, including evaluation results, are described.
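
To make the kind of measurement concrete, the short Python sketch below derives two indicators of sympathetic arousal that physiological sensing commonly relies on: mean heart rate from inter-beat intervals, and skin conductance level together with a naive count of skin conductance responses. This is an illustration only, not code from the chapter; the sampling rate, response threshold, and example data are assumptions.

# Illustrative sketch (not from the chapter): derive two common indicators
# of sympathetic arousal from raw physiological samples. Sampling rate,
# threshold, and the synthetic input values are assumptions.

from statistics import mean

def heart_rate_bpm(ibi_ms):
    """Mean heart rate in beats per minute from inter-beat intervals (ms)."""
    return 60000.0 / mean(ibi_ms)

def scl_and_scr_count(eda_uS, fs_hz=32, scr_threshold_uS=0.05):
    """Skin conductance level (mean, microsiemens) and a naive count of
    skin conductance responses: points where the signal rises by more than
    scr_threshold_uS within one second."""
    scl = mean(eda_uS)
    window = fs_hz  # one-second comparison window
    scr_count = 0
    i = 0
    while i + window < len(eda_uS):
        if eda_uS[i + window] - eda_uS[i] > scr_threshold_uS:
            scr_count += 1
            i += window  # skip past this response before searching on
        else:
            i += 1
    return scl, scr_count

if __name__ == "__main__":
    # Synthetic example data (assumed values, for illustration only)
    ibi_ms = [820, 790, 805, 760, 745, 770]              # inter-beat intervals
    eda_uS = [2.0 + 0.002 * k for k in range(10 * 32)]   # slowly rising EDA trace
    print(f"Heart rate: {heart_rate_bpm(ibi_ms):.1f} bpm")
    scl, scrs = scl_and_scr_count(eda_uS)
    print(f"Skin conductance level: {scl:.2f} uS, detected SCRs: {scrs}")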

Copyright information

© 2009 Springer-Verlag London Limited

About this chapter

Cite this chapter

Peter, C., Ebert, E., Beikirch, H. (2009). Physiological Sensing for Affective Computing. In: Tao, J., Tan, T. (eds) Affective Information Processing. Springer, London. https://doi.org/10.1007/978-1-84800-306-4_16

  • DOI: https://doi.org/10.1007/978-1-84800-306-4_16

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84800-305-7

  • Online ISBN: 978-1-84800-306-4
