Abstract
Sensing affective states is a challenging undertaking, and a variety of approaches exist. One of the oldest, yet often neglected, ways is to observe physiological processes related to sympathetic activity of the autonomic nervous system. One reason physiological measurements are rarely used in emotion-related HCI research or affective applications is the lack of appropriate sensing devices: existing systems often do not live up to the demands of the real-life settings in which such applications are to be used. This chapter starts with a short overview of the physiological processes that affective applications rely on today and of the techniques commonly used to access them. Requirements that affective applications place on physiological sensors are then worked out, a design concept meeting these requirements is drawn up, and exemplary implementations, including evaluation results, are described.
Copyright information
© 2009 Springer-Verlag London Limited
Cite this chapter
Peter, C., Ebert, E., Beikirch, H. (2009). Physiological Sensing for Affective Computing. In: Tao, J., Tan, T. (eds) Affective Information Processing. Springer, London. https://doi.org/10.1007/978-1-84800-306-4_16
DOI: https://doi.org/10.1007/978-1-84800-306-4_16
Publisher Name: Springer, London
Print ISBN: 978-1-84800-305-7
Online ISBN: 978-1-84800-306-4
eBook Packages: Computer Science