Computing Emotion Awareness Through Facial Electromyography

  • Egon L. van den Broek
  • Marleen H. Schut
  • Joyce H. D. M. Westerink
  • Jan van Herk
  • Kees Tuinenbreijer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3979)


To improve human-computer interaction (HCI), computers need to recognize and respond properly to their users' emotional state. This is a fundamental application of affective computing, which relates to, arises from, or deliberately influences emotion. As a first step toward a system that recognizes the emotions of individual users, this research investigates how emotional experiences are expressed in six parameters (i.e., mean, absolute deviation, standard deviation, variance, skewness, and kurtosis) of three facial electromyography signals: frontalis (EMG1), corrugator supercilii (EMG2), and zygomaticus major (EMG3). Twenty-four participants watched 120-second film scenes, which they rated afterward. These ratings enabled us to distinguish four categories of emotions: negative, positive, mixed, and neutral. The skewness of EMG2 and four parameters of EMG3 discriminated between the four emotion categories, despite the coarse time windows used. Moreover, rapid processing of the signals proved feasible. This enables tailored HCI facilitated by an emotional awareness of systems.


Keywords: Emotion Category · Affective Computing · Psychophysiological Measure · Mixed Emotion · Corrugator Supercilii
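
For concreteness, the feature extraction named in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 1000 Hz sampling rate, the single fixed window, and the simulated input signal are assumptions introduced here; only the six parameter names (mean, absolute deviation, standard deviation, variance, skewness, kurtosis) come from the abstract.

```python
import numpy as np
from scipy import stats


def emg_window_features(emg: np.ndarray) -> dict:
    """Compute the six descriptive parameters named in the abstract
    (mean, absolute deviation, standard deviation, variance,
    skewness, kurtosis) over one analysis window of an EMG signal."""
    mean = float(np.mean(emg))
    return {
        "mean": mean,
        "abs_deviation": float(np.mean(np.abs(emg - mean))),
        "std": float(np.std(emg, ddof=1)),
        "variance": float(np.var(emg, ddof=1)),
        "skewness": float(stats.skew(emg)),
        "kurtosis": float(stats.kurtosis(emg)),
    }


if __name__ == "__main__":
    # Example: a simulated 120 s recording at an assumed 1000 Hz sampling
    # rate; in the study, one such feature vector would be computed per
    # EMG channel (frontalis, corrugator supercilii, zygomaticus major).
    rng = np.random.default_rng(0)
    fs = 1000  # assumed sampling rate (Hz), not reported in this excerpt
    window = rng.normal(size=120 * fs)  # placeholder for a recorded EMG trace
    print(emg_window_features(window))
```

Because each window reduces to six scalar values per channel, this kind of feature extraction is cheap enough to support the rapid signal processing the abstract reports.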




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Egon L. van den Broek 1, 2
  • Marleen H. Schut 2, 3
  • Joyce H. D. M. Westerink 4
  • Jan van Herk 4
  • Kees Tuinenbreijer 3

  1. Center for Telematics and Information Technology (CTIT) / Institute for Behavioral Research (IBR), University of Twente, Enschede, The Netherlands
  2. Department of Artificial Intelligence / Nijmegen Institute for Cognition and Information (NICI), Radboud University Nijmegen, Nijmegen, The Netherlands
  3. Philips Consumer Electronics, The Innovation Laboratories, Eindhoven, The Netherlands
  4. Philips Research, Eindhoven, The Netherlands