Computing Emotion Awareness Through Facial Electromyography
To improve human-computer interaction (HCI), computers need to recognize and respond appropriately to their user's emotional state. This is a fundamental application of affective computing: computing that relates to, arises from, or deliberately influences emotion. As a first step toward a system that recognizes the emotions of individual users, this research focuses on how emotional experiences are expressed in six parameters (i.e., mean, absolute deviation, standard deviation, variance, skewness, and kurtosis) of three facial electromyography signals: frontalis (EMG1), corrugator supercilii (EMG2), and zygomaticus major (EMG3). Twenty-four participants watched film scenes of 120 seconds each, which they rated afterward. These ratings enabled us to distinguish four categories of emotions: negative, positive, mixed, and neutral. The skewness of EMG2 and four parameters of EMG3 discriminated between the four emotion categories, despite the coarse time windows used. Moreover, rapid processing of the signals proved feasible, which enables tailored HCI facilitated by systems' emotional awareness.
Keywords: Emotion Category · Affective Computing · Psychophysiological Measure · Mixed Emotion · Corrugator Supercilii
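The six parameters named in the abstract are standard descriptive statistics computed per signal window. A minimal sketch of such a feature extractor is shown below; the function name, sampling rate, and implementation details are assumptions, as the abstract only lists the parameters themselves.

```python
import numpy as np
from scipy import stats

def emg_features(window):
    """Compute the six statistical parameters listed in the abstract
    for one EMG signal window. The grouping of features follows the
    paper; the computational details are assumed."""
    w = np.asarray(window, dtype=float)
    return {
        "mean": w.mean(),
        # Mean absolute deviation from the window mean.
        "absolute_deviation": np.mean(np.abs(w - w.mean())),
        # Sample (ddof=1) standard deviation and variance.
        "standard_deviation": w.std(ddof=1),
        "variance": w.var(ddof=1),
        "skewness": stats.skew(w),
        "kurtosis": stats.kurtosis(w),
    }

# Example: features of a simulated 120-second EMG window
# (hypothetical 1000 Hz sampling rate; the paper's rate is not
# stated in the abstract).
rng = np.random.default_rng(0)
window = rng.normal(size=120 * 1000)
features = emg_features(window)
```

In this sketch the same extractor would be applied separately to EMG1, EMG2, and EMG3, yielding an 18-dimensional feature vector per 120-second window.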