Abstract
Researchers in emotional human-robot interaction (HRI) have often focused on the ability of sociable robots to express emotions, and on the ability of people to recognize those expressions. However, it has been shown that the recognition of human emotional expressions can be influenced by the surrounding context [17]. So far, no empirical research has examined whether the recognition of robot emotions is similarly influenced. Two experiments are reported here that examine how a robot’s simulated emotions were perceived by human observers, depending on the surrounding context. Evidence of an effect of surrounding context on users’ perception of the synthetic robot emotions was obtained: observers were better at recognizing the robot’s expressions when they matched the emotional valence of accompanying pictures or recorded news announcements than when they did not.
References
Alonso, J.: Studying social cues in human robot interaction. In: Microsoft External Research Symposium, MIT Media Laboratory, Personal Robots (2009)
Bailey, A.: Synthetic Social Interactions with a Robot using the BASIC personality model. MSc thesis, University of Sheffield, Sheffield, England (2006)
Bradley, M.M., Lang, P.J.: The International Affective Picture System (IAPS) in the Study of Emotion and Attention. In: Coan, J.A., Allen, J.B. (eds.) The Handbook of Emotion Elicitation and Assessment, pp. 29–46 (2007)
Breazeal, C.L.: Designing Sociable Robots. A Bradford Book, The MIT Press (2002)
Brooks, A., Gray, J., Hoffman, G., Lockerd, A.T., Lee, H., Breazeal, C.: Robot’s play: interactive games with sociable machines. Computers in Entertainment 2(3), 1–18 (2004)
Ekman, P., Friesen, W.: Facial Action Coding System. Consulting Psychologists Press, Palo Alto (1978)
Ekman, P., Friesen, W., Hager, J.: Facial Action Coding System. Research Nexus, Salt Lake City, Utah (2002)
Russell, J.A., Weiss, A., Mendelsohn, G.A.: Affect Grid: A Single-Item Scale of Pleasure and Arousal. Journal of Personality and Social Psychology 57(3), 493–502 (1989)
Goris, K., Saldien, J., Vanderniepen, I., Lefeber, D.: The Huggable Robot Probo, a Multi-disciplinary Research Platform. In: Proceedings of the EUROBOT Conference 2008, Heidelberg, Germany, pp. 63–68 (2008)
Gorostiza, J.F., Barber, R., Khamis, A.M., Malfaz, M., Pacheco, R., Rivas, R., Corrales, A., Delgado, E., Salichs, M.A.: Multimodal Human-Robot Interaction Framework for a Personal Robot. In: RO-MAN 2006: The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, United Kingdom (September 2006)
Heerink, M., Kröse, B.J.A., Wielinga, B.J., Evers, V.: The Influence of a Robot’s Social Abilities on Acceptance by Elderly Users. In: Proceedings RO-MAN, Hertfordshire (2006)
Kahn Jr., P.H., Freier, N.G., Kanda, T., Ishiguro, H., Ruckert, J.H., Severson, R.L., Kane, S.K.: Design Patterns for Sociality in Human-Robot Interaction. In: HRI 2008, Amsterdam, Netherlands, March 12-15 (2008)
Kidd, C.D., Breazeal, C.: Robots at Home: Understanding Long-Term Human-Robot Interaction. In: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Acropolis Convention Center, Nice, France, September 22-26 (2008)
Mayer, J.D., Gaschke, Y.N.: The experience and meta-experience of mood. Journal of Personality and Social Psychology 55, 102–111 (1988)
Mower, E., Lee, S., Matarić, M.J., Narayanan, S.: Human perception of synthetic character emotions in the presence of conflicting and congruent vocal and facial expressions. In: IEEE Int. Conf. Acoustics, Speech, and Signal Processing (ICASSP 2008), Las Vegas, NV, pp. 2201–2204 (2008)
Mower, E., Matarić, M.J., Narayanan, S.: Human perception of audio-visual synthetic character emotion expression in the presence of ambiguous and conflicting information. IEEE Transactions on Multimedia 11(5) (2009)
Niedenthal, P.M., Krauth-Gruber, S., Ric, F.: What Information Determines the Recognition of Emotion? In: The Psychology of Emotion: Interpersonal, Experiential, and Cognitive Approaches. Principles of Social Psychology, pp. 136–144. Psychology Press, New York (2006)
Russell, J.: Reading emotions from and into faces: resurrecting a dimensional–contextual perspective. In: Russell, J., Fernandez-Dols, J. (eds.) The Psychology of Facial Expression, pp. 295–320. Cambridge University Press, Cambridge (1997)
Posner, J., Russell, J., Peterson, B.: The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Dev. Psychopathol. 17(3), 715–734 (2005)
Hong, P., Wen, Z., Huang, T.: Real-time speech-driven expressive synthetic talking faces using neural networks. IEEE Transactions on Neural Networks (April 2002)
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, J., Sharkey, A.J.C. (2011). Contextual Recognition of Robot Emotions. In: Groß, R., Alboul, L., Melhuish, C., Witkowski, M., Prescott, T.J., Penders, J. (eds) Towards Autonomous Robotic Systems. TAROS 2011. Lecture Notes in Computer Science, vol 6856. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23232-9_8
DOI: https://doi.org/10.1007/978-3-642-23232-9_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-23231-2
Online ISBN: 978-3-642-23232-9