Increasing Helpfulness towards a Robot by Emotional Adaption to the User

Abstract

This article describes an emotional adaption approach for proactively triggering increased helpfulness towards a robot in task-related human-robot interaction (HRI). Based on social-psychological predictions of human behavior, the approach aims at inducing empathy, paired with a feeling of similarity, in human users towards the robot. This is achieved through two differently expressed emotional control variables: explicitly, by a statement of similarity before the task-related interaction, and implicitly, by adapting the emotional state of the robot to the mood of the human user so that the current values of the human mood in the dimensions of pleasure, arousal, and dominance (PAD) are matched. The emotional state shifted in this way serves as the basis for generating task-driven emotional facial and verbal expressions, which are employed to induce and sustain high empathy towards the robot throughout the interaction. The approach is evaluated in a user study using an expressive robot head. The effectiveness of the approach is confirmed by significant experimental results. An analysis of the individual components of the approach reveals significant effects of explicit emotional adaption on helpfulness, as well as on the HRI key concepts of anthropomorphism and animacy.
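The mood-matching idea from the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the `PAD` class, the `adapt` function, and the `rate` parameter are hypothetical names introduced here only to show what shifting a robot's emotional state toward a user's mood in PAD space could look like.

```python
# Illustrative sketch only (assumed names, not the paper's system):
# shift a robot's emotional state toward the user's mood in PAD space.
from dataclasses import dataclass


@dataclass
class PAD:
    """A point in pleasure-arousal-dominance space, each in [-1, 1]."""
    pleasure: float
    arousal: float
    dominance: float


def adapt(robot: PAD, user: PAD, rate: float = 1.0) -> PAD:
    """Move the robot's PAD state toward the user's mood.

    rate = 1.0 matches the user's mood exactly, as the abstract
    describes; smaller values would give only partial alignment.
    """
    def step(r: float, u: float) -> float:
        return r + rate * (u - r)

    return PAD(step(robot.pleasure, user.pleasure),
               step(robot.arousal, user.arousal),
               step(robot.dominance, user.dominance))


robot_state = PAD(0.2, 0.0, 0.1)
user_mood = PAD(-0.4, 0.6, -0.2)
adapted = adapt(robot_state, user_mood)  # with rate=1.0, matches user_mood
```

The adapted state would then drive the generation of facial and verbal expressions, which is where the actual system's expression synthesis would take over.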

Notes

  1. See http://www.iuro-project.eu.

  2. See www.akinator.com.


Acknowledgements

This work is supported in part by the EU FP7 STREP project "IURO—Interactive Urban Robot," contract number 248317 (see www.iuro-project.eu), by the ERC Advanced Grant project "SHRINE—Seamless Human Robot Interaction in Dynamic Environments," contract number 267877, within the DFG excellence initiative research cluster Cognition for Technical Systems—CoTeSys (see www.cotesys.org), and by the Institute for Advanced Study (IAS), Technische Universität München (see www.tum-ias.de). The authors would like to thank Elokence (www.elokence.com) for providing the interface to the Akinator game (www.akinator.com), Dr. Jürgen Blume for the dialog system, and Christian Landsiedel for the speech synchronization. Special thanks to Dr. Angelika Peer and Katrin Landsiedel for their highly appreciated statistical advice.

Author information

Corresponding author

Correspondence to Barbara Kühnlenz.

Cite this article

Kühnlenz, B., Sosnowski, S., Buß, M. et al. Increasing Helpfulness towards a Robot by Emotional Adaption to the User. Int J of Soc Robotics 5, 457–476 (2013). https://doi.org/10.1007/s12369-013-0182-2

Keywords

  • Emotions
  • Adaption
  • Prosocial behavior
  • Empathy
  • Helpfulness
  • Similarity
  • Anthropomorphism
  • Animacy