
International Journal of Social Robotics, Volume 9, Issue 2, pp 277–292

Exploiting the Robot Kinematic Redundancy for Emotion Conveyance to Humans as a Lower Priority Task

  • Josep-Arnau Claret
  • Gentiane Venture
  • Luis Basañez

Abstract

Current approaches do not allow robots to execute a task and simultaneously convey emotions to users through their body motions. This paper explores the capability of the Jacobian null space of a humanoid robot to convey emotions. A task-priority formulation has been implemented on a Pepper robot that allows the specification of a primary task (waving gesture, transportation of an object, etc.) and exploits the kinematic redundancy of the robot to convey emotions to humans as a lower priority task. The emotions, defined by Mehrabian as points in the pleasure–arousal–dominance space, generate intermediate motion features (jerkiness, activity and gaze) that carry the emotional information. A map from these features to the joints of the robot is presented. A user study has been conducted in which emotional motions were shown to 30 participants. The results show that happiness and sadness are conveyed very well, calm is conveyed moderately well, and fear is not well conveyed. An analysis of the dependencies between the motion features and the emotions perceived by the participants shows that activity correlates positively with arousal, that jerkiness is not perceived by the users, and that gaze conveys dominance when activity is low. The results indicate a strong influence of the most energetic motions of the emotional task and point out new directions for further research. Overall, the results show that the null space approach can be regarded as a promising means to convey emotions as a lower priority task.
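To make the task-priority idea concrete, the following minimal Python sketch shows a two-level null-space resolution in the style of the classical formulations cited in the reference list (Chiaverini [14]; Siciliano and Slotine [41]): the secondary "emotion" task is projected into the null space of the primary task Jacobian, so it can never disturb the primary motion. The damped pseudoinverse, the placeholder Jacobians J1 and J2, and the toy PAD-to-feature map are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def damped_pinv(J, lam=1e-3):
    """Damped least-squares pseudoinverse, robust near kinematic singularities."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    return Vt.T @ np.diag(s / (s**2 + lam**2)) @ U.T

def priority_qdot(J1, dx1, J2, dx2):
    """Two-level task priority: the secondary (emotion) task acts only in
    the null space of the primary task Jacobian J1."""
    J1_pinv = damped_pinv(J1)
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1   # null-space projector of task 1
    qdot1 = J1_pinv @ dx1                     # joint velocities for the primary task
    # Secondary task tracked as well as the remaining redundancy allows.
    qdot2 = damped_pinv(J2 @ N1) @ (dx2 - J2 @ qdot1)
    return qdot1 + N1 @ qdot2

def pad_to_features(pleasure, arousal, dominance):
    """Hypothetical PAD-to-feature map (placeholder only), loosely consistent
    with the reported findings: activity grows with arousal, gaze follows
    dominance, and jerkiness is tied to negative, aroused states."""
    return {"activity": 0.5 * (arousal + 1.0),
            "gaze": dominance,
            "jerkiness": 0.5 * (1.0 - pleasure) * max(arousal, 0.0)}

# Toy example: a 2-DOF primary task on a 4-DOF chain leaves two redundant
# DOFs for the lower-priority emotional motion.
J1 = np.random.rand(2, 4)   # stand-in for the robot's primary task Jacobian
J2 = np.random.rand(1, 4)   # stand-in for an "activity" feature Jacobian
dx2 = np.array([pad_to_features(0.8, 0.6, 0.0)["activity"]])
qdot = priority_qdot(J1, np.array([0.1, 0.0]), J2, dx2)
```

Because N1 is a projector onto the null space of J1, the secondary contribution cannot affect the primary task, which is therefore tracked exactly (up to the damping) regardless of what the emotional task demands.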

Keywords

Human–robot interaction · Social robotics · Emotion conveyance · Robot kinematics · Task priority · Pepper robot

Abbreviations

DOF: Degree of freedom
JVG: Jerkiness–activity–gaze
PAD: Pleasure–arousal–dominance

Supplementary material

12369_2016_387_MOESM1_ESM.mp4 (27.8 MB)
Supplementary material 1 (mp4, 28429 KB)

References

  1. Adams B Jr, Kleck R (2005) Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion 5(1):3–11
  2. Amaya K, Bruderlin A, Calvert T (1996) Emotion from motion. In: Proceedings of the conference on graphics interface '96 (GI '96), pp 222–229
  3. Asada M (2015) Towards artificial empathy. Int J Soc Robot 7(1):19–33. doi:10.1007/s12369-014-0253-z
  4. Baddoura R, Venture G (2015) This robot is sociable: close-up on the gestures and measured motion of a human responding to a proactive robot. Int J Soc Robot 7(4):489–496. doi:10.1007/s12369-015-0279-x
  5. Baerlocher P, Boulic R (1998) Task-priority formulations for the kinematic control of highly redundant articulated structures. In: Proceedings of the 1998 IEEE/RSJ international conference on intelligent robots and systems, vol 1, pp 323–329. doi:10.1109/IROS.1998.724639
  6. Beck A, Hiolle A, Mazel A, Cañamero L (2010) Interpretation of emotional body language displayed by robots. In: Proceedings of the 3rd international workshop on affective interaction in natural environments (AFFINE '10), pp 37–42
  7. Bernhardt D, Robinson P (2007) Detecting affect from non-stylised body motions. In: Proceedings of the 2nd international conference on affective computing and intelligent interaction (ACII '07). Springer, Berlin, pp 59–70
  8. Berns K, Hirth J (2006) Control of facial expressions of the humanoid robot head ROMAN. In: 2006 IEEE/RSJ international conference on intelligent robots and systems, pp 3119–3124. doi:10.1109/IROS.2006.282331
  9. Bradley MM, Lang PJ (1994) Measuring emotion: the self-assessment manikin and the semantic differential. J Behav Ther Exp Psychiatry 25(1):49–59
  10. Breazeal C (2003) Emotion and sociable humanoid robots. Int J Hum Comput Stud 59(1–2):119–155. doi:10.1016/S1071-5819(03)00018-1
  11. Breazeal C, Brooks R (2005) Robot emotion: a functional perspective. In: Fellous J-M, Arbib MA (eds) Who needs emotions? The brain meets the robot. Oxford University Press
  12. Busso C, Deng Z, Grimm M, Neumann U, Narayanan S (2007) Rigid head motion in expressive speech animation: analysis and synthesis. IEEE Trans Audio Speech Lang Process 15(3):1075–1086. doi:10.1109/TASL.2006.885910
  13. Carney DR, Hall JA, LeBeau LS (2005) Beliefs about the nonverbal expression of social power. J Nonverbal Behav 29(2):105–123. doi:10.1007/s10919-005-2743-z
  14. Chiaverini S (1997) Singularity-robust task-priority redundancy resolution for real-time kinematic control of robot manipulators. IEEE Trans Robot Autom 13(3):398–410
  15. Chiaverini S, Oriolo G, Walker ID (2007) Kinematically redundant manipulators. In: Springer handbook of robotics, chap 11. Springer, New York
  16. Crumpton J, Bethel CL (2016) A survey of using vocal prosody to convey emotion in robot speech. Int J Soc Robot 8(2):271–285. doi:10.1007/s12369-015-0329-4
  17. De Schutter J, De Laet T, Rutgeerts J, Decré W, Smits R, Aertbeliën E, Claes K, Bruyninckx H (2007) Constraint-based task specification and estimation for sensor-based robot systems in the presence of geometric uncertainty. Int J Robot Res 26(5):433–455
  18. Derakshan N, Eysenck MW (2009) Anxiety, processing efficiency, and cognitive performance: new developments from attentional control theory. Eur Psychol 14(2):168–176
  19. DiSalvo CF, Gemperle F, Forlizzi J, Kiesler S (2002) All robots are not created equal: the design and perception of humanoid robot heads. In: Proceedings of the conference on designing interactive systems (DIS '02). ACM Press, pp 321–326. doi:10.1145/778712.778756
  20. Gebhard P (2005) ALMA: a layered model of affect. In: Proceedings of the fourth international joint conference on autonomous agents and multiagent systems (AAMAS '05), pp 29–36
  21. Glowinski D, Dael N, Camurri A, Volpe G, Mortillaro M, Scherer K (2011) Toward a minimal representation of affective gestures. IEEE Trans Affect Comput 2(2):106–118. doi:10.1109/T-AFFC.2011.7
  22. Hudson J, Orviska M, Hunady J (2016) People's attitudes to robots in caring for the elderly. Int J Soc Robot. doi:10.1007/s12369-016-0384-5
  23. Johnston O, Thomas F (1981) The illusion of life: Disney animation. Abbeville Press, New York
  24. Karg M, Samadani AA, Gorbet R, Kuhnlenz K, Hoey J, Kulic D (2013) Body movements for affective expression: a survey of automatic recognition and generation. IEEE Trans Affect Comput 4(4):341–359. doi:10.1109/T-AFFC.2013.29
  25. Kleinke CL (1986) Gaze and eye contact: a research review. Psychol Bull 100(1):78–100
  26. Kulic D, Croft E (2007) Physiological and subjective responses to articulated robot motion. Robotica 25:13–27
  27. Lance BJ, Marsella SC (2008) A model of gaze for the purpose of emotional expression in virtual embodied agents. In: Proceedings of the 7th international joint conference on autonomous agents and multiagent systems (AAMAS '08), vol 1, pp 199–206
  28. Lang PJ, Bradley MM, Cuthbert BN (2008) International affective picture system (IAPS): affective ratings of pictures and instruction manual. Technical Report A-8, Center for Research in Psychophysiology, University of Florida, Gainesville, FL
  29. Liégeois A (1977) Automatic supervisory control of the configuration and behavior of multibody mechanisms. IEEE Trans Syst Man Cybern 7(12):868–871. doi:10.1109/TSMC.1977.4309644
  30. Lim A, Ogata T, Okuno H (2011) Converting emotional voice to motion for robot telepresence. In: 11th IEEE-RAS international conference on humanoid robots (Humanoids 2011), pp 472–479. doi:10.1109/Humanoids.2011.6100891
  31. Mansard N, Stasse O, Evrard P, Kheddar A (2009) A versatile generalized inverted kinematics implementation for collaborative working humanoid robots: the stack of tasks. In: International conference on advanced robotics (ICAR 2009), pp 1–6
  32. Mehrabian A (1996) Pleasure–arousal–dominance: a general framework for describing and measuring individual differences in temperament. Curr Psychol 14(4):261–292. doi:10.1007/bf02686918
  33. Montepare J, Koff E, Zaitchik D, Albert M (1999) The use of body movements and gestures as cues to emotions in younger and older adults. J Nonverbal Behav 23(2):133–152. doi:10.1023/A:1021435526134
  34. Nakagawa K, Shinozawa K, Ishiguro H, Akimoto T, Hagita N (2009) Motion modification method to control affective nuances for robots. In: IEEE/RSJ international conference on intelligent robots and systems (IROS 2009), pp 5003–5008. doi:10.1109/IROS.2009.5354205
  35. Nomura T, Nakao A (2010) Comparison on identification of affective body motions by robots between elder people and university students: a case study in Japan. Int J Soc Robot 2(2):147–157
  36. Oswald A, Proto E, Sgroi D (2009) Happiness and productivity. IZA Discussion Paper 4645
  37. Palanica A, Itier R (2012) Attention capture by direct gaze is robust to context and task demands. J Nonverbal Behav 36(2):123–134. doi:10.1007/s10919-011-0128-z
  38. Pierre-Yves O (2003) The production and recognition of emotions in speech: features and algorithms. Int J Hum Comput Stud 59(1–2):157–183. doi:10.1016/S1071-5819(02)00141-6
  39. Saerbeck M, Bartneck C (2010) Perception of affect elicited by robot motion. In: 5th ACM/IEEE international conference on human–robot interaction (HRI 2010), pp 53–60. doi:10.1109/HRI.2010.5453269
  40. Sentis L, Khatib O (2005) Synthesis of whole-body behaviors through hierarchical control of behavioral primitives. Int J Humanoid Robot 2(4):505–518
  41. Siciliano B, Slotine JJ (1991) A general framework for managing multiple tasks in highly redundant robotic systems. In: Fifth international conference on advanced robotics (ICAR '91), 'Robots in unstructured environments', vol 2, pp 1211–1216
  42. Tang D, Schmeichel BJ (2015) Look me in the eye: manipulated eye gaze affects dominance mindsets. J Nonverbal Behav 39(2):181–194. doi:10.1007/s10919-015-0206-8
  43. Tapus A, Mataric MJ (2007) Emulating empathy in socially assistive robotics. In: Proceedings of the AAAI spring symposium on multidisciplinary collaboration for socially assistive robotics
  44. Tapus A, Mataric M, Scassellati B (2007) Socially assistive robotics [grand challenges of robotics]. IEEE Robot Autom Mag 14(1):35–42. doi:10.1109/MRA.2007.339605
  45. Unuma M, Anjyo K, Takeuchi R (1995) Fourier principles for emotion-based human figure animation. In: Proceedings of the 22nd annual conference on computer graphics and interactive techniques (SIGGRAPH '95), pp 91–96
  46. White G, Bhatt R, Tang CP, Krovi V (2009) Experimental evaluation of dynamic redundancy resolution in a nonholonomic wheeled mobile manipulator. IEEE/ASME Trans Mechatron 14(3):349–357. doi:10.1109/TMECH.2008.2008802
  47. Zheng M, Moon A, Croft EA, Meng MQH (2015) Impacts of robot head gaze on robot-to-human handovers. Int J Soc Robot 7(5):783–798. doi:10.1007/s12369-015-0305-z

Copyright information

© Springer Science+Business Media Dordrecht 2017

Authors and Affiliations

  • Josep-Arnau Claret (1)
  • Gentiane Venture (2)
  • Luis Basañez (1)

  1. Institute of Industrial and Control Engineering, Universitat Politècnica de Catalunya-BarcelonaTech (UPC), Barcelona, Spain
  2. Tokyo University of Agriculture and Technology, Tokyo, Japan
