
Exploiting the Robot Kinematic Redundancy for Emotion Conveyance to Humans as a Lower Priority Task


Abstract

Current approaches do not allow robots to execute a task and simultaneously convey emotions to users through their body motions. This paper explores the capabilities of the Jacobian null space of a humanoid robot to convey emotions. A task priority formulation has been implemented in a Pepper robot which allows the specification of a primary task (waving gesture, transportation of an object, etc.) and exploits the kinematic redundancy of the robot to convey emotions to humans as a lower priority task. The emotions, defined by Mehrabian as points in the pleasure–arousal–dominance space, generate intermediate motion features (jerkiness, activity and gaze) that carry the emotional information. A map from these features to the joints of the robot is presented. A user study has been conducted in which emotional motions were shown to 30 participants. The results show that happiness and sadness are very well conveyed to the user, calm is moderately well conveyed, and fear is not well conveyed. An analysis of the dependencies between the motion features and the emotions perceived by the participants shows that activity correlates positively with arousal, jerkiness is not perceived by the user, and gaze conveys dominance when activity is low. The results indicate a strong influence of the most energetic motions of the emotional task and point out new directions for further research. Overall, the results show that the null space approach can be regarded as a promising means to convey emotions as a lower priority task.



Notes

  1. https://www.ald.softbankrobotics.com.

Abbreviations

DOF: Degree of freedom

JVG: Jerkiness–activity–gaze

PAD: Pleasure–arousal–dominance

References

  1. Adams B Jr, Kleck R (2005) Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion 5(1):3–11


  2. Amaya K, Bruderlin A, Calvert T (1996) Emotion from motion. In: Proceedings of the conference on graphics interface ’96, GI ’96, pp 222–229

  3. Asada M (2015) Towards artificial empathy. Int J Soc Robot 7(1):19–33. doi:10.1007/s12369-014-0253-z


  4. Baddoura R, Venture G (2015) This robot is sociable: close-up on the gestures and measured motion of a human responding to a proactive robot. Int J Soc Robot 7(4):489–496. doi:10.1007/s12369-015-0279-x


  5. Baerlocher P, Boulic R (1998) Task-priority formulations for the kinematic control of highly redundant articulated structures. In: Proceedings of 1998 IEEE/RSJ international conference on intelligent robots and systems, 1998, vol 1, pp 323–329. doi:10.1109/IROS.1998.724639

  6. Beck A, Hiolle A, Mazel A, Cañamero L (2010) Interpretation of emotional body language displayed by robots. In: Proceedings of the 3rd international workshop on affective interaction in natural environments, AFFINE ’10, pp 37–42

  7. Bernhardt D, Robinson P (2007) Detecting affect from non-stylised body motions. In: Proceedings of the 2nd international conference on affective computing and intelligent interaction, ACII ’07. Springer, Berlin, pp 59–70

  8. Berns K, Hirth J (2006) Control of facial expressions of the humanoid robot head roman. In: 2006 IEEE/RSJ international conference on intelligent robots and systems, pp 3119–3124. doi:10.1109/IROS.2006.282331

  9. Bradley MM, Lang PJ (1994) Measuring emotion: the self-assessment manikin and the semantic differential. J Behav Ther Exp Psychiatry 25(1):49–59


  10. Breazeal C (2003) Emotion and sociable humanoid robots. Int J Hum Comput Stud 59(1–2):119–155. doi:10.1016/S1071-5819(03)00018-1


  11. Breazeal C, Brooks R (2005) Robot emotion: a functional perspective. In: Fellous J-M, Arbib MA (eds) Who needs emotions? The brain meets the robot. Oxford University Press, Oxford

  12. Busso C, Deng Z, Grimm M, Neumann U, Narayanan S (2007) Rigid head motion in expressive speech animation: analysis and synthesis. IEEE Trans Audio Speech Lang Process 15(3):1075–1086. doi:10.1109/TASL.2006.885910


  13. Carney DR, Hall JA, LeBeau LS (2005) Beliefs about the nonverbal expression of social power. J Nonverbal Behav 29(2):105–123. doi:10.1007/s10919-005-2743-z


  14. Chiaverini S (1997) Singularity-robust task-priority redundancy resolution for real-time kinematic control of robot manipulators. IEEE Trans Robot Autom 13(3):398–410


  15. Chiaverini S, Oriolo G, Walker ID (2007) Springer handbook of robotics, chap 11. Springer, New York


  16. Crumpton J, Bethel CL (2016) A survey of using vocal prosody to convey emotion in robot speech. Int J Soc Robot 8(2):271–285. doi:10.1007/s12369-015-0329-4


  17. De Schutter J, De Laet T, Rutgeerts J, Decré W, Smits R, Aertbeliën E, Claes K, Bruyninckx H (2007) Constraint-based task specification and estimation for sensor-based robot systems in the presence of geometric uncertainty. Int J Robot Res 26(5):433–455


  18. Derakshan N, Eysenck MW (2009) Anxiety, processing efficiency, and cognitive performance: new developments from attentional control theory. Eur Psychol 14(2):168–176


  19. Disalvo CF, Gemperle F, Forlizzi J, Kiesler S (2002) All robots are not created equal: the design and perception of humanoid robot heads. In: Proceedings of the DIS conference. ACM Press, pp 321–326. doi:10.1145/778712.778756

  20. Gebhard P (2005) Alma: a layered model of affect. In: Proceedings of the fourth international joint conference on autonomous agents and multiagent systems, AAMAS ’05, pp 29–36

  21. Glowinski D, Dael N, Camurri A, Volpe G, Mortillaro M, Scherer K (2011) Toward a minimal representation of affective gestures. IEEE Trans Affect Comput 2(2):106–118. doi:10.1109/T-AFFC.2011.7


  22. Hudson J, Orviska M, Hunady J (2016) People’s attitudes to robots in caring for the elderly. Int J Soc Robot. doi:10.1007/s12369-016-0384-5


  23. Johnston O, Thomas F (1981) The illusion of life: disney animation. Abbeville Press, New York

  24. Karg M, Samadani AA, Gorbet R, Kuhnlenz K, Hoey J, Kulic D (2013) Body movements for affective expression: a survey of automatic recognition and generation. IEEE Trans Affect Comput 4(4):341–359. doi:10.1109/T-AFFC.2013.29


  25. Kleinke CL (1986) Gaze and eye contact: a research review. Psychol Bull 100(1):78–100


  26. Kulic D, Croft E (2007) Physiological and subjective responses to articulated robot motion. Robotica 25:13–27


  27. Lance BJ, Marsella SC (2008) A model of gaze for the purpose of emotional expression in virtual embodied agents. In: Proceedings of the 7th international joint conference on autonomous agents and multiagent systems, vol 1, AAMAS ’08, pp 199–206

  28. Lang PJ, Bradley MM, Cuthbert BN (2008) International affective picture system (IAPS): affective ratings of pictures and instruction manual. Technical Report A-8, The Center for Research in Psychophysiology, University of Florida, Gainesville, FL

  29. Liégeois A (1977) Automatic supervisory control of the configuration and behavior of multibody mechanisms. IEEE Trans Syst Man Cybern 7(12):868–871. doi:10.1109/TSMC.1977.4309644


  30. Lim A, Ogata T, Okuno H (2011) Converting emotional voice to motion for robot telepresence. In: 2011 11th IEEE-RAS international conference on humanoid robots (humanoids), pp 472–479. doi:10.1109/Humanoids.2011.6100891

  31. Mansard N, Stasse O, Evrard P, Kheddar A (2009) A versatile generalized inverted kinematics implementation for collaborative working humanoid robots: the stack of tasks. In: International conference on advanced robotics 2009 (ICAR 2009), pp 1–6

  32. Mehrabian A (1996) Pleasure–arousal–dominance: a general framework for describing and measuring individual differences in temperament. Curr Psychol 14(4):261–292. doi:10.1007/bf02686918


  33. Montepare J, Koff E, Zaitchik D, Albert M (1999) The use of body movements and gestures as cues to emotions in younger and older adults. J Nonverbal Behav 23(2):133–152. doi:10.1023/A:1021435526134


  34. Nakagawa K, Shinozawa K, Ishiguro H, Akimoto T, Hagita N (2009) Motion modification method to control affective nuances for robots. In: IEEE/RSJ international conference on intelligent robots and systems, 2009 (IROS 2009), pp 5003–5008. doi:10.1109/IROS.2009.5354205

  35. Nomura T, Nakao A (2010) Comparison on identification of affective body motions by robots between elder people and university students: a case study in Japan. Int J Soc Robot 2(2):147–157


  36. Oswald A, Proto E, Sgroi D (2009) Happiness and productivity. IZA Discussion Papers 4645

  37. Palanica A, Itier R (2012) Attention capture by direct gaze is robust to context and task demands. J Nonverbal Behav 36(2):123–134. doi:10.1007/s10919-011-0128-z


  38. Oudeyer P-Y (2003) The production and recognition of emotions in speech: features and algorithms. Int J Hum Comput Stud 59(1–2):157–183. doi:10.1016/S1071-5819(02)00141-6


  39. Saerbeck M, Bartneck C (2010) Perception of affect elicited by robot motion. In: 2010 5th ACM/IEEE international conference on human–robot interaction (HRI), pp 53–60. doi:10.1109/HRI.2010.5453269

  40. Sentis L, Khatib O (2005) Synthesis of whole-body behaviors through hierarchical control of behavioral primitives. Int J Humanoid Robot 2(4):505–518

  41. Siciliano B, Slotine JJ (1991) A general framework for managing multiple tasks in highly redundant robotic systems. In: Fifth international conference on advanced robotics (ICAR 1991), 'Robots in unstructured environments', vol 2, pp 1211–1216

  42. Tang D, Schmeichel BJ (2015) Look me in the eye: manipulated eye gaze affects dominance mindsets. J Nonverbal Behav 39(2):181–194. doi:10.1007/s10919-015-0206-8


  43. Tapus A, Mataric MJ (2007) Emulating empathy in socially assistive robotics. In: Proceedings of the AAAI spring symposium on multidisciplinary collaboration for socially assistive robotics

  44. Tapus A, Mataric M, Scassellati B (2007) Socially assistive robotics [grand challenges of robotics]. IEEE Robot Autom Mag 14(1):35–42. doi:10.1109/MRA.2007.339605


  45. Unuma M, Anjyo K, Takeuchi R (1995) Fourier principles for emotion-based human figure animation. In: Proceedings of the 22nd annual conference on computer graphics and interactive techniques, SIGGRAPH ’95, pp 91–96

  46. White G, Bhatt R, Tang CP, Krovi V (2009) Experimental evaluation of dynamic redundancy resolution in a nonholonomic wheeled mobile manipulator. IEEE/ASME Trans Mechatron 14(3):349–357. doi:10.1109/TMECH.2008.2008802


  47. Zheng M, Moon A, Croft EA, Meng MQH (2015) Impacts of robot head gaze on robot-to-human handovers. Int J Soc Robot 7(5):783–798. doi:10.1007/s12369-015-0305-z



Author information

Correspondence to Josep-Arnau Claret.

Additional information

This work has been partially supported by the Spanish MINECO Projects DPI2011-22471, DPI2013-40882-P and DPI2014-57757-R, the Spanish predoctoral Grant BES-2012-054899, and the Japanese challenging exploratory research Grant 15K12124. The authors would like to thank the members of the GVLab for their invaluable support with the translations and attending the local participants during the user study.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 28429 KB)

Appendices

Appendix 1

The emotional conveyance algorithm: task priority proof

Following the work in [5, 14], a proof of the task prioritization of the proposed solution is given below, that is, a proof that the execution of the lower priority tasks does not affect the execution of the higher priority tasks.

The following identities will be used:

$$\begin{aligned} A \, A^+ \, A = A \end{aligned}$$
(9)

where \(A^+\) is the pseudoinverse of A.

Given a matrix B that is idempotent, that is, \(B = B^2\), and Hermitian, \(B = B^*\), where \(B^*\) is the conjugate transpose of B (which reduces to \(B = B^T\) for the real matrices used in this work), then

$$\begin{aligned} B \, (A B)^+ = (A B)^+ \end{aligned}$$
(10)

Given a matrix C, the matrix \(D = I - C^+C\) is the orthogonal projector onto the kernel of C, and is therefore idempotent and Hermitian. So, in light of (10):

$$\begin{aligned} (A D)^+ = D (A D)^+ \end{aligned}$$
(11)
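These identities are straightforward to check numerically. The sketch below (an illustration added for this text, not part of the original paper) verifies (9)–(11) with NumPy for random real matrices, for which the Hermitian condition reduces to symmetry:

```python
# Numerical check of identities (9)-(11) for random real matrices.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 7))   # a wide matrix, as for a redundant robot
C = rng.standard_normal((2, 7))

# Identity (9): A A^+ A = A
assert np.allclose(A @ np.linalg.pinv(A) @ A, A)

# D = I - C^+ C is the orthogonal projector onto the kernel of C,
# hence idempotent and (for real matrices) symmetric.
D = np.eye(7) - np.linalg.pinv(C) @ C
assert np.allclose(D @ D, D) and np.allclose(D, D.T)

# Identities (10)/(11) with B = D: D (A D)^+ = (A D)^+
AD_pinv = np.linalg.pinv(A @ D)
assert np.allclose(D @ AD_pinv, AD_pinv)
```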

Given a task \(x_i\) of the robot defined as a function of its configuration q:

$$\begin{aligned} x_i = f_i(q) \end{aligned}$$

differentiating and applying the chain rule,

$$\begin{aligned} \dot{x}_i = \frac{\partial x_i}{\partial t} = \frac{\partial f_i(q)}{\partial t} = \frac{\partial f_i(q)}{\partial q} \frac{\partial q}{\partial t} = J_i \, \dot{q} \end{aligned}$$

a mapping between the velocity of task i and the joint velocities is obtained.
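As a concrete illustration of this mapping (a sketch added here, using a hypothetical planar 3-DOF arm rather than the Pepper model of the paper), the task Jacobian can be approximated by finite differences and used to map joint velocities to task velocities:

```python
# Differential mapping x_dot = J q_dot for a hypothetical planar 3-DOF arm.
import numpy as np

def f(q, lengths=(0.3, 0.25, 0.15)):
    """Forward kinematics: end-effector position for joint angles q."""
    angles = np.cumsum(q)                 # absolute link angles
    return np.array([
        sum(l * np.cos(a) for l, a in zip(lengths, angles)),
        sum(l * np.sin(a) for l, a in zip(lengths, angles)),
    ])

def jacobian(func, q, eps=1e-6):
    """Central finite-difference approximation of J = d f / d q."""
    J = np.zeros((func(q).size, q.size))
    for j in range(q.size):
        dq = np.zeros_like(q)
        dq[j] = eps
        J[:, j] = (func(q + dq) - func(q - dq)) / (2 * eps)
    return J

q = np.array([0.2, -0.4, 0.6])            # configuration (rad)
q_dot = np.array([0.1, 0.0, -0.2])        # joint velocities (rad/s)
x_dot = jacobian(f, q) @ q_dot            # task velocity x_dot = J q_dot
```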

Similarly, the differential mapping of the main task t, as defined in Sect. 3.4, becomes \(\dot{x}_t = J_t \, \dot{q}\). The joint velocities \(\dot{q}\) as defined in this work in (8) are:

$$\begin{aligned} \dot{q} = J_t^+ e_t + P_t \; J_h^+ e_h + \left[ P_t - (J_h P_t)^+ (J_h P_t) \right] e_m \end{aligned}$$
(12)

Thus, substituting (12) into \(\dot{x}_t = J_t \, \dot{q}\), the following is obtained:

$$\begin{aligned} \dot{x}_t = J_t \, \left\{ J_t^+ e_t + P_t \; J_h^+ e_h + \left[ P_t - (J_h P_t)^+ (J_h P_t) \right] e_m \right\} \end{aligned}$$

Using (11) in \((J_h P_t)^+\) and rearranging terms, the expression can be transformed into

$$\begin{aligned} \dot{x}_t = J_t \, J_t^+ e_t + J_t \, P_t \left\{ J_h^+ e_h + \left[ I - (J_h P_t)^+ (J_h P_t) \right] e_m \right\} \end{aligned}$$

and now, using (9), the product \(J_t \, P_t\) vanishes:

$$\begin{aligned} J_t \, P_t = J_t \left( I - J_t^+ J_t \right) = J_t - J_t J_t^+ J_t = J_t - J_t = 0 \end{aligned}$$

so the expression reduces to \(\dot{x}_t = J_t \, J_t^+ e_t\). This shows that the emotional tasks h and m do not affect the execution of the main task t.

Similarly for the task h corresponding to the gaze:

$$\begin{aligned} \dot{x}_h= & {} J_h \, \left\{ J_t^+ e_t + P_t \; J_h^+ e_h + \left[ P_t - (J_h P_t)^+ (J_h P_t) \right] e_m \right\} \\= & {} J_h \, J_t^+ e_t + J_h \, P_t \; J_h^+ e_h + J_h \, \left[ P_t - (J_h P_t)^+ (J_h P_t) \right] e_m \end{aligned}$$

which, using (9) and (10),

$$\begin{aligned}&J_h \left[ P_t - (J_h P_t)^+ (J_h P_t) \right] = J_h \left[ P_t - P_t (J_h P_t)^+ (J_h P_t) \right] \\&\quad = J_h P_t - (J_h P_t) (J_h P_t)^+ (J_h P_t) = J_h P_t - J_h P_t = 0 \end{aligned}$$

reduces to \(\dot{x}_h = J_h \, J_t^+ e_t + J_h \, P_t \; J_h^+ e_h\). This expression shows that the second priority task h is affected only by the higher priority task t.
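Both conclusions of the proof can be checked numerically. The sketch below (an illustration with random full-rank Jacobians standing in for the robot model; names follow the appendix) builds \(\dot{q}\) as in (12) and verifies that tasks h and m leave \(\dot{x}_t\) untouched, and that task m leaves \(\dot{x}_h\) untouched:

```python
# Numerical check of the priority structure of (12) with random Jacobians.
import numpy as np
from numpy.linalg import pinv

rng = np.random.default_rng(1)
n = 10                                    # number of joints (illustrative)
Jt = rng.standard_normal((3, n))          # main task t
Jh = rng.standard_normal((2, n))          # gaze task h
et, eh = rng.standard_normal(3), rng.standard_normal(2)
em = rng.standard_normal(n)               # lowest priority emotional task m

Pt = np.eye(n) - pinv(Jt) @ Jt            # null-space projector of task t
JhPt = Jh @ Pt

# Joint velocities as in (12)
q_dot = (pinv(Jt) @ et
         + Pt @ pinv(Jh) @ eh
         + (Pt - pinv(JhPt) @ JhPt) @ em)

# Task t is unaffected by the emotional tasks h and m ...
assert np.allclose(Jt @ q_dot, Jt @ pinv(Jt) @ et)
# ... and task h is affected only by the higher priority task t.
assert np.allclose(Jh @ q_dot, Jh @ pinv(Jt) @ et + Jh @ Pt @ pinv(Jh) @ eh)
```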

Appendix 2

Implementation values

The angular velocity \(\omega \) in (3) has been set to \(\omega = 2.79\) rad/s. A single term (\(n_J = 1\)) has been used in (5), with \(a_1 = b_1 = 0.25\) rad and \(\omega _1 = 12.57\) rad/s.
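The closed forms of (3) and (5) are given in the body of the paper and are not reproduced in this excerpt. Purely as an illustration, if (5) is assumed to be a single-term Fourier-style perturbation (an assumption of this sketch, not a statement of the paper), the reported values could be plugged in as follows:

```python
# Illustration only: the functional form of (5) is ASSUMED here to be a
# single sine/cosine pair; only the parameter values are from the paper.
import numpy as np

a1, b1 = 0.25, 0.25                       # amplitudes (rad)
w1 = 12.57                                # angular frequency (rad/s)

t = np.linspace(0.0, 1.0, 200)            # one second of motion
jerkiness = a1 * np.sin(w1 * t) + b1 * np.cos(w1 * t)  # hypothetical form
```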

Following the convention of Sect. 2, the values \(\Theta _{0_i}\), \(\Theta _{E_{0i}}\) and \(h_i\) implemented in (3) are listed in Table 4.

Table 4 Implemented values in radians


About this article

Cite this article

Claret, JA., Venture, G. & Basañez, L. Exploiting the Robot Kinematic Redundancy for Emotion Conveyance to Humans as a Lower Priority Task. Int J of Soc Robotics 9, 277–292 (2017). https://doi.org/10.1007/s12369-016-0387-2
