Abstract
Current approaches do not allow robots to execute a task and simultaneously convey emotions to users through their body motions. This paper explores the capability of the Jacobian null space of a humanoid robot to convey emotions. A task priority formulation has been implemented in a Pepper robot which allows the specification of a primary task (waving gesture, transportation of an object, etc.) and exploits the kinematic redundancy of the robot to convey emotions to humans as a lower priority task. The emotions, defined by Mehrabian as points in the pleasure–arousal–dominance space, generate intermediate motion features (jerkiness, activity and gaze) that carry the emotional information. A map from these features to the joints of the robot is presented. A user study has been conducted in which emotional motions were shown to 30 participants. The results show that happiness and sadness are very well conveyed to the user, calm is moderately well conveyed, and fear is not well conveyed. An analysis of the dependencies between the motion features and the emotions perceived by the participants shows that activity correlates positively with arousal, jerkiness is not perceived by the user, and gaze conveys dominance when activity is low. The results indicate a strong influence of the most energetic motions of the emotional task and point out new directions for further research. Overall, the results show that the null space approach can be regarded as a promising means to convey emotions as a lower priority task.
Abbreviations
- DOF: Degree of freedom
- JVG: Jerkiness–activity–gaze
- PAD: Pleasure–arousal–dominance
References
Adams B Jr, Kleck R (2005) Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion 5(1):3–11
Amaya K, Bruderlin A, Calvert T (1996) Emotion from motion. In: Proceedings of the conference on graphics interface ’96, GI ’96, pp 222–229
Asada M (2015) Towards artificial empathy. Int J Soc Robot 7(1):19–33. doi:10.1007/s12369-014-0253-z
Baddoura R, Venture G (2015) This robot is sociable: close-up on the gestures and measured motion of a human responding to a proactive robot. Int J Soc Robot 7(4):489–496. doi:10.1007/s12369-015-0279-x
Baerlocher P, Boulic R (1998) Task-priority formulations for the kinematic control of highly redundant articulated structures. In: Proceedings of 1998 IEEE/RSJ international conference on intelligent robots and systems, 1998, vol 1, pp 323–329. doi:10.1109/IROS.1998.724639
Beck A, Hiolle A, Mazel A, Cañamero L (2010) Interpretation of emotional body language displayed by robots. In: Proceedings of the 3rd international workshop on affective interaction in natural environments, AFFINE ’10, pp 37–42
Bernhardt D, Robinson P (2007) Detecting affect from non-stylised body motions. In: Proceedings of the 2nd international conference on affective computing and intelligent interaction, ACII ’07. Springer, Berlin, pp 59–70
Berns K, Hirth J (2006) Control of facial expressions of the humanoid robot head roman. In: 2006 IEEE/RSJ international conference on intelligent robots and systems, pp 3119–3124. doi:10.1109/IROS.2006.282331
Bradley MM, Lang PJ (1994) Measuring emotion: the self-assessment manikin and the semantic differential. J Behav Ther Exp Psychiatry 25(1):49–59
Breazeal C (2003) Emotion and sociable humanoid robots. Int J Hum Comput Stud 59(1–2):119–155. doi:10.1016/S1071-5819(03)00018-1
Breazeal C, Brooks R (2005) Robot Emotion: a functional perspective. In: Fellous J-M, Arbib MA (eds) Who needs emotions? the brain meets the robot. Oxford Scholarship
Busso C, Deng Z, Grimm M, Neumann U, Narayanan S (2007) Rigid head motion in expressive speech animation: analysis and synthesis. IEEE Speech Audio Process 15(3):1075–1086. doi:10.1109/TASL.2006.885910
Carney DR, Hall JA, LeBeau LS (2005) Beliefs about the nonverbal expression of social power. J Nonverbal Behav 29(2):105–123. doi:10.1007/s10919-005-2743-z
Chiaverini S (1997) Singularity-robust task-priority redundancy resolution for real-time kinematic control of robot manipulators. IEEE Trans Robot Autom 13(3):398–410
Chiaverini S, Oriolo G, Walker ID (2007) Springer handbook of robotics, chap 11. Springer, New York
Crumpton J, Bethel CL (2016) A survey of using vocal prosody to convey emotion in robot speech. Int J Soc Robot 8(2):271–285. doi:10.1007/s12369-015-0329-4
De Schutter J, De Laet T, Rutgeerts J, Decré W, Smits R, Aertbeliën E, Claes K, Bruyninckx H (2007) Constraint-based task specification and estimation for sensor-based robot systems in the presence of geometric uncertainty. Int J Robot Res 26(5):433–455
Derakshan N, Eysenck MW (2009) Anxiety, processing efficiency, and cognitive performance: new developments from attentional control theory. Eur Psychol 14(2):168–176
Disalvo CF, Gemperle F, Forlizzi J, Kiesler S (2002) All robots are not created equal: the design and perception of humanoid robot heads. In: Proceedings of the DIS conference. ACM Press, pp 321–326. doi:10.1145/778712.778756
Gebhard P (2005) Alma: a layered model of affect. In: Proceedings of the fourth international joint conference on autonomous agents and multiagent systems, AAMAS ’05, pp 29–36
Glowinski D, Dael N, Camurri A, Volpe G, Mortillaro M, Scherer K (2011) Toward a minimal representation of affective gestures. IEEE Trans Affect Comput 2(2):106–118. doi:10.1109/T-AFFC.2011.7
Hudson J, Orviska M, Hunady J (2016) People’s attitudes to robots in caring for the elderly. Int J Soc Robot. doi:10.1007/s12369-016-0384-5
Johnston O, Thomas F (1981) The illusion of life: disney animation. Abbeville Press, New York
Karg M, Samadani AA, Gorbet R, Kuhnlenz K, Hoey J, Kulic D (2013) Body movements for affective expression: a survey of automatic recognition and generation. IEEE Trans Affect Comput 4(4):341–359. doi:10.1109/T-AFFC.2013.29
Kleinke CL (1986) Gaze and eye contact: a research review. Psychol Bull 100(1):78–100
Kulic D, Croft E (2007) Physiological and subjective responses to articulated robot motion. Robotica 25:13–27
Lance BJ, Marsella SC (2008) A model of gaze for the purpose of emotional expression in virtual embodied agents. In: Proceedings of the 7th international joint conference on autonomous agents and multiagent systems, vol 1, AAMAS ’08, pp 199–206
Lang PJ, Bradley MM, Cuthbert BN (2008) International affective picture system (IAPS): affective ratings of pictures and instruction manual. Technical Report A-8, The Center for Research in Psychophysiology, University of Florida, Gainesville, FL
Liégeois A (1977) Automatic supervisory control of the configuration and behavior of multibody mechanisms. IEEE Trans Syst Man Cybern Syst 7(12):868–871. doi:10.1109/TSMC.1977.4309644
Lim A, Ogata T, Okuno H (2011) Converting emotional voice to motion for robot telepresence. In: 2011 11th IEEE-RAS international conference on humanoid robots (humanoids), pp 472–479. doi:10.1109/Humanoids.2011.6100891
Mansard N, Stasse O, Evrard P, Kheddar A (2009) A versatile generalized inverted kinematics implementation for collaborative working humanoid robots: the stack of tasks. In: International conference on advanced robotics 2009 (ICAR 2009), pp 1–6
Mehrabian A (1996) Pleasure–arousal–dominance: a general framework for describing and measuring individual differences in temperament. Curr Psychol 14(4):261–292. doi:10.1007/bf02686918
Montepare J, Koff E, Zaitchik D, Albert M (1999) The use of body movements and gestures as cues to emotions in younger and older adults. J Nonverbal Behav 23(2):133–152. doi:10.1023/A:1021435526134
Nakagawa K, Shinozawa K, Ishiguro H, Akimoto T, Hagita N (2009) Motion modification method to control affective nuances for robots. In: IEEE/RSJ international conference on intelligent robots and systems, 2009 (IROS 2009), pp 5003–5008. doi:10.1109/IROS.2009.5354205
Nomura T, Nakao A (2010) Comparison on identification of affective body motions by robots between elder people and university students: a case study in Japan. Int J Soc Robot 2(2):147–157
Oswald A, Proto E, Sgroi D (2009) Happiness and productivity. IZA Discussion Papers 4645
Palanica A, Itier R (2012) Attention capture by direct gaze is robust to context and task demands. J Nonverbal Behav 36(2):123–134. doi:10.1007/s10919-011-0128-z
Pierre-Yves O (2003) The production and recognition of emotions in speech: features and algorithms. Int J Hum Comput Stud 59(12):157–183. doi:10.1016/S1071-5819(02)00141-6
Saerbeck M, Bartneck C (2010) Perception of affect elicited by robot motion. In: 2010 5th ACM/IEEE international conference on human–robot interaction (HRI), pp 53–60. doi:10.1109/HRI.2010.5453269
Sentis L, Khatib O (2005) Synthesis of whole-body behaviors through hierarchical control of behavioral primitives. Int J Humanoid Robot 2(4):505–518
Siciliano B, Slotine JJ (1991) A general framework for managing multiple tasks in highly redundant robotic systems. In: 91 ICAR, Fifth international conference on advanced robotics, 1991. ’Robots in unstructured environments’, vol 2, pp 1211–1216
Tang D, Schmeichel BJ (2015) Look me in the eye: manipulated eye gaze affects dominance mindsets. J Nonverbal Behav 39(2):181–194. doi:10.1007/s10919-015-0206-8
Tapus A, Mataric MJ (2007) Emulating empathy in socially assistive robotics. In: Proceedings of the AAAI spring symposium on multidisciplinary collaboration for socially assistive robotics
Tapus A, Mataric M, Scasselati B (2007) Socially assistive robotics [grand challenges of robotics]. IEEE Robot Autom Mag 14(1):35–42. doi:10.1109/MRA.2007.339605
Unuma M, Anjyo K, Takeuchi R (1995) Fourier principles for emotion-based human figure animation. In: Proceedings of the 22nd annual conference on computer graphics and interactive techniques, SIGGRAPH ’95, pp 91–96
White G, Bhatt R, Tang CP, Krovi V (2009) Experimental evaluation of dynamic redundancy resolution in a nonholonomic wheeled mobile manipulator. IEEE/ASME Trans Mechatron 14(3):349–357. doi:10.1109/TMECH.2008.2008802
Zheng M, Moon A, Croft EA, Meng MQH (2015) Impacts of robot head gaze on robot-to-human handovers. Int J Soc Robot 7(5):783–798. doi:10.1007/s12369-015-0305-z
Additional information
This work has been partially supported by the Spanish MINECO Projects DPI2011-22471, DPI2013-40882-P and DPI2014-57757-R, the Spanish predoctoral Grant BES-2012-054899, and the Japanese challenging exploratory research Grant 15K12124. The authors would like to thank the members of the GVLab for their invaluable support with the translations and attending the local participants during the user study.
Electronic supplementary material
Appendices
Appendix 1
The emotional conveyance algorithm: task priority proof
Following the work in [5, 14], a proof of the task prioritization of the proposed solution is given below, showing that the execution of the lower priority tasks does not affect the execution of the higher priority tasks.
The following identity will be used:
$$ A\,A^+A = A \qquad\qquad (9) $$
where \(A^+\) is the Moore–Penrose pseudoinverse of A.
Given an idempotent matrix B, that is, \(B = B^2\), which is also Hermitian, \(B = B^*\) in general (with \(B^*\) the conjugate transpose of B) and \(B = B^T\) in particular for this work, then, for any matrix A,
$$ (A\,B)^+ = B\,(A\,B)^+. \qquad\qquad (10) $$
Given a matrix C, \(D = I - C^+C\) is the orthogonal projector onto the kernel of C, and is thus idempotent and Hermitian. So, in light of (10), for any matrix J:
$$ (J\,D)^+ = D\,(J\,D)^+. \qquad\qquad (11) $$
Given a task \(x_i\) of the robot defined as a function of its configuration q,
$$ x_i = f_i(q), $$
differentiating and applying the chain rule,
$$ \dot{x}_i = \frac{\partial f_i}{\partial q}\,\dot{q} = J_i\,\dot{q}, $$
a mapping between the velocity of task i and the joint velocities is obtained, with \(J_i\) the task Jacobian.
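As an illustration (not taken from the paper), the differential mapping \(\dot{x}_i = J_i \, \dot{q}\) can be checked numerically with a finite difference. The planar two-link arm, its link lengths, and the function names below are assumptions made only for this sketch:

```python
import numpy as np

# Hypothetical task: end-effector position of a planar 2-link arm
L1, L2 = 1.0, 0.8  # assumed link lengths

def f(q):
    # forward kinematics x_i = f_i(q)
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    # analytic Jacobian J_i = d f_i / d q
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

q  = np.array([0.3, -0.7])   # configuration
dq = np.array([0.1,  0.4])   # joint velocities
dt = 1e-6

# finite-difference check of xdot = J(q) @ qdot
xdot_fd = (f(q + dt * dq) - f(q)) / dt
xdot_J  = jacobian(q) @ dq
print(np.allclose(xdot_fd, xdot_J, atol=1e-5))  # True
```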
Similarly, the differential mapping of the main task t, as defined in Sect. 3.4, becomes \(\dot{x}_t = J_t \, \dot{q}\). The joint velocities \(\dot{q}\) as defined in this work in (8) are restated in (12); thus, substituting (12) into \(\dot{x}_t = J_t \, \dot{q}\), it is obtained:
Using (11) in \((J_h P_t)^+\) and rearranging terms, the expression can be transformed into
and now, using (9) in \(J_t \, P_t\),
$$ J_t\,P_t = J_t\left(I - J_t^+J_t\right) = J_t - J_t\,J_t^+J_t = 0, $$
it becomes \(\dot{x}_t = J_t \, J_t^+ e_t\). This expression shows that the emotional tasks h and m do not affect the execution of the main task t.
Similarly, for the task h corresponding to the gaze, the differential mapping \(\dot{x}_h = J_h \, \dot{q}\) becomes \(\dot{x}_h = J_h \, J_t^+ e_t + J_h \, P_t \, J_h^+ e_h\). This expression shows that the second priority task h is affected only by the higher priority task t.
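The projector identities underlying this proof can be verified numerically. The sketch below uses random Jacobians and a simplified two-task velocity command in which the gaze term is projected into the null space of the main task; it is a minimal check of the null-space mechanism, not the paper's full stack of tasks in (8):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7                                 # assumed number of DOF
J_t = rng.standard_normal((3, n))     # main task Jacobian
J_h = rng.standard_normal((2, n))     # gaze task Jacobian

# null-space projector of the main task t
P_t = np.eye(n) - np.linalg.pinv(J_t) @ J_t

# consequence of (9): J_t P_t = 0
print(np.allclose(J_t @ P_t, 0))                       # True

# identity (11): (J_h P_t)^+ = P_t (J_h P_t)^+
JhPt_pinv = np.linalg.pinv(J_h @ P_t)
print(np.allclose(JhPt_pinv, P_t @ JhPt_pinv))         # True

# simplified two-task command: the h term lives in ker(J_t),
# so it cannot disturb the main task velocity x_t_dot
e_t = rng.standard_normal(3)
e_h = rng.standard_normal(2)
dq = np.linalg.pinv(J_t) @ e_t + P_t @ np.linalg.pinv(J_h) @ e_h
print(np.allclose(J_t @ dq, J_t @ np.linalg.pinv(J_t) @ e_t))  # True
```

Whatever is added through \(P_t\) is annihilated by \(J_t\), which is exactly why the lower priority tasks leave \(\dot{x}_t\) untouched.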
Appendix 2
Implementation values
The angular velocity \(\omega \) in (3) has been set to \(\omega = 2.79\) rad/s. One term has been used in (5) with \(n_J = 1\), \(a_1 = b_1 = 0.25\) rad and \(\omega _1 = 12.57\) rad/s.
Following the convention of Sect. 2 the values \(\Theta _{0_i}\), \(\Theta _{E_{0i}}\) and \(h_i\) implemented in (3) can be seen in Table 4.
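For concreteness, a jerkiness signal built from these values can be sketched. The single-harmonic sinusoidal form below is an assumption, since (5) itself is not reproduced in this appendix; only the reported constants \(a_1 = b_1 = 0.25\) rad and \(\omega_1 = 12.57\) rad/s come from the text:

```python
import numpy as np

# Reported values for one term (n_J = 1) of the jerkiness series
a1, b1, w1 = 0.25, 0.25, 12.57   # rad, rad, rad/s

def jerkiness(t):
    # hypothetical form of one term of (5): a_1 sin(w_1 t) + b_1 cos(w_1 t)
    return a1 * np.sin(w1 * t) + b1 * np.cos(w1 * t)

t = np.linspace(0.0, 1.0, 1000)  # one second, ~2 periods at 12.57 rad/s
offset = jerkiness(t)

# peak amplitude is bounded by sqrt(a1^2 + b1^2) ~ 0.354 rad
print(round(float(np.max(np.abs(offset))), 3))
```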
Cite this article
Claret, JA., Venture, G. & Basañez, L. Exploiting the Robot Kinematic Redundancy for Emotion Conveyance to Humans as a Lower Priority Task. Int J of Soc Robotics 9, 277–292 (2017). https://doi.org/10.1007/s12369-016-0387-2