
Towards Engagement Models that Consider Individual Factors in HRI: On the Relation of Extroversion and Negative Attitude Towards Robots to Gaze and Speech During a Human–Robot Assembly Task

Experiments with the iCub humanoid

International Journal of Social Robotics

Abstract

Estimating engagement is critical for human–robot interaction. Engagement measures typically rely on the dynamics of the social signals exchanged by the partners, especially speech and gaze. However, the dynamics of these signals are likely to be influenced by individual and social factors, such as personality traits, which are well documented to critically influence how two humans interact with each other. Here, we assess the influence of two factors, namely extroversion and negative attitude towards robots, on speech and gaze during a cooperative task in which a human must physically manipulate a robot to assemble an object. We evaluate whether the extroversion and negative attitude towards robots scores co-vary with the duration and frequency of gaze and speech cues. The experiments were carried out with the humanoid robot iCub and N = 56 adult participants. We found that the more extroverted people are, the more and the longer they tend to talk with the robot; and the more negative their attitude towards robots, the less they look at the robot's face and the more they look at the robot's hands, where the assembly and the contacts occur. Our results confirm and provide evidence that the engagement models classically used in human–robot interaction should take attitudes and personality traits into account.


Notes

  1. In social psychology, there is a clear distinction between personality traits and attitudes. Here, we use methods from differential psychology rather than social psychology: the distinction between the two is not important, as long as the two factors are characteristics of the individual that are evaluated at a certain time prior to the interaction. We measured the attitude towards robots with the NARS questionnaire, a test that was created to capture the projected anxiety of a person before their interaction with the robot. We used it to evaluate an individual attitude prior to the direct interaction with the robot (participants filled in the NARS questionnaire several days before the experiment; see details about the experimental procedure in Sect. 4.4).

  2. See for example the press article: “Will workplace robots cost more jobs than they create?” http://www.bbc.com/news/technology-27995372.

  3. We interviewed our participants after the experiments. Some reported that they “do not like robots because they are going to take our jobs”. Some reported to have enjoyed the experiment with the robot and made explicit reference to their expectations being influenced by “the robots of Star Wars”.

  4. In social psychology, there is a clear distinction between personality traits and attitudes. Here, we use methods from differential psychology rather than social psychology. We measured the attitude towards robots with the NARS questionnaire, a test that was created to capture the projected anxiety of a person before their interaction with the robot. We used it to evaluate an individual attitude prior to the direct interaction with the robot (participants filled in the NARS questionnaire several days before the experiment; see details about the experimental procedure in Sect. 4.4).

  5. http://www.loria.fr/~sivaldi/edhhi.htm.

  6. We cannot report the questions, as the questionnaire is not publicly available: we refer the interested reader to the English manual [12] and the official French adaptation that we used [13].

  7. A recent paper by Dinet and Vivian [16] studied the NARS and validated it on a sample of the French population. Their study was published only after our work and experiments. They employed their own translation of the questionnaire, which has some slight differences from ours, mostly due to nuances of the French language that do not preserve the original meaning when translated back into English. In their paper there is no mention of a double-translation mechanism for validating the French adaptation of the questionnaire.

  8. This was done as a safety measure. However, nothing happened during the experiments: the experimenter never had to push the safety button, and she never had to stop the physical interaction between the robot and the subject.

  9. It is a dissemination video from IIT showing the iCub, available on YouTube: http://youtu.be/ZcTwO2dpX8A.

  10. In the post-experiment interview, we asked the participants if they thought or had the impression that the robot was controlled by someone: all the participants thought that the robot was fully autonomous.

  11. The demonstration was also part of the safety measures required by the Ethics Committee to approve our protocol.

  12. The operator could switch the control mode without the need for the verbal command, since he had direct visibility of the interaction zone through an additional camera centered on the workspace in front of the robot (see Fig. 2).

  13. Utterances are units of speech that begin and end with a pause. To determine the beginning and the end of each utterance, we consider pauses greater than 500 ms. (A minimal code sketch illustrating notes 13–16 is given after this list.)

  14. Correlation is frequently used to study the link between personality and behavior, as discussed in [18], a survey on the link between extroversion and behavior in which all the cited studies use correlations to test their hypotheses.

  15. According to the NEO-PI-R, a participant with a score greater than 137 is considered extroverted, while one with a score below 80 is considered introverted.

  16. According to the NARS, a score over 65 indicates a negative attitude towards robots, while a score below 35 indicates a rather positive attitude towards robots.

  17. Contact forces are the forces due to the physical interaction between the human and the robot, originating at the contact location where the interaction occurs.

  18. See download instructions at http://eris.liralab.it/wiki/UPMC_iCub_project/MACSi_Software.

  19. https://github.com/robotology/icub-basic-demos/tree/master/demoForceControl.
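
To make the measures and analysis described in notes 13–16 concrete, the following Python sketch segments speech into utterances using the 500 ms pause rule, correlates total speech time with extroversion scores, and interprets the questionnaire scores with the thresholds given above. The data and helper names are hypothetical and SciPy is assumed to be available; this is an illustration of the kind of analysis described, not the authors' actual code.

# Minimal sketch of the analysis steps described in notes 13-16.
# Data and helper names are illustrative assumptions, not the study's scripts.
from scipy import stats

PAUSE_THRESHOLD_S = 0.5  # note 13: pauses longer than 500 ms separate utterances

def segment_utterances(speech_intervals, pause_threshold=PAUSE_THRESHOLD_S):
    """Merge (start, end) speech intervals separated by pauses <= pause_threshold."""
    utterances = []
    for start, end in sorted(speech_intervals):
        if utterances and start - utterances[-1][1] <= pause_threshold:
            utterances[-1] = (utterances[-1][0], end)  # short pause: same utterance
        else:
            utterances.append((start, end))            # long pause: new utterance
    return utterances

def total_speech_time(speech_intervals):
    """Total speaking time in seconds, summed over utterances."""
    return sum(end - start for start, end in segment_utterances(speech_intervals))

def interpret_extroversion(score):
    """Note 15: NEO-PI-R Extroversion > 137 -> extroverted, < 80 -> introverted."""
    return "extroverted" if score > 137 else "introverted" if score < 80 else "intermediate"

def interpret_nars(score):
    """Note 16: NARS > 65 -> negative attitude, < 35 -> rather positive attitude."""
    return "negative" if score > 65 else "positive" if score < 35 else "neutral"

if __name__ == "__main__":
    # Hypothetical per-participant data: (extroversion score, speech intervals in seconds).
    participants = {
        "p01": (92,  [(0.0, 1.2), (1.5, 2.0), (6.0, 7.3)]),
        "p02": (135, [(0.0, 3.4), (3.7, 6.1), (8.0, 12.5)]),
        "p03": (78,  [(1.0, 1.8)]),
        "p04": (120, [(0.5, 2.9), (4.0, 6.2)]),
    }
    scores = [score for score, _ in participants.values()]
    durations = [total_speech_time(intervals) for _, intervals in participants.values()]

    # Note 14: correlation between an individual factor and a behavioural measure.
    rho, p = stats.spearmanr(scores, durations)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
    for pid, (score, _) in participants.items():
        print(pid, interpret_extroversion(score))
    print(interpret_nars(70))  # -> negative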

References

  1. Ajzen I, Madden TJ (1986) Prediction of goal-directed behavior: attitudes, intentions, and perceived behavioral control. J Exp Soc Psychol 22(5):453–474

  2. Ajzen I, Madden TJ (1986) Prediction of goal-directed behavior: attitudes, intentions, and perceived behavioral control. J Exp Soc Psychol 22(5):453–474

  3. Anzalone S, Yoshikawa Y, Ishiguro H, Menegatti E, Pagello E, Sorbello R (2012) Towards partners profiling in human robot interaction contexts. In: Noda I, Ando N, Brugali D, Kuffner J (eds) Simulation, modeling, and programming for autonomous robots. Lecture Notes in Computer Science, vol 7628. Springer, Berlin, pp 4–15

  4. Anzalone SM, Boucenna S, Ivaldi S, Chetouani M (2015) Evaluating the engagement with social robots. Int J Soc Robot 7(4):465–478

  5. Argyle M (1976) Personality and social behaviour. Blackwell, Oxford

  6. Ba S, Odobez JM (2009) Recognizing visual focus of attention from head pose in natural meetings. IEEE Trans Syst Man Cybernet B 39(1):16–33

  7. Beatty M, McCroskey J, Valencic K (2001) The biology of communication: a communibiological perspective. Hampton Press, Cresskill

  8. Berry D, Hansen J (2000) Personality, nonverbal behavior, and interaction quality in female dyads. Pers Soc Psychol Bull 26(3):278–292

  9. Boucenna S, Anzalone S, Tilmont E, Cohen D, Chetouani M (2014) Learning of social signatures through imitation game between a robot and a human partner. IEEE Trans Auton Mental Dev 6(3):213–225

  10. Castellano G, Pereira A, Leite I, Paiva A, McOwan PW (2009) Detecting user engagement with a robot companion using task and social interaction-based features. In: Proceedings of the 2009 international conference on multimodal interfaces, pp 119–126

  11. Chen T, King CH, Thomaz A, Kemp C (2014) An investigation of responses to robot-initiated touch in a nursing context. Int J Soc Robot 6(1):141–161

  12. Costa P, McCrae R (1992) Revised NEO personality inventory (NEO-PI-R) and NEO five-factor inventory (NEO-FFI) professional manual. Psychological Assessment Resources, Odessa

  13. Costa P, McCrae R, Rolland J (1998) NEO-PI-R. Inventaire de Personnalité révisé, Editions du Centre de Psychologie Appliquée, Paris

  14. Dang TH, Tapus A (2014) Towards personality-based assistance in human-machine interaction. In: Proceedings of IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)

  15. Dewaele JM, Furnham A (2000) Personality and speech production: a pilot study of second language learners. Pers Individ Differ 28(2):355–365

  16. Dinet J, Vivian R (2015) Perception and attitudes towards anthropomorphic robots in France: validation of an assessment scale. Psychologie Française 60(1):173–189

  17. Eysenck HJ (1981) A model for personality (chapter: General features of the model). Springer-Verlag, New York

  18. France BHL, Heisel AD, Beatty MJ (2004) Is there empirical evidence for a nonverbal profile of extraversion? A meta-analysis and critique of the literature. Commun Monogr 71(1):28–48

  19. Fumagalli M, Ivaldi S, Randazzo M, Natale L, Metta G, Sandini G, Nori F (2012) Force feedback exploiting tactile and proximal force/torque sensing: theory and implementation on the humanoid robot iCub. Auton Robots 4:381–398

  20. Gaudiello I, Zibetti E, Lefort S, Chetouani M, Ivaldi S (2016) Trust as indicator of robot functional and social acceptance: an experimental study on user conformation to iCub answers. Comput Hum Behav 61:633–655. doi:10.1016/j.chb.2016.03.057

  21. Goffman E (1967) Interaction ritual: essays on face-to-face behavior. Anchor Books, New York

  22. Goldberg L (1990) An alternative description of personality: the big-five factor structure. J Pers Soc Psychol 59:1216–1229

  23. Gray K, Wegner D (2012) Feeling robots and human zombies: mind perception and the uncanny valley. Cognition 125(1):125–130

  24. Hanninen L, Pastell M (2009) Cowlog: open-source software for coding behaviors from digital video. Behav Res Methods 41(2):472–476

  25. Huang CM, Thomaz A (2011) Effects of responding to, initiating and ensuring joint attention in human–robot interaction. In: IEEE RO-MAN, pp 65–71

  26. Ishii R, Shinohara Y, Nakano T, Nishida T (2011) Combining multiple types of eye-gaze information to predict user’s conversational engagement. In: 2nd workshop on eye gaze in intelligent human machine interaction, pp 1–8

  27. Ivaldi S, Anzalone S, Rousseau W, Sigaud O, Chetouani M (2014) Robot initiative in a team learning task increases the rhythm of interaction but not the perceived engagement. Front Neurorobot 8(5):1–23

  28. Ivaldi S, Nguyen SM, Lyubova N, Droniou A, Padois V, Filliat D, Oudeyer PY, Sigaud O (2014) Object learning through active exploration. IEEE Trans Auton Mental Dev 6(1):56–72

  29. Le Maitre J, Chetouani M (2013) Self-talk discrimination in human-robot interaction situations for supporting social awareness. Int J Soc Robot 5(2):277–289

  30. Lepri B, Subramanian R, Kalimeri K, Staiano J, Pianesi F, Sebe N (2010) Employing social gaze and speaking activity for automatic determination of the extraversion trait. In: International conference on multimodal interfaces and the workshop on machine learning for multimodal interaction, pp 1–7

  31. Mara M, Appel M (2015) Science fiction reduces the eeriness of android robots: a field experiment. Comput Hum Behav 48(1):156–162

  32. Mohammadi G, Vinciarelli A (2012) Automatic personality perception: prediction of trait attribution based on prosodic features. IEEE Trans Affect Comput 3(3):273–284

  33. Mori M, MacDorman K, Kageki N (2012) The uncanny valley (from the field). IEEE Robot Autom Mag 19(2):98–100

  34. Natale L, Nori F, Metta G, Fumagalli M, Ivaldi S, Pattacini U, Randazzo M, Schmitz A, Sandini G (2013) The iCub platform: a tool for studying intrinsically motivated learning. Springer, Berlin

  35. Nomura T, Kanda T, Suzuki T (2006) Experimental investigation into influence of negative attitudes toward robots on human-robot interaction. AI Soc 20(2):138–150

  36. Nomura T, Kanda T, Suzuki T, Kato K (2006) Exploratory investigation into influence of negative attitudes toward robots on human-robot interaction. In: Lazinica A (ed) Mobile robots: towards new applications

  37. Nomura T, Kanda T, Suzuki T, Kato K (2008) Prediction of human behavior in human-robot interaction using psychological scales for anxiety and negative attitudes toward robots. IEEE Trans Robot 24(2):442–451

  38. Pianesi F, Mana N, Cappelletti A, Lepri B, Zancanaro M (2008) Multimodal recognition of personality traits in social interactions. Proceedings of the 10th International Conference on Multimodal Interfaces, ICMI ’08. ACM, New York, pp 53–60

  39. Poggi I, D’Errico F (2012) Social signals: a framework in terms of goals and beliefs. Cogn Process 13(2):427–445

  40. Rahbar F, Anzalone S, Varni G, Zibetti E, Ivaldi S, Chetouani M (2015) Predicting extraversion from non-verbal features during a face-to-face human-robot interaction. In: International Conference on Social Robotics, pp 1–10

  41. Rauthmann J, Seubert C, Sachse P, Furtner M (2012) Eyes as windows to the soul: gazing behavior is related to personality. J Res Pers 46(2):147–156

  42. Rich C, Ponsler B, Holroyd A, Sidner CL (2010) Recognizing engagement in human–robot interaction. In: Proceedings of ACM/IEEE international conference on human–robot interaction (HRI), pp 375–382

  43. Sanghvi J, Castellano G, Leite I, Pereira A, McOwan PW, Paiva A (2011) Automatic analysis of affective postures and body motion to detect engagement with a game companion. In: 6th ACM/IEEE international conference on human–robot interaction (HRI), pp 305–311

  44. Saygin A, Chaminade T, Ishiguro H, Driver J, Frith C (2012) The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Soc Cogn Affect Neurosci 7(4):413–422

  45. Scherer K, Scherer U (1981) Speech behavior and personality. In: Speech evaluation in psychiatry. Grune and Stratton

  46. Sidner C, Kidd C, Lee C, Lesh N (2004) Where to look: a study of human-robot engagement. In: Proceedings 9th International Conference on Intelligent User Interfaces, pp 78–84

  47. Sidner C, Lee C, Kidd C, Lesh N (2005) Explorations in engagement for humans and robots. Artif Intell 1(166):140–164

  48. Sidner CL, Lee C, Morency LP, Forlines C (2006) The effect of head-nod recognition in human-robot conversation. In: Proceedings of 1st ACM SIGCHI/SIGART conference on human–robot interaction, pp 290–296

  49. Stefanov N, Peer A, Buss M (2009) Role determination in human–human interaction. In: 3rd Joint EuroHaptics conference and world haptics, pp 51–56

  50. Takayama L, Pantofaru C (2009) Influences on proxemic behaviors in human–robot interaction. In: Proceedings of IEEE-RAS international conference on intelligent robots and systems

  51. Tapus A, Ţǎpuş C, Matarić MJ (2008) User-robot personality matching and assistive robot behavior adaptation for post-stroke rehabilitation therapy. Intell Serv Robot 1(2):169–183

  52. Tapus A, Matarić MJ (2008) Socially assistive robots: the link between personality, empathy, physiological signals, and task performance. In: AAAI spring symposium on emotion, personality, and social behavior, pp 133–140

  53. Vinciarelli A, Mohammadi G (2014) A survey of personality computing. IEEE Trans Affect Comput 5(3):273–291

  54. Wilcox R, Nikolaidis S, Shah JA (2012) Optimization of temporal dynamics for adaptive human-robot interaction in assembly manufacturing. In: Robotics: science and systems

  55. Wood W (2000) Attitude change: persuasion and social influence. Annu Rev Psychol 51:539–570

  56. Wu D, Bischof W, Anderson N, Jakobsen T, Kingstone A (2014) The influence of personality on social attention. Pers Individ Differ 60:25–29

  57. Yuichi I (1992) Extraversion, introversion, and visual interaction. Percept Mot Skills 74(1):43–50

  58. Zen G, Lepri B, Ricci E, Lanz O (2010) Space speaks: towards socially and personality aware visual surveillance. In: Proceedings of the 1st ACM international workshop on Multimodal pervasive video analysis, pp 37–42

Acknowledgments

The authors wish to thank Charles Ballarini for his contribution to the software and experiments, and Salvatore Anzalone and Ilaria Gaudiello for their contribution to the design of the experimental protocol. This work was performed within the Project EDHHI of Labex SMART (ANR-11-LABX-65), supported by French state funds managed by the ANR within the Investissements d’Avenir programme under reference ANR-11-IDEX-0004-02. The work was partially supported by the FP7 EU project CoDyCo (No. 600716 ICT 2011.2.1 Cognitive Systems and Robotics).

Author information

Corresponding author

Correspondence to Serena Ivaldi.

Appendices

Appendix 1: Questionnaire for Negative Attitude Towards Robots (NARS)

See Table 6 for the questions in English and French.

Table 6 NARS questionnaire for evaluating the negative attitude towards robots

Appendix 2: Questionnaire for Post-experimental Evaluation of the Assembly Task

See Table 7 for the questions in English and French.

Table 7 Post-experimental questionnaire for evaluating the perception and interaction with the iCub in the assembly task of this work

Appendix 3: Software for Operating the Robot

The WoZ GUI was organized in several tabs, each dedicated to a specific task, such as controlling the robot movements (gaze, hand movements, posture), its speech, its facial expressions, etc. The GUI events are handled by the actionServer module and others developed by the authors in previous studies [27, 28]. All the developed software is open source (Footnote 18).
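
As an illustration of how a WoZ interface of this kind can map GUI events to pre-defined robot commands, the following minimal Python sketch shows a dispatch table; the button identifiers, command strings and the send_command callback are illustrative assumptions, not the actual actionServer interface.

# Simplified sketch of a WoZ-style dispatch from GUI button events to robot commands.
# The button identifiers, command strings and the send_command callback are
# illustrative assumptions, not the actual actionServer interface.

PREDEFINED_COMMANDS = {
    "btn_look_left":  "gaze rel -10 0 0",
    "btn_look_right": "gaze rel +10 0 0",
    "btn_grasp":      "grasp power",
    "btn_home":       "posture home",
}

def on_button_clicked(button_id, send_command):
    """Forward a GUI button press to the robot command channel."""
    command = PREDEFINED_COMMANDS.get(button_id)
    if command is None:
        raise ValueError(f"no command bound to {button_id!r}")
    send_command(command)

if __name__ == "__main__":
    on_button_clicked("btn_grasp", send_command=print)  # prints: grasp power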

Fig. 11

WoZ GUI. a The tab dedicated to the quick control of gaze, grasps and hand movements in Cartesian space. The buttons send pre-defined commands to the actionsServer module, developed in [28]. The buttons of the bottom row allow the operator to bring the robot into pre-defined postures (whole-body joint configurations): they were pre-programmed to simplify the control of the iCub during the experiments, in case the operator had to “bring it back” to a pre-defined configuration that could simplify the interaction for the participants. They were also useful for prototyping and testing the experiments. b The part of the GUI dedicated to switching the control mode of the arms: position, zero-torque, and impedance control with low, medium and high stiffness

Fig. 12

WoZ GUI. a The tab related to the robot’s speech. The operator can choose from a list of pre-defined sentences and expressions, or type a new sentence on the fly, so as to be able to quickly formulate an answer to an unexpected request from the participant. The operator can switch between French and English speech (at the moment, the only two supported languages), although in the experiments of this paper the robot always spoke French. b The tab related to facial expressions. The list of facial expressions, along with their specific realization on the iCub face (the combination of activated LEDs in the eyelids and mouth), is loaded from a configuration file
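
As an illustration of how such an expression table can be stored and loaded, the following minimal Python sketch parses a small configuration; the file format, keys and LED encoding are hypothetical and do not reproduce the actual iCub configuration file.

# Hypothetical sketch of loading a facial-expression table from a configuration file.
# The JSON format, keys and LED encoding are illustrative; the actual iCub
# configuration file used in the experiments differs.
import json

EXAMPLE_CONFIG = """
{
  "happy":   {"eyelids": "raised", "mouth_leds": [1, 1, 1, 1, 0, 0]},
  "neutral": {"eyelids": "mid",    "mouth_leds": [0, 1, 1, 1, 1, 0]},
  "sad":     {"eyelids": "low",    "mouth_leds": [0, 0, 1, 1, 0, 0]}
}
"""

def load_expressions(config_text):
    """Parse the expression table: expression name -> LED/eyelid realization."""
    return json.loads(config_text)

if __name__ == "__main__":
    expressions = load_expressions(EXAMPLE_CONFIG)
    print(sorted(expressions))                 # -> ['happy', 'neutral', 'sad']
    print(expressions["happy"]["mouth_leds"])  # -> [1, 1, 1, 1, 0, 0]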

Figure 11a shows the tab related to the control of head gaze and hand movements. It is designed to control the gaze direction in Cartesian space, with relative movements with respect to the fixation position (joints at zero degrees in both eyes and neck). The hands can be quickly controlled through a list of available pre-defined grasps, plus primitives for rotating the palm orientation (towards the ground, skywards, facing each other). It is also possible to control the hand position and orientation in Cartesian space, providing relative movements with respect to the current position, expressed in the Cartesian base frame of the robot (origin at the base of the torso, x-axis pointing backward, y-axis pointing towards the right side of the robot, and z-axis pointing towards the robot head). Some buttons allow the operator to control the whole posture of the robot and bring it back to pre-defined configurations.

Figure 11b shows the part of the GUI dedicated to switching the control mode of the arms: position, zero-torque, and impedance with high, medium and low stiffness. The default values of the demoForceControl module (Footnote 19) for stiffness and damping were used. During the experiments, the arms were controlled in the “medium compliance” impedance mode, which allows the robot to exhibit good compliance in case of unexpected contacts with the human participant. When the participant grabbed the robot arms to start the teaching movement, the operator switched the control to zero-torque, which made the arms move under the effect of the human guidance.

Figure 12a shows the tab related to the robot’s speech. It is designed to quickly choose one among a list of pre-defined sentences and expressions, in one of the supported languages (currently French or English). It is also possible to generate new sentences, typed on the fly by the operator, so that the operator can quickly formulate an answer to an unexpected request from the participant. The operator can switch between the supported languages, although in the experiments of this paper the robot always spoke French (as all the participants were native French speakers). The text-to-speech is generated with the Festival library for English and with the Pico library for French.

Figure 12b shows the tab related to facial expressions. The list of facial expressions, along with their specific realization on the iCub face (the combination of activated LEDs in the eyelids and mouth), is loaded from a configuration file that was designed by the experimenter.
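
To make the mode-switching logic concrete, the following minimal Python sketch models the operator-driven control-mode changes described above; the stiffness and damping values are placeholders and the class is illustrative, not the demoForceControl module or the iCub control interfaces.

# Sketch of the arm control-mode switching logic described above.
# The stiffness/damping values are placeholders, not the defaults of the
# demoForceControl module mentioned in the paper.

IMPEDANCE_PRESETS = {
    "low":    {"stiffness": 0.1, "damping": 0.01},
    "medium": {"stiffness": 0.3, "damping": 0.03},
    "high":   {"stiffness": 0.6, "damping": 0.06},
}

class ArmController:
    """Tracks the active control mode of one arm, as switched by the operator."""

    def __init__(self):
        self.mode = "position"      # assumed start-up mode
        self.impedance = None

    def set_position_mode(self):
        self.mode, self.impedance = "position", None

    def set_zero_torque_mode(self):
        # Used while the participant physically guides the arm (kinesthetic teaching).
        self.mode, self.impedance = "zero-torque", None

    def set_impedance_mode(self, level):
        # "medium" compliance was the mode used during normal interaction.
        self.mode = "impedance"
        self.impedance = IMPEDANCE_PRESETS[level]

if __name__ == "__main__":
    arm = ArmController()
    arm.set_impedance_mode("medium")   # compliant behaviour for unexpected contacts
    arm.set_zero_torque_mode()         # participant grabs the arm for guidance
    print(arm.mode, arm.impedance)     # -> zero-torque None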

Cite this article

Ivaldi, S., Lefort, S., Peters, J. et al. Towards Engagement Models that Consider Individual Factors in HRI: On the Relation of Extroversion and Negative Attitude Towards Robots to Gaze and Speech During a Human–Robot Assembly Task. Int J of Soc Robotics 9, 63–86 (2017). https://doi.org/10.1007/s12369-016-0357-8
