International Journal of Social Robotics, Volume 9, Issue 1, pp 63–86

Towards Engagement Models that Consider Individual Factors in HRI: On the Relation of Extroversion and Negative Attitude Towards Robots to Gaze and Speech During a Human–Robot Assembly Task

Experiments with the iCub humanoid
  • Serena Ivaldi
  • Sébastien Lefort
  • Jan Peters
  • Mohamed Chetouani
  • Joëlle Provasi
  • Elisabetta Zibetti

Abstract

Estimating engagement is critical for human–robot interaction. Engagement measures typically rely on the dynamics of the social signals exchanged by the partners, especially speech and gaze. However, the dynamics of these signals are likely to be influenced by individual and social factors, such as personality traits, which are well documented to critically influence how two humans interact with each other. Here, we assess the influence of two such factors, extroversion and negative attitude towards robots, on speech and gaze during a cooperative task in which a human must physically manipulate a robot to assemble an object. We evaluate whether the extroversion and negative-attitude scores co-vary with the duration and frequency of gaze and speech cues. The experiments were carried out with the humanoid robot iCub and N = 56 adult participants. We found that the more extroverted people are, the more and the longer they tend to talk with the robot; and the more negative their attitude towards robots, the less they look at the robot's face and the more they look at the robot's hands, where the assembly and the contacts occur. Our results provide evidence that the engagement models classically used in human–robot interaction should take attitudes and personality traits into account.
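
For illustration only: the co-variation analysis described above amounts to correlating questionnaire scores with behavioural measures extracted from annotated recordings. A minimal Python sketch of such an analysis follows; the data, column names, and score values are hypothetical placeholders, not the authors' actual dataset or statistical pipeline.

```python
# Illustrative sketch: test whether personality/attitude scores co-vary with
# gaze and speech measures, in the spirit of the analysis described above.
# All data and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

# One row per participant: questionnaire scores plus behavioural measures
# extracted from video annotations (durations in seconds).
df = pd.DataFrame({
    "extroversion":       [25, 32, 41, 28, 37],   # hypothetical trait scores
    "negative_attitude":  [48, 55, 39, 61, 44],   # hypothetical NARS-like scores
    "speech_duration":    [12.4, 18.9, 25.1, 14.0, 21.7],
    "gaze_face_duration": [30.2, 22.5, 35.8, 15.1, 27.9],
})

# Spearman rank correlation is a common choice when the scores cannot be
# assumed to be normally distributed.
for score in ("extroversion", "negative_attitude"):
    for behaviour in ("speech_duration", "gaze_face_duration"):
        rho, p = stats.spearmanr(df[score], df[behaviour])
        print(f"{score} vs {behaviour}: rho={rho:.2f}, p={p:.3f}")
```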

Keywords

Human–robot interaction · Social signals · Engagement · Personality


Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  1. Inria, Villers-lès-Nancy, France
  2. Loria, CNRS & Université de Lorraine, UMR 7503, Vandoeuvre-lès-Nancy, France
  3. Intelligent Autonomous Systems, TU Darmstadt, Darmstadt, Germany
  4. LIP6, Paris, France
  5. Max Planck Institute for Intelligent Systems, Stuttgart, Germany
  6. CNRS & Sorbonne Universités, UPMC Université Paris 06, Institut des Systèmes Intelligents et de Robotique (ISIR), UMR 7222, Paris, France
  7. CHARt-Lutin, Université Paris 8, Paris, France
