
How does Modality Matter? Investigating the Synthesis and Effects of Multi-modal Robot Behavior on Social Intelligence

Published in: International Journal of Social Robotics

Abstract

Multi-modal behavior is crucial for a social robot’s perceived social intelligence, its ability to communicate nonverbally, and the extent to which it can be trusted. However, most research to date has examined one modality at a time, so the effect of each individual modality when performed within a multi-modal interaction remains poorly understood. This study presents a multi-modal interaction built from the following modalities: proxemics for social navigation; gaze mechanisms for turn-taking (floor-holding and turn-yielding) and joint attention; kinesics for symbolic, deictic, and beat gestures; and social dialogue. The multi-modal behaviors were evaluated in an experiment with 105 participants in a seven-minute interaction, analyzing their effects on perceived social intelligence through both objective and subjective measurements. The results offer several insights into how modalities in a multi-modal interaction affect users’ behavioral outcomes, including acceptance of physical suggestions, distances maintained during the interaction, wave gestures performed at greeting and closing, back-channeling, and how socially the robot is treated, while showing no effect on self-disclosure or subjective liking.
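To make the coordination problem concrete, the following is a minimal, hypothetical Python sketch of how such modalities might be scheduled around a robot’s speech. It is not the authors’ implementation (their code is linked in note 2 below); every class, method, and parameter name here is invented for illustration.

    # Hypothetical sketch of a multi-modal behavior coordinator, loosely
    # inspired by the modalities studied in the paper: proxemics, gaze,
    # kinesics (gestures), and social dialogue. All names are invented.
    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional


    class GazeState(Enum):
        FLOOR_HOLDING = auto()    # averted glances while the robot speaks
        TURN_YIELDING = auto()    # sustained gaze to hand the turn over
        JOINT_ATTENTION = auto()  # gaze toward a shared object of interest


    @dataclass
    class Utterance:
        text: str
        gesture: Optional[str] = None  # e.g. "symbolic", "deictic", "beat"


    class MultiModalBehavior:
        """Decides which modality fires at each point in the interaction."""

        def __init__(self, social_distance_m: float = 1.2):
            # Personal/social distance in Hall's proxemics terms.
            self.social_distance_m = social_distance_m
            self.gaze = GazeState.FLOOR_HOLDING

        def approach(self, current_distance_m: float) -> float:
            """Distance the robot may still advance before it should stop."""
            return max(current_distance_m - self.social_distance_m, 0.0)

        def speak(self, utterance: Utterance) -> None:
            # Hold the floor with averted gaze while speaking ...
            self.gaze = GazeState.FLOOR_HOLDING
            print(f"[say] {utterance.text}")
            if utterance.gesture:
                print(f"[gesture] {utterance.gesture}")
            # ... then yield the turn by gazing back at the user.
            self.gaze = GazeState.TURN_YIELDING


    if __name__ == "__main__":
        robot = MultiModalBehavior()
        print(f"advance {robot.approach(2.0):.1f} m, then stop")
        robot.speak(Utterance("Would you like to sit down?", gesture="deictic"))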


Notes

  1. Further subjective measurements, comparing self-reported attitudes with behaviors towards social robots, will be examined elsewhere.

  2. Multi-modal Social Cues System Implementation GitHub Repository https://github.com/KarenTatarian/multimodal_socialcues.


Acknowledgements

The authors would like to thank the team at INSEAD for assistance with data collection, as well as Hugues Pellerin for advice on the statistical analyses. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No. 765955.


Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Tatarian, K., Stower, R., Rudaz, D. et al. How does Modality Matter? Investigating the Synthesis and Effects of Multi-modal Robot Behavior on Social Intelligence. Int J Soc Robot 14, 893–911 (2022). https://doi.org/10.1007/s12369-021-00839-w

