Social Robotics for Nonsocial Teleoperation: Leveraging Social Techniques to Impact Teleoperator Performance and Experience

  • Service and Interactive Robotics (A Tapus, Section Editor)

Abstract

Purpose of Review

Research has demonstrated the potential for robotic interfaces to leverage human-like social interaction techniques; for example, autonomous social robots can act as companions, serve as professional team members, or stand in as social proxies in robot telepresence. We propose that there is an untapped opportunity to extend the benefits of social robotics to more traditional teleoperation, where the robot does not typically communicate with the operator socially. We argue that teleoperated robots can and should leverage social techniques to shape interactions with the operator, even in use cases such as remote exploration or inspection that do not involve using the robot to communicate with other people.

Recent Findings

The core benefit of social robotics is to leverage human-like and thus familiar social techniques to communicate effectively or shape people’s mood and behavior. Initial results provide proofs of concept for similar benefits of social techniques applied to more traditional teleoperation; for example, we can design teleoperated robots as social agents to facilitate communication or to shape operator behavior, or teleoperated robots can leverage knowledge of operator psychology to change perceptions, potentially improving operation safety and performance.
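
To make this concrete, the short Python sketch below (ours, for illustration only, not from the paper; every name, expression label, and threshold is a hypothetical assumption) shows one way a teleoperation interface could translate raw robot telemetry into the affective expression of an on-screen social agent, in the spirit of the affective-feedback and state-summary work cited in the references (e.g., [2, 35]).

    # Hypothetical sketch: map robot telemetry to a social expression for an
    # on-screen companion agent in a teleoperation interface. Names and
    # thresholds are invented for illustration and do not come from the paper.
    from dataclasses import dataclass

    @dataclass
    class Telemetry:
        battery_pct: float          # remaining battery, 0..100
        min_obstacle_dist_m: float  # distance to the nearest obstacle, in metres
        recent_collisions: int      # collisions detected in the last few seconds

    def agent_expression(t: Telemetry) -> str:
        """Choose a social expression for the companion agent from robot state."""
        if t.recent_collisions > 0 or t.min_obstacle_dist_m < 0.3:
            return "alarmed"   # urgent state: a strong social cue nudges the operator toward caution
        if t.battery_pct < 20:
            return "tired"     # low battery conveyed as fatigue rather than a raw percentage
        if t.min_obstacle_dist_m < 1.0:
            return "nervous"   # mild concern while navigating close to obstacles
        return "content"       # nominal operation

    if __name__ == "__main__":
        # Example: healthy battery, but the robot is 0.6 m from an obstacle.
        print(agent_expression(Telemetry(85.0, 0.6, 0)))  # prints "nervous"

The specific thresholds matter less than the mapping itself: rather than asking the operator to parse raw numbers, the interface summarizes robot state through a familiar social channel that people read quickly and that can, in turn, shape operator mood and behavior.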

Summary

This paper provides a proposal and roadmap for leveraging social robotics techniques in more classical teleoperation interfaces.

References

Papers of particular interest, published recently, have been highlighted as: • Of importance •• Of major importance

  1. Young JE, Hawkins R, Sharlin E, Igarashi T. Toward acceptable domestic robots: applying insights from social psychology. Int J Soc Robot. 2009;1:95–108.


  2. •• Feldmaier J, Stimpfl M, Diepold K. Development of an emotion-competent SLAM agent. In: Human-Robot Interact: ACM Press; 2017. p. 1–9. An example of mechanical and algorithmic variables of a robot process being conveyed in a social way.

  3. Singh A, Young JE. A dog tail for utility robots: Exploring affective properties of tail movement. Lect Notes Comput Sci 8118 LNCS. 2013:403–19.

  4. • Sanoubari E, Seo SH, Garcha D, Young JE, Loureiro-Rodriguez V. Good robot design or Machiavellian? An in-the-wild robot leveraging minimal knowledge of Passersby’s culture. In: Human-Robot Interact: IEEE; 2019. p. 382–91. Demonstrates how social techniques in robotics can be manipulative.

  5. Nakagawa K, Shiomi M, Shinozawa K, Matsumura R, Ishiguro H, Hagita N. Effect of Robot’s whispering behavior on People’s motivation. Int J Soc Robot. 2013;5:5–16.


  6. Breazeal C, Kidd CD, Thomaz AL, Hoffman G, Berlin M. Effects of nonverbal communication on efficiency and robustness of human-robot teamwork. In: IEEE Int Conf Intell Robot Syst; 2005.


  7. Lee JR, Nass CI. Trust in computers: the computers-are-social-actors (CASA) paradigm and trustworthiness perception in human-computer communication. In: Trust Technol. a Ubiquitous Mod. Environ. Theor. Methodol. Perspect. IGI Global; 2010. p. 1–15.


  8. Kristoffersson A, Coradeschi S, Loutfi A. A review of mobile robotic telepresence. Adv Human-Computer Interact. 2013;2013:1–17.


  9. Tsui KM, Dalphond JM, Brooks DJ, Medvedev MS, McCann E, Allspaw J, et al. Accessible human-robot interaction for telepresence robots: a case study. Paladyn, J Behav Robot. 2015;6:1–29.

  10. •• Rea DJ. Now you’re teleoperating with power: learning from video games to improve teleoperation interfaces. University of Manitoba; 2020. A thesis demonstrating the similarity of video games and teleoperation, including suggesting how social techniques in games could be applicable to telerobotics.

  11. Steinfeld A, Fong T, Field M, Lewis M, Scholtz J, Schultz A (2006) Common metrics for human-robot interaction. Human-Robot Interact.


  12. Chen JYC, Haas EC, Barnes MJ. Human performance issues and user interface design for teleoperated robots. IEEE Trans Syst Man Cybern Part C Appl Rev. 2007;37:1231–45.

  13. Endsley MR. Designing for situation awareness: an approach to user-centered design. 2nd ed. CRC Press; 2016.

  14. Endsley MR. Design and evaluation for situation awareness enhancement. Proc Hum Factors Soc Annu Meet. 1988;32:97–101.


  15. Leeper A, Hsiao K, Ciocarlie M, Takayama L, Gossow D (2012) Strategies for human-in-the-loop robotic grasping. 2012 7th ACM/IEEE Int Conf Human-Robot Interact (HRI), 1–8.

  16. •• Rea DJ, Seo SH, Bruce N, Young JE. Movers, shakers, and those who stand still: visual attention-grabbing techniques in robot teleoperation. In: Human-Robot Interact. New York, USA: ACM/IEEE; 2017. p. 398–407. Demonstrates how knowledge of what distracts an operator can be used to design interfaces that improve task performance and reduce workload.


  17. Seo SH, Young JE, Irani P. Where are the robots? In-feed embedded techniques for visualizing robot team member locations. In: Robot Hum. Interact. Commun; 2017. p. 522–7.


  18. • Bartneck C, Belpaeme T, Eyssel F, Kanda T, Keijsers M, Sabanovic S. Human-robot interaction: an introduction. Cambridge University Press; 2020. A summary of the findings of social human-robot interaction.

  19. Gleeson B, Maclean K, Haddadi A, Croft E, Alcazar J. Gestures for industry: intuitive human-robot communication from human observation. In: Human-robot interact. Piscataway: IEEE Press; 2013. p. 349–56.


  20. Admoni H, Scassellati B. Social eye gaze in human-robot interaction: a review. J Human-Robot Interact. 2017;6:25.


  21. Ohshima N, Kimijima K, Yamato J, Mukawa N. A conversational robot with vocal and bodily fillers for recovering from awkward silence at turn-takings. In: Int. Work. Robot Hum. Interact. Commun. IEEE; 2015. p. 325–30.


  22. Seo SH, Griffin K, Young JE, Bunt A, Prentice S, Loureiro-Rodríguez V. Investigating People’s rapport building and hindering behaviors when working with a collaborative robot. Int J Soc Robot. 2018;10:147–61.


  23. Ammi M, Demulier V, Caillou S, Gaffary Y, Tsalamlal Y, Martin J-C, et al. Haptic human-robot affective interaction in a handshaking social protocol. In: Human-Robot Interact. New York: ACM Press; 2015. p. 263–70.

  24. Tsalamlal MY, Martin J-C, Ammi M, Tapus A, Amorim M-A. Affective handshake with a humanoid robot: how do participants perceive and combine its facial and haptic expressions? In: Affect. Comput. Intell. Interact. IEEE; 2015. p. 334–40.


  25. Brooks JA, Freeman JB. Neuroimaging of person perception: a social-visual interface. Neurosci Lett. 2019;693:40–3.


  26. Sharma M, Hildebrandt D, Newman G, Young JE, Eskicioglu R. Communicating affect via flight path: exploring use of the Laban effort system for designing affective locomotion paths. ACM/IEEE Int Conf Human-Robot Interact. 2013:293–300.

  27. Young JE, Xin M, Sharlin E (2007) Robot expressionism through cartooning. In: Human-Robot Interact. ACM Press, New York, p 309.

  28. • Ciocirlan S-D, Agrigoroaie R, Tapus A (2019) Human-robot team: effects of communication in analyzing trust. In: Robot Hum. Interact. Commun. IEEE, pp 1–7. An example of social techniques increasing trust between users and robots.

  29. Lee M, Forlizzi J, Kiesler S. Personalization in HRI: a longitudinal field experiment. In: Human-Robot Interact; 2012. p. 319–26.


  30. Riek LD, Paul PC, Robinson P. When my robot smiles at me: enabling human-robot rapport via real-time head gesture mimicry. J Multimodal User Interfaces. 2010;3:99–108.


  31. Bainbridge WA, Hart JW, Kim ES, Scassellati B. The benefits of interactions with physically present robots over video-displayed agents. Int J Soc Robot. 2011;3:41–52.


  32. Reeves B, Nass C (1996) How people treat computers, television, and new media like real people and places.

  33. Leshed G, Velden T, Rieger O, Kot B, Sengers P (2008) In-car gps navigation: engagement with and disengagement from the environment. Proc SIGCHI Conf hum factors Comput Syst (CHI ‘08) 1675–1684.

  34. • Lopatovska I, Rink K, Knight I, Raines K, Cosenza K, Williams H, et al. Talk to me: exploring user interactions with the Amazon Alexa. J Librariansh Inf Sci. 2019;51:984–97. Exploring how socially-enabled devices are interpreted and integrated into home-life.


  35. •• Rea DJ, Young JE. Backseat Teleoperator: affective feedback with on-screen agents to influence teleoperation. In: Human-Robot Interact; 2019. p. 19–28. A proof of concept of how social techniques can act as a robot state summary to impact operator emotions.

  36. Hart SG, Staveland LE. Development of NASA-TLX (task load index): results of empirical and theoretical research. In: Hum. Ment. Workload; 1988. p. 139–83.


  37. Butler EA, Egloff B, Wilhelm FH, Smith NC, Erickson EA, Gross JJ. The social consequences of expressive suppression. Emotion. 2003;3:48–67.


  38. •• Precht L, Keinath A, Krems JF. Effects of driving anger on driver behavior – results from naturalistic driving data. Transp Res Part F Traffic Psychol Behav. 2017;45:75–92. A link between emotion and performance for operating vehicles.


  39. Hart SG. NASA-Task Load Index (NASA-TLX): 20 years later. Hum Factors Ergon Soc Annu Meet. 2006;50:904–8.


  40. Chatterjee P. Drone pilots are quitting in record numbers: a combination of lower-class status in the military, overwork, and psychological trauma appears to be taking a mental toll on drone pilots. In: Mother Jones; 2015. https://www.motherjones.com/politics/2015/03/drone-pilots-are-quitting-record-numbers/.


  41. Erden MS. Emotional postures for the humanoid-robot Nao. Int J Soc Robot. 2013;5:441–56.


  42. Sakamoto D, Ono T (2006) Sociality of robots: do robots construct or collapse human relations? In: human-robot interact. ACM Press, New York, p 355.

  43. Geiskkovitch D, Seo S, Young JE. Autonomy, embodiment, and obedience to robots. In: Human-Robot Interact. Ext. Abstr. ACM; 2015. p. 235–6.


  44. Short E, Hart J, Vu M, Scassellati B (2010) No fair!! An interaction with a cheating robot. 2010 5th ACM/IEEE Int Conf human-robot interact 219–226.

  45. Vázquez M, Steinfeld A, Hudson SE, Forlizzi J (2014) Spatial and other social engagement cues in a child-robot interaction. In: Proc. 2014 ACM/IEEE Int. Conf. Human-robot interact. - HRI ‘14. Pp 391–398.

  46. Salem M, Eyssel F, Rohlfing K, Kopp S, Joublin F. To err is human(−like): effects of robot gesture on perceived anthropomorphism and likability. In: Int. J. Soc. Robot; 2013. p. 313–23.


  47. •• Holthaus P, Menon C, Amirabdollahian F. How a robot’s social credibility affects safety performance. In: Int. Conf. Soc. Robot; 2019. p. 1–10. An example of a robot's social behavior impacting a human's behavior.


  48. Graether E, Mueller F. Joggobot: a flying robot as jogging companion. Conf Hum Factors Comput Syst - Proc. 2012:1063–6.

  49. Gockley R, Forlizzi J, Simmons R. Natural person following behavior for social robots. Proc ACM/IEEE Int Conf Human-robot Interact. 2007:17–24.

  50. Kahn PH, Kanda T, Ishiguro H, Gill BT, Shen S, Gary HE, et al. Will people keep the secret of a humanoid robot? In: Human-Robot Interact; 2015. p. 173–80.

  51. Banh A, Rea DJ, Young JE, Sharlin E (2015) Inspector Baxter: the social aspects of integrating a robot as a quality inspector in an assembly line. Human-Agent Interact.


  52. Mota RCR, Rea DJ, Le Tran A, Young JE, Sharlin E, Sousa MC. Playing the ‘trust game’ with robots: social strategies and experiences. In: Robot Hum. Interact. Commun. IEEE; 2016. p. 519–24.


  53. •• Rea DJ, Young JE. It’s all in your head. In: Human-Robot Interact. New York: ACM Press; 2018. p. 32–40. Demonstrates how describing robots in certain ways can impact operator perceptions and operation behaviors.


  54. Schramm LT, Dufault D, Young JE. Warning: this robot is not what it seems! Exploring expectation discrepancy resulting from robot design. In: Companion human-robot interact. New York: ACM; 2020. p. 439–41.


  55. Young JE, Sung J, Voida A, Sharlin E, Igarashi T, Christensen HI, et al. Evaluating human-robot interaction. Int J Soc Robot. 2010;3:53–67.

  56. Chatterjee P. Drone pilots are quitting in record numbers: a combination of lower-class status in the military, overwork, and psychological trauma appears to be taking a mental toll on drone pilots. In: Mother Jones; 2015.


  57. Swartz L. Why people hate the paperclip: labels, appearance, behavior and social responses to user interface. Stanford: AGENTS; 2003.


  58. Whitworth B. Polite computing. Behav Inf Technol. 2005;24:353–63.


  59. Srinivasan R, Jovanis PP. Effect of selected in-vehicle route guidance systems on driver reaction times. Hum Factors J Hum Factors Ergon Soc. 1997;39:200–15.


  60. •• Rueben M, Bernieri FJ, Grimm CM, Smart WD. Framing effects on privacy concerns about a home telepresence robot. In: Human-Robot Interact: ACM; 2017. p. 435–44. Demonstrates how the description of a teleoperated robot impacts the social perception of that robot's behaviors.

  61. • Brandstetter J, Beckner C, Sandoval EB, Bartneck C. Persistent lexical entrainment in HRI. In: Human-Robot Interact: ACM; 2017. p. 63–72. A demonstration of how robot actions can subconsciously influence human behavior.

  62. • Mizumaru K, Satake S, Kanda T, Ono T (2019) Stop doing it! Approaching strategy for a robot to admonish pedestrians. In: human-robot interact. IEEE, pp 449–457. An example of a social robot behavior that can influence how people behave towards the robot.

  63. •• Rea DJ, Hanzaki MR, Bruce N, Young JE (2017) Tortoise and the hare robot: slow and steady almost wins the race, but finishes more safely. In: Robot Hum. Interact. Commun. IEEE, pp 1–6. Demonstrates how a teleoperated robot's capabilities are linked to performance, workload, and user experience.

  64. •• Seo SH, Young JE, Irani P. How are your robot friends doing? A design exploration of graphical techniques supporting awareness of robot team members in teleoperation. Int J Soc Robot. 2020. An approach exploring how virtual social representations of robot state can convey information similarly to traditional interface techniques.

  65. Wang J, Lewis M (2007) Human control for cooperating robot teams. In: Proc 2007 ACM/IEEE Conf Human-Robot Interact (HRI), pp 9–16.

  66. Kortenkamp D, Bonasso RP, Ryan D, Schreckenghost D. Traded control with autonomous robots as mixed initiative interaction. AAAI Tech Rep. 1997;04:89–94.


  67. • Seo SH, Geiskkovitch D, Nakane M, King C, Young JE (2015) Poor thing! Would you feel sorry for a simulated robot? A comparison of empathy toward a physical and a simulated robot. In: Human-Robot Interact, pp 125–132. An example of how understanding how operators process visual information can improve interfaces by lowering workload.

  68. Cambria E. Affective computing and sentiment analysis. IEEE Intell Syst. 2016;31:102–7.


  69. Drury JL, Scholtz J, Yanco HA (2003) Awareness in human-robot interactions. IEEE Int Conf Syst Man Cybern.


  70. Jia Y, Xi N, Liu S, Wang Y, Li X, Bi S. Quality of teleoperator adaptive control for telerobotic operations. Int J Robot Res. 2014;33:1765–81.


  71. Seo SH, Rea DJ, Wiebe J, Young JE (2017) Monocle: interactive detail-in-context using two pan-and-tilt cameras to improve Teleoperation effectiveness. RO-MAN.


  72. • Phillips E, Zhao X, Ullman D, Malle BF. What is human-like? Decomposing robots’ human-like appearance using the anthropomorphic roBOT (ABOT) database. In: Human-Robot Interact. New York: ACM; 2018. p. 105–13. A robot survey that finds how certain physical features are linked to anthropomorphism and social agency.


  73. Osawa H, Ohmura R, Imai M. Using attachable humanoid parts for realizing imaginary intention and body image. Int J Soc Robot. 2009;1:109–23.


  74. Sharma M, Hildebrandt D, Newman G, Young JE, Eskicioglu R (2013) Communicating affect via flight path. In: Hum. Robot Interact, pp 293–300.

  75. Young JE, Sharlin E, Igarashi T. Teaching robots style: designing and evaluating style-by-demonstration for interactive robotic locomotion. Human–Computer Interact. 2013;28:379–416.


  76. Tsui KM, Norton A, Brooks DJ, McCann E, Medvedev MS, Yanco HA. Design and development of two generations of semi-autonomous social telepresence robots. In: 2013 IEEE Conf. Technol. Pract. Robot Appl. IEEE; 2013. p. 1–6.


  77. Quigley M, Goodrich MA, Beard RW. Semi-autonomous human-UAV interfaces for fixed-wing mini-UAVs. Intell Robot Syst. 2004;3:2457–62.


  78. Bartneck C, Kanda T, Mubin O, Al Mahmud A. Does the design of a robot influence its animacy and perceived intelligence? Int J Soc Robot. 2009;1:195–204.


  79. Okamura AM. Methods for haptic feedback in teleoperated robot-assisted surgery. Ind Robot An Int J. 2004;31:499–508.


  80. Hacinecipoglu A, Konukseven EI, Koku AB (2013) Evaluation of haptic feedback cues on vehicle teleoperation performance in an obstacle avoidance scenario. 2013 World Haptics Conf WHC 2013.

  81. Marquardt N, Nacenta MA, Young JE, Carpendale S, Greenberg S, Sharlin E. The haptic tabletop Puck. In: interact. Tabletops surfaces - ITS ‘09. New York: ACM Press; 2009. p. 85.


  82. Guo C, Sharlin E. Exploring the use of tangible user interfaces for human-robot interaction. In: Hum. Factors Comput. Syst. New York: ACM Press; 2008. p. 121.


  83. • Klopfenstein LC, Delpriori S, Malatini S, Bogliolo A. The rise of bots: a survey of conversational interfaces, patterns, and paradigms. In: Des. Interact. Syst. New York: ACM; 2017. p. 555–65. A survey of social techniques and their use in chatbots.


  84. • Paiva A, Leite I, Boukricha H, Wachsmuth I. Empathy in virtual agents and robots. Interact Intell Syst. 2017;7:1–40. A survey of the importance and effects of empathy as a social skill in virtual and robotic agents.


  85. Klemmer SR, Hartmann B, Takayama L. How bodies matter. In: Proc. 6th ACM Conf. Des. Interact. Syst. - DIS ‘06. New York: ACM Press; 2006. p. 140.


  86. Picard RW, Vyzas E, Healey J. Toward machine emotional intelligence: analysis of affective physiological state. Trans Pattern Anal Mach Intell. 2001;23:1175–91.


  87. Picard RW, Fedor S, Ayzenberg Y. Multiple arousal theory and daily-life electrodermal activity asymmetry. Emot Rev. 2016;8:62–75.


  88. • Griol D, Molina JM, Callejas Z. Combining speech-based and linguistic classifiers to recognize emotion in user spoken utterances. Neurocomputing. 2019;(326–327):132–40. An example of modern biometrics used for affect recognition.

  89. • Li P, Liu H, Si Y, et al (2019) EEG based emotion recognition by combining functional connectivity network and local activations. IEEE Trans Biomed Eng 66:2869–2881. An example of modern biometrics being used for affect recognition.

  90. Vermun K, Senapaty M, Sankhla A, Patnaik P, Routray A. Gesture-based affective and cognitive states recognition using Kinect for effective feedback during e-learning. In: Technol. Educ. IEEE; 2013. p. 107–10.


  91. • Dafoulas G, Tsiakara A, Samuels-Clarke J, Maia CC, Neilson D, Ali AA. Investigating patterns of emotion and expressions using smart learning spaces. In: Inf. Commun. Syst. IEEE; 2019. p. 238–44. An example of how technology can influence social interaction and affect.

  92. Balaguer C, Giménez A, Jardón A, Correal R, Martínez S, Sabatini AM, et al. Proprio and teleoperation of a robotic system for disabled persons’ assistance in domestic environments. Springer Tracts Adv Robot. 2007;31:415–27.

  93. Hutt S, Mills C, White S, Donnelly PJ, D’Mello SK (2016) The eyes have it: gaze-based detection of mind wandering during learning with an intelligent tutoring system. Educ Data Min 86–93.

  94. Kanade T, Cohn JF, Tian Y. Comprehensive database for facial expression analysis. In: Autom. Face Gesture Recognit; 2000.


  95. Lucey P, Cohn JF, Kanade T, Saragih J, Ambadar Z, Matthews I. The extended Cohn-Kanade dataset (CK+): a complete expression dataset for action unit and emotion-specified expression. In: CVPR Work. Hum. Commun. Behav. Anal; 2010.



Author information


Corresponding author

Correspondence to Daniel J. Rea.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Human and Animal Rights and Informed Consent

This article does not contain any studies with human or animal subjects performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the Topical Collection on Service and Interactive Robotics


Cite this article

Rea, D.J., Seo, S.H. & Young, J.E. Social Robotics for Nonsocial Teleoperation: Leveraging Social Techniques to Impact Teleoperator Performance and Experience. Curr Robot Rep 1, 287–295 (2020). https://doi.org/10.1007/s43154-020-00020-7
