How are Your Robot Friends Doing? A Design Exploration of Graphical Techniques Supporting Awareness of Robot Team Members in Teleoperation


While teleoperated robots continue to proliferate in domains such as search and rescue, field exploration, and military operations, human error remains a primary cause of accidents and mistakes. One challenge is that teleoperating a remote robot is cognitively taxing: the operator needs to understand the robot's state and monitor all of its sensor data. In a multi-robot team, an operator additionally needs to monitor the other robots' progress, states, notifications, errors, and so on to maintain team cohesion. We conducted a design exploration of novel graphical representations of robot team-member state, to support a person controlling one robot in maintaining awareness of the other robots on the team. Through a series of evaluations, we examined several design parameters (text, icon, facial expression, use of color, animation, and number of team robots), resulting in a set of guidelines for graphically representing team-robot states in remote team teleoperation.





Author information



Corresponding author

Correspondence to Stela H. Seo.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Authors retain copyright and grant the International Journal of Social Robotics right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.


Appendix: State Inquiry Questionnaire (All Text Options): One Team Robot/Text Representation




Appendix: State Inquiry Questionnaire (Matching Options): One Team Robot/Icon Representation


Appendix: State Inquiry Questionnaire (Matching Options): Two Team Robots/Icon Representation


Appendix: State Inquiry Questionnaire (Matching Options): One Team Robot/Emoji Representation


Appendix: State Inquiry Questionnaire (Matching Options): Two Team Robots/Emoji Representation


Appendix: State Inquiry Questionnaire (Simpler Version): Left Team Robot/Text Representation


Appendix: State Inquiry Questionnaire (Simpler Version): Left Team Robot/Icon Representation


Appendix: State Inquiry Questionnaire (Simpler Version): Left Team Robot/Emoji Representation



About this article


Cite this article

Seo, S.H., Young, J.E. & Irani, P. How are Your Robot Friends Doing? A Design Exploration of Graphical Techniques Supporting Awareness of Robot Team Members in Teleoperation. Int J Soc Robotics 13, 725–749 (2021).



  • Teleoperation
  • Robot state representations
  • Multi-robot monitoring
  • Team robot states
  • Interface design