Towards Modeling Real-Time Trust in Asymmetric Human–Robot Collaborations

Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 114)


We are interested in enhancing the efficiency of human–robot collaborations, especially in "supervisor–worker" settings where autonomous robots operate under the supervision of a human operator. We believe that trust plays a critical role both in modeling the interactions within these teams and in streamlining their efficiency. We propose an operational formulation of human–robot trust on a short interaction time scale, tailored to a practical tele-robotics setting. We also report on a controlled user study that collected interaction data from participants collaborating with an autonomous robot on visual navigation tasks. Our analyses quantify key correlations between real-time human–robot trust assessments and diverse factors, including properties of failure events reflecting causal trust attribution, as well as strong influences from each user's personality. We further construct and optimize a predictive model of users' trust responses to discrete events, which both provides insight into this fundamental aspect of real-time human–machine interaction and has practical value for designing trust-aware robot agents.
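To make the idea of an event-driven trust model concrete, the sketch below shows one minimal way such a model could be structured: trust drops in response to discrete failure events (weighted by severity, echoing the causal trust attribution mentioned above) and slowly recovers toward a baseline during uneventful operation. This is an illustrative assumption, not the model constructed in the paper; all class, parameter, and value choices here are hypothetical.

```python
# Illustrative sketch only (not the paper's model): a simple event-driven
# trust estimator. Trust drops on robot failure events in proportion to
# their severity, and drifts back toward a baseline during nominal steps.
# All names and numeric values are assumptions chosen for illustration.

from dataclasses import dataclass


@dataclass
class TrustModel:
    trust: float = 0.8           # current trust estimate, kept in [0, 1]
    baseline: float = 0.8        # level that trust recovers toward
    recovery_rate: float = 0.02  # fraction of the gap closed per step
    failure_weight: float = 0.3  # trust lost per unit of failure severity

    def step(self, failure_severity: float = 0.0) -> float:
        """Advance one interaction step and return the updated trust.

        failure_severity: 0.0 for nominal operation; larger values model
        more serious robot failures (e.g. a navigation error).
        """
        if failure_severity > 0.0:
            # A discrete failure event reduces trust by severity.
            self.trust -= self.failure_weight * failure_severity
        else:
            # Uneventful operation lets trust recover toward baseline.
            self.trust += self.recovery_rate * (self.baseline - self.trust)
        # Clamp to the valid range.
        self.trust = min(1.0, max(0.0, self.trust))
        return self.trust
```

A model of this shape could be fit to real-time trust assessments by tuning `recovery_rate` and `failure_weight` against observed user responses, though the paper's actual model and features are not reproduced here.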


Keywords: Human–robot collaboration · Real-time trust · Combined assessment · Combined response · Boundary tracking



We would like to acknowledge the NSERC Canadian Field Robotics Network (NCFRN) for its funding support. We would also like to thank all of the participants who contributed to our user study.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

School of Computer Science, McGill University, Montreal, Canada
