
A Comparison of Trust Measures in Human–Robot Interaction Scenarios

  • Theresa T. Kessler
  • Cintya Larios
  • Tiffani Walker
  • Valarie Yerdon
  • P. A. Hancock
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 499)

Abstract

When studying Human–Robot Interaction (HRI), we often employ measures of trust. Trust is essential in HRI, as inappropriate levels of trust result in the misuse, abuse, or disuse of a robot. Some measures of trust specifically target automation, while others specifically target HRI. Although robots are a type of automation, it is unclear which of the broader factors that define automation are shared by robots. Nevertheless, measurements of trust in automation and trust in robots should theoretically yield similar results. We examined an HRI scenario using (1) an automation trust scale and (2) a robotic trust scale. The two scales yielded conflicting results, suggesting that they may examine separate constructs and are therefore not interchangeable. This discord shows that further evaluation is required to identify the contexts in which each scale is appropriate, whether for automation or for robotic operations.
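
As a purely illustrative sketch (not the authors' analysis pipeline), the comparison the abstract describes can be pictured as scoring each participant on both instruments and then checking whether the composite scores agree. The item counts, response format, and data below are hypothetical placeholders.

```python
# Hypothetical sketch: comparing per-participant composite scores from two
# trust scales. Item counts, response scale, and data are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 30

# Simulated 7-point Likert responses: a 12-item automation trust scale and a
# 14-item robot trust scale. Real scores would come from the experiment.
automation_items = rng.integers(1, 8, size=(n_participants, 12))
robot_items = rng.integers(1, 8, size=(n_participants, 14))

# Composite score per participant: mean across each scale's items.
automation_score = automation_items.mean(axis=1)
robot_score = robot_items.mean(axis=1)

# If both scales tapped the same underlying construct, the composite scores
# should correlate strongly across participants.
r, p = stats.pearsonr(automation_score, robot_score)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# A weak or inconsistent relationship between the two instruments is one way
# the "conflicting results" described in the abstract could manifest.
```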

Keywords

Human–Robot Interaction · Trust · Trust scale · Trust measures

Notes

Acknowledgments

The research reported in this document was performed in connection with Contract No. W911NF-10-2-0016 with the U.S. Army Research Laboratory, under UCF, P. A. Hancock, Principal Investigator. The views and conclusions contained in this document are those of the authors and should not be interpreted as presenting the official policies or position, either expressed or implied, of the U.S. Army Research Laboratory or the U.S. government unless so designated by other authorized documents. Citation of manufacturer’s or trade names does not constitute an official endorsement or approval of the use thereof. The U.S. government is authorized to reproduce and distribute reprints for government purposes notwithstanding any copyright notation herein.


Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Theresa T. Kessler¹
  • Cintya Larios¹
  • Tiffani Walker¹
  • Valarie Yerdon¹
  • P. A. Hancock¹

  1. University of Central Florida, Orlando, USA