‘If You Agree with Me, Do I Trust You?’: An Examination of Human-Agent Trust from a Psychological Perspective

  • Conference paper
  • Intelligent Systems and Applications (IntelliSys 2019)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 1038)

Abstract

Applications of automated agent systems in daily life have changed the role of human operators from controller to teammate. However, this ‘teammate’ relationship between humans and agents raises an important but challenging question: how do humans develop trust when interacting with automated agents that are human-like? In this study, a two-phase online experiment was conducted to examine the effects of attitudinal congruence and individual personality on users’ trust in an anthropomorphic agent. Our results suggest that the degree of an agent’s response congruence had no significant impact on users’ trust in the agent. With respect to individual personality, we found one trait that significantly affects the formation of human-agent trust. Although our data do not support an effect of attitudinal congruence on human-agent trust formation, this study provides empirical evidence that can benefit future research in this field. More importantly, we address the unusual challenges in our experimental design and what our null results imply about the formation of human-agent trust. This study not only sheds light on trust formation in human-agent collaboration but also offers insight for the future design of automated agent systems.


Author information


Correspondence to Masooda Bashir.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Huang, HY., Twidale, M., Bashir, M. (2020). ‘If You Agree with Me, Do I Trust You?’: An Examination of Human-Agent Trust from a Psychological Perspective. In: Bi, Y., Bhatia, R., Kapoor, S. (eds) Intelligent Systems and Applications. IntelliSys 2019. Advances in Intelligent Systems and Computing, vol 1038. Springer, Cham. https://doi.org/10.1007/978-3-030-29513-4_73
