
What Makes Users Trust a Chatbot for Customer Service? An Exploratory Interview Study

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11193)

Conference series: Internet Science (INSCI 2018)

Abstract

Chatbots are increasingly offered as an alternative source of customer service. For users to take up chatbots for this purpose, it is important that they trust the chatbots to provide the required support. However, there is currently a lack of knowledge regarding the factors that affect users’ trust in chatbots. We present an interview study addressing this knowledge gap. Thirteen users of chatbots for customer service were interviewed regarding their experience with the chatbots and the factors affecting their trust in them. Users’ trust in chatbots for customer service was found to be affected (a) by factors concerning the specific chatbot, specifically the quality of its interpretation of requests and its advice, its human-likeness, its self-presentation, and its professional appearance, but also (b) by factors concerning the service context, specifically the brand of the chatbot host, the perceived security and privacy in the chatbot, as well as general risk perceptions concerning the topic of the request. Implications for the design and development of chatbots and directions for future work are suggested.



Acknowledgement

This work was supported by the Research Council of Norway grant no. 270940.

Author information

Corresponding author

Correspondence to Asbjørn Følstad.



Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Følstad, A., Nordheim, C.B., Bjørkli, C.A. (2018). What Makes Users Trust a Chatbot for Customer Service? An Exploratory Interview Study. In: Bodrunova, S. (eds) Internet Science. INSCI 2018. Lecture Notes in Computer Science, vol. 11193. Springer, Cham. https://doi.org/10.1007/978-3-030-01437-7_16

  • DOI: https://doi.org/10.1007/978-3-030-01437-7_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-01436-0

  • Online ISBN: 978-3-030-01437-7

