Abstract
Chatbots are increasingly offered as an alternative source of customer service. For users to take up chatbots for this purpose, it is important that they trust chatbots to provide the required support. However, there is currently a lack of knowledge regarding the factors that affect users’ trust in chatbots. We present an interview study addressing this knowledge gap. Thirteen users of chatbots for customer service were interviewed regarding their experience with the chatbots and the factors affecting their trust in them. Users’ trust in chatbots for customer service was found to be affected (a) by factors concerning the specific chatbot, specifically the quality of its interpretation of requests and advice, its human-likeness, its self-presentation, and its professional appearance, but also (b) by factors concerning the service context, specifically the brand of the chatbot host, the perceived security and privacy in the chatbot, as well as general risk perceptions concerning the topic of the request. Implications for the design and development of chatbots and directions for future work are suggested.
Acknowledgement
This work was supported by the Research Council of Norway grant no. 270940.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Følstad, A., Nordheim, C.B., Bjørkli, C.A. (2018). What Makes Users Trust a Chatbot for Customer Service? An Exploratory Interview Study. In: Bodrunova, S. (ed.) Internet Science. INSCI 2018. Lecture Notes in Computer Science, vol. 11193. Springer, Cham. https://doi.org/10.1007/978-3-030-01437-7_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-01436-0
Online ISBN: 978-3-030-01437-7