
Vertrauen in automatisierte Kundendialoge (Trust in Automated Customer Dialogues)

Chapter in: Kundendialog-Management

Abstract

Trust is a key aspect of customer relationships. If customers do not trust a company, a brand, or a product, they will choose a more trustworthy alternative. As digital transformation advances, companies must make their processes more efficient, and the use of automated customer dialogues is an important element of this. This chapter explains how trust can emerge in automated customer dialogues. It begins by describing how trust can be defined and which aspects are relevant to the topic. In sales conversations, the perception of the salesperson plays an important role and influences the decision for or against a purchase. This concept can be transferred to digital customer dialogues: trust also plays an important role in conversations with a chatbot (automated customer dialogues). The chapter discusses various success factors and closes with tips for practical implementation.



Author information

Correspondence to Anna V. Rozumowski.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Fachmedien Wiesbaden GmbH, part of Springer Nature

About this chapter


Cite this chapter

Rozumowski, A.V., Peter, M.K. (2024). Vertrauen in automatisierte Kundendialoge. In: Hafner, N., Hundertmark, S. (eds) Kundendialog-Management. Springer Gabler, Wiesbaden. https://doi.org/10.1007/978-3-658-42851-8_11


  • DOI: https://doi.org/10.1007/978-3-658-42851-8_11

  • Publisher Name: Springer Gabler, Wiesbaden

  • Print ISBN: 978-3-658-42850-1

  • Online ISBN: 978-3-658-42851-8

  • eBook Packages: Business and Economics (German Language)
