Privacy Concerns in Chatbot Interactions

  • Carolin Ischen
  • Theo Araujo
  • Hilde Voorveld
  • Guda van Noort
  • Edith Smit
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11970)


Chatbots are increasingly used in commercial contexts to make product- or service-related recommendations. In doing so, they collect personal information from users, much like other online services. While privacy concerns in an online (website) context have been widely studied, research in the context of chatbot interactions is lacking. This study investigates the extent to which chatbots with human-like cues influence perceptions of anthropomorphism (i.e., the attribution of human-like characteristics), privacy concerns, and consequently information disclosure, attitudes, and recommendation adherence. Findings show that, compared to a machine-like chatbot, a human-like chatbot leads to more information disclosure and recommendation adherence, mediated by higher perceived anthropomorphism and subsequently lower privacy concerns. This result does not hold in comparison to a website: the human-like chatbot and the website were perceived as equally high in anthropomorphism. The results show the importance of both mediating concepts for attitudinal and behavioral outcomes when interacting with chatbots.


Keywords: Chatbots · Anthropomorphism · Privacy concerns



This study was funded by the Research Priority Area Communication and its Digital Communication Methods Lab at the University of Amsterdam.



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Carolin Ischen (corresponding author)
  • Theo Araujo
  • Hilde Voorveld
  • Guda van Noort
  • Edith Smit

  All authors: ASCoR, University of Amsterdam, Amsterdam, The Netherlands
