Gender Bias in Chatbot Design

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11970)


A recent UNESCO report reveals that the most popular voice-based conversational agents are designed to be female, and it outlines the potentially harmful effects this can have on society. However, the report focuses primarily on voice-based conversational agents; its analysis did not include chatbots (i.e., text-based conversational agents). Since chatbots can also be gendered in their design, we used an automated gender analysis approach to investigate three gender-specific cues in the design of 1,375 chatbots listed on a public chatbot platform. We leveraged two gender APIs to identify the gender of each chatbot’s name, a face recognition API to identify the gender of its avatar, and a text mining approach to analyze gender-specific pronouns in its description. Our results suggest that gender-specific cues are commonly used in the design of chatbots and that most chatbots are – explicitly or implicitly – designed to convey a specific gender. More specifically, most chatbots have female names, female-looking avatars, and are described as female. This is particularly evident in three application domains (i.e., branded conversations, customer service, and sales). We therefore find evidence of a tendency to prefer one gender (i.e., female) over the other (i.e., male) and argue that there is a gender bias in the design of chatbots in the wild. Based on these findings, we formulate propositions as a starting point for future discussion and research on mitigating gender bias in the design of chatbots.
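The pronoun-based part of such an analysis can be sketched as follows. This is an illustrative reconstruction, not the authors' actual code: the pronoun lists and the `classify_description` helper are assumptions made for the example, and a real pipeline would combine this cue with the name- and avatar-based classifications.

```python
import re

# Hypothetical gender-specific pronoun lists (assumption for this sketch)
FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}
MALE_PRONOUNS = {"he", "him", "his", "himself"}

def classify_description(description: str) -> str:
    """Classify a chatbot description as 'female', 'male', or 'neutral'
    by counting gender-specific pronouns in the text."""
    tokens = re.findall(r"[a-z']+", description.lower())
    female = sum(t in FEMALE_PRONOUNS for t in tokens)
    male = sum(t in MALE_PRONOUNS for t in tokens)
    if female > male:
        return "female"
    if male > female:
        return "male"
    return "neutral"
```

For example, a description such as "She is a virtual assistant that answers questions about her company" would be classified as female, while "This bot answers FAQs" carries no pronoun cue and would be classified as neutral.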


Keywords: Chatbot · Gender-specific cue · Gender bias · Conversational agent


References

  1. ACM: Code of Ethics and Professional Conduct (2019). Accessed 26 July 2019
  2. de Angeli, A., Brahnam, S.: Sex stereotypes and conversational agents (2006)
  3. Araujo, T.: Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Comput. Hum. Behav. 85, 183–189 (2018)
  4. Artz, N., Munger, J., Purdy, W.: Gender issues in advertising language. Women Lang. 22(2), 20 (1999)
  5. Beldad, A., Hegner, S., Hoppen, J.: The effect of virtual sales agent (VSA) gender – product gender congruence on product advice credibility, trust in VSA and online vendor, and purchase intention. Comput. Hum. Behav. 60, 62–72 (2016)
  6. Bhagyashree, R.: A chatbot toolkit for developers: design, develop, and manage conversational UI (2019). Accessed 22 July 2019
  7. Bickmore, T.W., Picard, R.W.: Establishing and maintaining long-term human-computer relationships. ACM Trans. Comput.-Hum. Interact. 12(2), 293–327 (2005)
  8. Bohnet, I.: What Works. Harvard University Press (2016)
  9. Brahnam, S., de Angeli, A.: Gender affordances of conversational agents. Interact. Comput. 24(3), 139–153 (2012)
  10. Brandtzaeg, P.B., Følstad, A.: Chatbots: changing user needs and motivations. Interactions 25(5), 38–43 (2018)
  11. Burnett, M., et al.: GenderMag: a method for evaluating software’s gender inclusiveness. Interact. Comput. 28(6), 760–787 (2016)
  12. Council of Europe: Discrimination, artificial intelligence, and algorithmic decision-making (2018)
  13. Cowell, A.J., Stanney, K.M.: Manipulation of non-verbal interaction style and demographic embodiment to increase anthropomorphic computer character credibility. Int. J. Hum.-Comput. Stud. 62(2), 281–306 (2005)
  14. Dale, R.: The return of the chatbots. Nat. Lang. Eng. 22(5), 811–817 (2016)
  15. EU: Ethics Guidelines for Trustworthy AI (2019). Accessed 30 July 2019
  16. Feine, J., Gnewuch, U., Morana, S., Maedche, A.: A taxonomy of social cues for conversational agents. Int. J. Hum.-Comput. Stud. 132, 138–161 (2019)
  17. Feine, J., Morana, S., Maedche, A.: Designing a chatbot social cue configuration system. In: Proceedings of the 40th International Conference on Information Systems (ICIS). AISel, Munich (2019)
  18. Feine, J., Morana, S., Maedche, A.: Leveraging machine-executable descriptive knowledge in design science research – the case of designing socially-adaptive chatbots. In: Tulu, B., Djamasbi, S., Leroy, G. (eds.) DESRIST 2019. LNCS, vol. 11491, pp. 76–91. Springer, Cham (2019)
  19. Følstad, A., Brandtzæg, P.B.: Chatbots and the new world of HCI. Interactions 24(4), 38–42 (2017)
  20. Følstad, A., Brandtzaeg, P.B., Feltwell, T., Law, E.L.-C., Tscheligi, M., Luger, E.A.: SIG: chatbots for social good. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, SIG06:1–SIG06:4. ACM, New York (2018)
  21. Følstad, A., Skjuve, M., Brandtzaeg, P.: Different chatbots for different purposes: towards a typology of chatbots to understand interaction design, pp. 145–156 (2019)
  22. Gnewuch, U., Morana, S., Maedche, A.: Towards designing cooperative and social conversational agents for customer service. In: Proceedings of the 38th International Conference on Information Systems (ICIS). AISel, Seoul (2017)
  23. Hayashi, Y.: Lexical network analysis on an online explanation task: effects of affect and embodiment of a pedagogical agent. IEICE Trans. Inf. Syst. 99(6), 1455–1461 (2016)
  24. Hone, K.: Empathic agents to reduce user frustration: the effects of varying agent characteristics. Interact. Comput. 18(2), 227–245 (2006)
  25. Johannsen, F., Leist, S., Konadl, D., Basche, M., de Hesselle, B.: Comparison of commercial chatbot solutions for supporting customer interaction. In: Proceedings of the 26th European Conference on Information Systems (ECIS), Portsmouth, United Kingdom, 23–28 June 2018
  26. Kraemer, N.C., Karacora, B., Lucas, G., Dehghani, M., Ruether, G., Gratch, J.: Closing the gender gap in STEM with friendly male instructors? On the effects of rapport behavior and gender of a virtual agent in an instructional interaction. Comput. Educ. 99, 1–13 (2016)
  27. Louwerse, M.M., Graesser, A.C., Lu, S.L., Mitchell, H.H.: Social cues in animated conversational agents. Appl. Cogn. Psychol. 19(6), 693–704 (2005)
  28. McDonnell, M., Baxter, D.: Chatbots and gender stereotyping. Interact. Comput. 31(2), 116–121 (2019)
  29. McTear, M.F.: The rise of the conversational interface: a new kid on the block? In: Quesada, J.F., Martín Mateos, F.J., López-Soto, T. (eds.) FETLT 2016. LNCS (LNAI), vol. 10341, pp. 38–49. Springer, Cham (2017)
  30. Microsoft: Face recognition API (2019). Accessed 22 July 2019
  31. Myers, M.D., Venable, J.R.: A set of ethical principles for design science research in information systems. Inf. Manag. 51(6), 801–809 (2014)
  32. Nass, C., Moon, Y.: Machines and mindlessness: social responses to computers. J. Soc. Issues 56(1), 81–103 (2000)
  33. Nass, C., Steuer, J., Tauber, E.R.: Computers are social actors. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 72–78. ACM, New York (1994)
  34. Nass, C., Moon, Y., Green, N.: Are machines gender neutral? Gender-stereotypic responses to computers with voices. J. Appl. Soc. Psychol. 27(10), 864–876 (1997)
  35. Niculescu, A., Hofs, D., van Dijk, B., Nijholt, A.: How the agent’s gender influence users’ evaluation of a QA system. In: International Conference on User Science and Engineering (i-USEr) (2010)
  36. npmjs: Gender-detection (2019). Accessed 22 July 2019
  37. Nunamaker, J.E., Derrick, D.C., Elkins, A.C., Burgoon, J.K., Patton, M.W.: Embodied conversational agent-based kiosk for automated interviewing. J. Manag. Inf. Syst. 28(1), 17–48 (2011)
  38.
  39. United Nations: Sustainable Development Goals. Goal 5: gender equality (2015). Accessed 30 Oct 2019
  40. Vala, M., Blanco, G., Paiva, A.: Providing gender to embodied conversational agents. In: Vilhjálmsson, H.H., Kopp, S., Marsella, S., Thórisson, K.R. (eds.) IVA 2011. LNCS (LNAI), vol. 6895, pp. 148–154. Springer, Heidelberg (2011)
  41. Verhagen, T., van Nes, J., Feldberg, F., van Dolen, W.: Virtual customer service agents: using social presence and personalization to shape online service encounters. J. Comput.-Mediat. Commun. 19(3), 529–545 (2014)
  42. Weizenbaum, J.: ELIZA – a computer program for the study of natural language communication between man and machine. Commun. ACM 9(1), 36–45 (1966)
  43. West, M., Kraut, R., Chew, H.E.: I’d blush if I could: closing gender divides in digital skills through education (2019)

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Institute of Information Systems and Marketing (IISM), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany