Gender Bias in Chatbot Design

  • Jasper Feine
  • Ulrich Gnewuch
  • Stefan Morana
  • Alexander Maedche
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11970)

Abstract

A recent UNESCO report reveals that the most popular voice-based conversational agents are designed to be female, and it outlines the potentially harmful effects this can have on society. However, the report focuses primarily on voice-based conversational agents; its analysis did not include chatbots (i.e., text-based conversational agents). Since chatbots can also be gendered in their design, we used an automated gender analysis approach to investigate three gender-specific cues in the design of 1,375 chatbots listed on the platform chatbots.org. We leveraged two gender APIs to identify the gender of each chatbot’s name, a face recognition API to identify the gender of its avatar, and a text mining approach to analyze gender-specific pronouns in its description. Our results suggest that gender-specific cues are commonly used in the design of chatbots and that most chatbots are, explicitly or implicitly, designed to convey a specific gender: most have female names, female-looking avatars, and are described as female. This is particularly evident in three application domains (i.e., branded conversations, customer service, and sales). We therefore find evidence of a tendency to prefer one gender (i.e., female) over the other (i.e., male), and we argue that there is a gender bias in the design of chatbots in the wild. Based on these findings, we formulate propositions as a starting point for future discussion and research on mitigating gender bias in the design of chatbots.
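As an illustration of the third cue, the following minimal Python sketch shows how gender-specific pronouns in a chatbot description could be counted and turned into a gender label. This is a sketch under assumptions, not the paper's exact method: the pronoun lists, the majority rule, and the function name pronoun_gender are illustrative choices, and the name and avatar cues would come from the gender and face recognition APIs mentioned above rather than from this code.

    import re
    from collections import Counter

    # Pronoun sets treated as gender-specific cues. These lists and the
    # majority rule below are assumptions for illustration; the abstract
    # does not specify the study's exact text mining rules.
    FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}
    MALE_PRONOUNS = {"he", "him", "his", "himself"}

    def pronoun_gender(description: str) -> str:
        """Label a chatbot description 'female', 'male', or 'neutral'
        depending on which pronoun set occurs more often."""
        tokens = Counter(re.findall(r"[a-z']+", description.lower()))
        female = sum(tokens[p] for p in FEMALE_PRONOUNS)
        male = sum(tokens[p] for p in MALE_PRONOUNS)
        if female > male:
            return "female"
        if male > female:
            return "male"
        return "neutral"  # no pronoun cues, or a tie

    if __name__ == "__main__":
        demo = ("Anna is a virtual assistant. She answers questions about "
                "your order, and her replies are instant.")
        print(pronoun_gender(demo))  # -> female

Applied to all 1,375 descriptions, such a classifier would yield one of the three per-chatbot cues analyzed in the study.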

Keywords

Chatbot · Gender-specific cue · Gender bias · Conversational agent

References

  1. ACM: Code of Ethics and Professional Conduct. https://www.acm.org/code-of-ethics (2019). Accessed 26 July 2019
  2. de Angeli, A., Brahnam, S.: Sex Stereotypes and Conversational Agents (2006)
  3. Araujo, T.: Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Comput. Hum. Behav. 85, 183–189 (2018). https://doi.org/10.1016/j.chb.2018.03.051
  4. Artz, N., Munger, J., Purdy, W.: Gender issues in advertising language. Women Lang. 22(2), 20 (1999)
  5. Beldad, A., Hegner, S., Hoppen, J.: The effect of virtual sales agent (VSA) gender – product gender congruence on product advice credibility, trust in VSA and online vendor, and purchase intention. Comput. Hum. Behav. 60, 62–72 (2016). https://doi.org/10.1016/j.chb.2016.02.046
  6. Bhagyashree, R.: A chatbot toolkit for developers: design, develop, and manage conversational UI (2019). https://hub.packtpub.com/chatbot-toolkit-developers-design-develop-manage-conversational-ui/. Accessed 22 July 2019
  7. Bickmore, T.W., Picard, R.W.: Establishing and maintaining long-term human-computer relationships. ACM Trans. Comput.-Hum. Interact. 12(2), 293–327 (2005). https://doi.org/10.1145/1067860.1067867
  8. Bohnet, I.: What Works. Harvard University Press (2016)
  9. Brahnam, S., de Angeli, A.: Gender affordances of conversational agents. Interact. Comput. 24(3), 139–153 (2012). https://doi.org/10.1016/j.intcom.2012.05.001
  10. Brandtzaeg, P.B., Følstad, A.: Chatbots: changing user needs and motivations. Interactions 25(5), 38–43 (2018). https://doi.org/10.1145/3236669
  11. Burnett, M., et al.: GenderMag: a method for evaluating software’s gender inclusiveness. Interact. Comput. 28(6), 760–787 (2016). https://doi.org/10.1093/iwc/iwv046
  12. Council of Europe: Discrimination, artificial intelligence, and algorithmic decision-making (2018). https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d73
  13. Cowell, A.J., Stanney, K.M.: Manipulation of non-verbal interaction style and demographic embodiment to increase anthropomorphic computer character credibility. Int. J. Hum.-Comput. Stud. 62(2), 281–306 (2005). https://doi.org/10.1016/j.ijhcs.2004.11.008
  14. Dale, R.: The return of the chatbots. Nat. Lang. Eng. 22(5), 811–817 (2016). https://doi.org/10.1017/S1351324916000243
  15. EU: Ethics Guidelines for Trustworthy AI (2019). https://ec.europa.eu/futurium/en/ai-alliance-consultation. Accessed 30 July 2019
  16. Feine, J., Gnewuch, U., Morana, S., Maedche, A.: A taxonomy of social cues for conversational agents. Int. J. Hum.-Comput. Stud. 132, 138–161 (2019). https://doi.org/10.1016/j.ijhcs.2019.07.009
  17. Feine, J., Morana, S., Maedche, A.: Designing a chatbot social cue configuration system. In: Proceedings of the 40th International Conference on Information Systems (ICIS). AISeL, Munich (2019)
  18. Feine, J., Morana, S., Maedche, A.: Leveraging machine-executable descriptive knowledge in design science research – the case of designing socially-adaptive chatbots. In: Tulu, B., Djamasbi, S., Leroy, G. (eds.) DESRIST 2019. LNCS, vol. 11491, pp. 76–91. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-19504-5_6
  19. Følstad, A., Brandtzæg, P.B.: Chatbots and the new world of HCI. Interactions 24(4), 38–42 (2017). https://doi.org/10.1145/3085558
  20. Følstad, A., Brandtzaeg, P.B., Feltwell, T., Law, E.L.-C., Tscheligi, M., Luger, E.A.: SIG: chatbots for social good. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, SIG06:1–SIG06:4. ACM, New York (2018). https://doi.org/10.1145/3170427.3185372
  21. Følstad, A., Skjuve, M., Brandtzaeg, P.: Different chatbots for different purposes: towards a typology of chatbots to understand interaction design, pp. 145–156 (2019)
  22. Gnewuch, U., Morana, S., Maedche, A.: Towards designing cooperative and social conversational agents for customer service. In: Proceedings of the 38th International Conference on Information Systems (ICIS). AISeL, Seoul (2017)
  23. Hayashi, Y.: Lexical network analysis on an online explanation task: effects of affect and embodiment of a pedagogical agent. IEICE Trans. Inf. Syst. 99(6), 1455–1461 (2016). https://doi.org/10.1587/transinf.2015CBP0005
  24. Hone, K.: Empathic agents to reduce user frustration: the effects of varying agent characteristics. Interact. Comput. 18(2), 227–245 (2006). https://doi.org/10.1016/j.intcom.2005.05.003
  25. Johannsen, F., Leist, S., Konadl, D., Basche, M., de Hesselle, B.: Comparison of commercial chatbot solutions for supporting customer interaction. In: Proceedings of the 26th European Conference on Information Systems (ECIS), Portsmouth, United Kingdom, 23–28 June 2018
  26. Kraemer, N.C., Karacora, B., Lucas, G., Dehghani, M., Ruether, G., Gratch, J.: Closing the gender gap in STEM with friendly male instructors? On the effects of rapport behavior and gender of a virtual agent in an instructional interaction. Comput. Educ. 99, 1–13 (2016). https://doi.org/10.1016/j.compedu.2016.04.002
  27. Louwerse, M.M., Graesser, A.C., Lu, S.L., Mitchell, H.H.: Social cues in animated conversational agents. Appl. Cogn. Psychol. 19(6), 693–704 (2005). https://doi.org/10.1002/acp.1117
  28. McDonnell, M., Baxter, D.: Chatbots and gender stereotyping. Interact. Comput. 31(2), 116–121 (2019). https://doi.org/10.1093/iwc/iwz007
  29. McTear, M.F.: The rise of the conversational interface: a new kid on the block? In: Quesada, J.F., Martín Mateos, F.J., López-Soto, T. (eds.) FETLT 2016. LNCS (LNAI), vol. 10341, pp. 38–49. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-69365-1_3
  30. Microsoft: Face recognition API (2019). https://azure.microsoft.com/en-us/services/cognitive-services/face/. Accessed 22 July 2019
  31. Myers, M.D., Venable, J.R.: A set of ethical principles for design science research in information systems. Inf. Manag. 51(6), 801–809 (2014). https://doi.org/10.1016/j.im.2014.01.002
  32. Nass, C., Moon, Y.: Machines and mindlessness: social responses to computers. J. Soc. Issues 56(1), 81–103 (2000). https://doi.org/10.1111/0022-4537.00153
  33. Nass, C., Steuer, J., Tauber, E.R.: Computers are social actors. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 72–78. ACM, New York (1994). https://doi.org/10.1145/191666.191703
  34. Nass, C., Moon, Y., Green, N.: Are machines gender neutral? Gender-stereotypic responses to computers with voices. J. Appl. Soc. Psychol. 27(10), 864–876 (1997). https://doi.org/10.1111/j.1559-1816.1997.tb00275.x
  35. Niculescu, A., Hofs, D., van Dijk, B., Nijholt, A.: How the agent’s gender influence users’ evaluation of a QA system. In: International Conference on User Science and Engineering (i-USEr) (2010)
  36. npmjs: Gender-detection (2019). https://www.npmjs.com/package/gender-detection. Accessed 22 July 2019
  37. Nunamaker, J.E., Derrick, D.C., Elkins, A.C., Burgoon, J.K., Patton, M.W.: Embodied conversational agent-based kiosk for automated interviewing. J. Manag. Inf. Syst. 28(1), 17–48 (2011). https://doi.org/10.2753/mis0742-1222280102
  38.
  39. United Nations: Sustainable development goals. Goal 5: gender equality (2015). https://www.sdgfund.org/goal-5-gender-equality. Accessed 30 Oct 2019
  40. Vala, M., Blanco, G., Paiva, A.: Providing gender to embodied conversational agents. In: Vilhjálmsson, H.H., Kopp, S., Marsella, S., Thórisson, K.R. (eds.) IVA 2011. LNCS (LNAI), vol. 6895, pp. 148–154. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-23974-8_16
  41. Verhagen, T., van Nes, J., Feldberg, F., van Dolen, W.: Virtual customer service agents: using social presence and personalization to shape online service encounters. J. Comput.-Mediat. Commun. 19(3), 529–545 (2014). https://doi.org/10.1111/jcc4.12066
  42. Weizenbaum, J.: ELIZA – a computer program for the study of natural language communication between man and machine. Commun. ACM 9(1), 36–45 (1966)
  43. West, M., Kraut, R., Chew, H.E.: I’d blush if I could: closing gender divides in digital skills through education (2019). https://unesdoc.unesco.org/ark:/48223/pf0000367416

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Jasper Feine (1) (corresponding author)
  • Ulrich Gnewuch (1)
  • Stefan Morana (1)
  • Alexander Maedche (1)

  1. Institute of Information Systems and Marketing (IISM), Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
