Introduction

Chatbots are “artificial intelligence (AI) software that can simulate a conversation (or a chat) with a user in natural language through messaging applications, websites, mobile apps or through the telephone”.Footnote 1 In Africa, the development and use of chatbots has grown significantly, and their adoption is becoming widespread among data and service providers, who use them to meet customer demands amid an expanding customer base. This reflects how more African businesses are turning to AI to power their products and services, a growth linked to the explosion of data and of data collection and processing capabilities on the continent.

The use of chatbots is not limited to specific institutions: their use has been noted in banking and other financial institutions as well as in insurance, transportation and health care. However, although chatbots have garnered glowing reviews for the benefits they bring to productivity and profit-making, their frequent anthropomorphisation as female merits close scrutiny because of its impact on perceptions of women, as well as on existing socio-cultural expectations, stereotypes and demands regarding how women are expected to act in society. In a report titled “I’d Blush if I Could” (UNESCO, 2019), UNESCO detailed the potential negative impacts of chatbots and voice-based conversational agents on societal perceptions of gender. According to the report, the proliferation of female-gendered conversational agents was driven primarily by customer preference and a non-critical examination of product development decisions by product teams, which could entrench and perpetuate biases about women today.

Based on these reflections, this chapter will look at the deployment and integration of gendered chatbots in Nigerian institutions and the potential impact of this deployment on Nigerian women. Drawing from statistics about the industries and positions typically occupied and dominated by women, as well as the presence of women in the finance and technology development space, our contribution will evaluate the origins of the preference for these bots and proffer recommendations on ways to curb the negative effects of their deployment.

As chatbots are increasingly being used in the financial sector, this chapter will focus on their use in Nigerian commercial banks, followed by a review of their use in other sectors. In 2018, Nigeria’s United Bank for Africa (UBA) launched the first chatbot deployed by a commercial bank in the country (Eleanya, 2018). This move was precipitated by the growing need to improve financial inclusion and customer experience by simplifying financial transactions (Nelson, 2019).

Since then, chatbots have been used by banks in Nigeria to provide a variety of services, ranging from airtime top-up to account enquiries and customer complaints. In late 2021, almost 50% of the 22 commercial banks listed on the Central Bank of Nigeria’s website (CBN Nigeria, 2021) had deployed chatbots in some shape or form. Although the rationale justifying their deployment says nothing about the gender selection process, it is clear that the logic behind their deployment has largely revolved around the need to promote customer satisfaction, convenience and safety (Adesanya, 2020; Zenith Bank, n.d.) on the part of the banks, in line with global trends.

The following section will cover the methodological approach used to collect the data for this study.

Methodology

Using secondary data collection methods, this study analysed chatbots deployed in commercial banks and other institutions in Nigeria, based on information available on websites such as that of the Central Bank of Nigeria (CBN) and in national newspapers.

This study focuses primarily on the 22 commercial banks on the CBN website. The Supervisory Framework of the Central Bank of Nigeria is structured into two departments: “Banking Supervision” and “Other Financial Institutions”.Footnote 2 Commercial banks, alternatively described as deposit money banks, as well as discount houses, are under the jurisdiction of the former, while financial institutions such as Microfinance Banks, Bureaux-de-Change, Development Finance Institutions, Primary Mortgage Institutions and Finance Companies, which are described as “other”, are the purview of the latter.

Information regarding the presence and availability as well as the features and descriptions of chatbots on the websites of the 22 commercial banks was collected. For the purposes of the research, chatbots available on the websites of the banks, in addition to those deployed on social media platforms such as Facebook, WhatsApp, Instagram and Telegram were considered. It was possible to access chatbots hosted on the latter three through links provided on the social media of the banks, and through phone numbers registered to the chatbots.

The availability or existence of the bots was predicated on live links on the websites of each of the banks, social media posts alluding to them, and news reports. In cases where there were social media posts and news reports alluding to their existence, but no live links were found, for the purposes of this study, such banks were described as not having chatbots.

The analysis was based on three identifiable chatbot cues: name, avatar and a third category, tagged “other descriptor”, for cues that did not fit the first two. These cues were used to classify each chatbot’s gender presentation into three categories: male/masculine, female/feminine and gender-neutral. For the purpose of this study, gender-neutral means that “something is not associated with either women or men”,Footnote 3 as defined by the European Institute for Gender Equality. In this study, the term refers to features that could be borne by both men and women, as well as those which are neither stereotypically male nor female.

In addition to searches conducted on the websites of these banks, the research also sourced information about the chatbots from blog posts, news reports and journal publications, leveraging the information available on these platforms.

Findings—The Use of Chatbots in Nigerian Banks

The research found that 10 of Nigeria’s 22 commercial banks have, currently or in the past, integrated chatbots into their product and service delivery. Their gender presentation was categorised based on one of three cues (Table 1).

Table 1 Table illustrating the three cues used in determining the gender of chatbots

Of these ten, based on the name given to the bot, the gender assigned through pronouns or descriptions on the banks’ social media or official websites, and the graphic depiction of the avatars, seven chatbots are gendered female. These include bots belonging to Zenith Bank, Sterling Bank, First City Monument Bank, Fidelity Bank, Ecobank and Access Bank. On the other hand, Heritage Bank’s chatbot, which was integrated into its Octopus app, was not gendered in any way and merely appeared as a feature of the mobile app, while Keystone Bank’s Oxygen was presented with a robot arm or a stylised “O2”. Leo, United Bank for Africa’s chatbot, was the only male-gendered chatbot on the list.

The breakdown of the chatbots in Table 2, which shows the names, assigned gender and avatar of the bots in the 10 banks, reveals that the majority are female-gendered, designed with stereotypically female names or avatars or both. The three exceptions are Keystone Bank, which has a robot as its avatar and a neutral name; Heritage Bank, which has a neutral name for its bot and no avatar; and United Bank for Africa, whose chatbot has a male-gendered avatar and name.

Table 2 List of Commercial Banks in Nigeria using chatbots as at December 2021

Analysis of Table 2 reveals the following about the chatbot names:

  a. Only one of the names, Leo, is a stereotypically male name. It accounts for 10% of the chatbot names.

  b. 30% of the chatbot names are stereotypically female. These are Kiki, Ivy and Tamada, a portmanteau of two feminine names: Tamara (a Hebrew word for date) and Ada (an Igbo name given to the first female child of a family).

  c. 60% of the chatbot names are gender-neutral. These are Sami (which could be short for Samantha, a female name, or Samuel, a common male name), Temi (a Yoruba word meaning “my own”, which precedes many gender-neutral Yoruba names such as Temiloluwa and Temitope), ZIVA (an acronym for Zenith Intelligent Virtual Assistant) and Rafiki (a Swahili word for friend). Heritage Bank’s chatbot had no name and was simply a feature of its Octopus banking app, while Keystone Bank’s chatbot is named Oxygen, after the colourless, odourless gas.

For the avatars:

  a. 10% of the avatars were male: Leo sported a low cut and wore a T-shirt and jeans, both commonly worn by Nigerian men.

  b. 20% of the avatars presented as gender-neutral. These were Oxygen, often depicted with a stylised “O2”, a robot or a robot arm, and Heritage Bank’s chatbot, which had no avatar.

  c. 70% of the avatars presented as female, wearing makeup, hairstyles and clothes common among Nigerian women. These are ZIVA, Kiki, Ivy, Sami, Temi, Rafiki and Tamada.

Other descriptors

Where chatbots had gender-neutral names but feminine avatars, other descriptors, such as the pronoun “she”, were used to verify their assigned gender.

Based on their names alone, it is difficult to ascertain the gender presentation of some of the chatbots, such as Sami and ZIVA. However, the gender presentation of their avatars and the pronouns used in their descriptions make clear that they are unambiguously gendered female. From our analysis, therefore, 70% of the identified chatbots are gendered female.
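The cue-based classification described above can be expressed as a simple tally. The sketch below is purely illustrative, not the authors’ actual procedure: the cue values are transcribed from the findings reported in this section, and the resolution rule (a gendered avatar, verified against the other descriptor, overrides a neutral name) is our simplified reading of the analysis.

```python
from collections import Counter

# Cue values per bot, transcribed from the chapter's findings:
# (name cue, avatar cue, other-descriptor cue such as the pronoun "she").
bots = {
    "Leo":       ("male",    "male",    "male"),
    "Kiki":      ("female",  "female",  "female"),
    "Ivy":       ("female",  "female",  "female"),
    "Tamada":    ("female",  "female",  "female"),
    "Sami":      ("neutral", "female",  "female"),
    "Temi":      ("neutral", "female",  "female"),
    "ZIVA":      ("neutral", "female",  "female"),
    "Rafiki":    ("neutral", "female",  "female"),
    "Oxygen":    ("neutral", "neutral", "neutral"),
    "(unnamed)": ("neutral", "neutral", "neutral"),
}

def presentation(name_cue, avatar_cue, other_cue):
    """Resolve overall gender presentation: a neutral name is overridden
    by a gendered avatar or other descriptor (checked in that order)."""
    for cue in (avatar_cue, other_cue, name_cue):
        if cue != "neutral":
            return cue
    return "neutral"

tally = Counter(presentation(*cues) for cues in bots.values())
for gender, count in sorted(tally.items()):
    print(f"{gender}: {count / len(bots):.0%}")
```

Run over the ten bots, this reproduces the chapter’s headline figures: 70% female, 10% male and 20% gender-neutral.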

Discussion on Gendered Chatbots

Criticism and support of female-gendered chatbots and other conversational agents have been the subject of various conversations in AI ethics. Support for chatbots has been predicated on their efficiency, with conversational agents forecast to save businesses as much as $7.3 billion by 2023 (Vogeler, 2019). The female-gendering of chatbots has also been pursued to promote their wider acceptance. According to Borau et al. (2021), in specific domains such as health care and self-driving cars, where consumers are loath to trust recommendations and replies proffered by chatbots and other AI systems, the female-gendering of AI systems plays on human perceptions and stereotypes of warmth and friendliness associated with women. This is interesting to note, particularly as the world’s first chatbot was named Eliza.Footnote 4 Footnote 5

However, there are important considerations stemming from the development, use and impact of female-gendered chatbots. The attribution of the female gender to chatbots appears to be predominant, as research analysing the design of 1,375 chatbots showed (Feine et al., 2019). The trend was especially visible in customer-facing sectors such as customer service, sales and brand representation. Women operating in these fields, or in fields requiring them to operate in the public eye, often face intense scrutiny while contending with stereotypes about their personality and appearance. For instance, when asked to describe two female presenters, one of whom spoke in a more stereotypically female voice than the other, respondents classified the former as less intelligent and trustworthy but more empathetic and warm than the latter (Voelker, 1994). Long-held social stereotypes and assumptions regarding women such as these may inform the creation of female-gendered conversational agents, which in turn may inform and influence social norms regarding women’s capacity and nature. The descriptions used by banks and blog writers for the bots’ roles and personalities are also telling of societal biases about the roles persons in the service industry, where women are well represented (PWC, 2020), are expected to play. In addition, they reveal societal stereotypes about the expected behaviour of women, who are assumed to be solicitous, polite, always available and good at communicating, hence their customer-facing roles. For instance, in a bid to encourage its use, Sterling Bank’s Kiki is said to be one to never “air” (slang for ignore) messagesFootnote 6 (Sterling Bank Plc, 2020). This suggests an existing belief that female-gendered chatbots are just as good as, or better than, women at communicating, a field considered well-suited to women. This belief underpins hiring and training practices that have seen women relegated to soft, feminine roles and excluded from those considered hard and serious, such as programming or engineering, which are deemed better suited to men. In essence, such gendered categorisation essentialises women’s position as good communicators or, even more concerning, sexualises women in so far as female chatbots may be used to attract more customers or grow an institution’s profit through their soft, feminine voices and looks. It thereby ignores serious political questions about deploying AI for social good, ethically and responsibly, and as a technology that can play a role in empowering women.

A 2021 report from the International Finance Corporation (IFC)Footnote 7 (International Finance Corporation, 2021), which assessed the workplace gender equality policies of the 30 most capitalised companies listed on the Nigerian exchange, revealed that although Nigerian women made up one-third of the workforce, this number lagged behind the global average by 5%. In addition, according to the report, women’s representation at the highest leadership levels ranged from 20% to 27%, in line with global trends of 17% to 25%. At the managerial level, in the Central Bank of Nigeria and amongst the five biggest Nigerian banks, namely First Bank, United Bank for Africa, Guaranty Trust Bank, Access Bank and Zenith Bank, collectively known as FUGAZ, 80% of the appointed Executive Directors are male (Ushedo, 2021). While it is difficult to establish the precise number of women in the banking sector due to the absence of gender-disaggregated data from the National Bureau of Statistics,Footnote 8 Nigerian women are well represented in the service industry, of which the financial sector is a part, although they are underrepresented in management positions in the same sector. There is also a corresponding dearth of women working in ICT. According to UNESCO’s report, which details the general preference for female-gendered chatbots and other conversational agents, “a related or concurrent explanation for the predominance of female voice assistants may lie in the fact that they are designed by workforces that are overwhelmingly male” (UNESCO, 2019). Another IFC reportFootnote 9 revealed that just 18% of the total developer population in Nigeria were women.
The homogeneously male nature of technology talent and the overrepresentation of women in the service industry, considered alongside the prevalence of social stereotypes about the role of women, therefore point to a direct relationship with the ubiquity of female-gendered chatbots in Nigerian financial institutions. The reinforcement of gender difference in technology is further seen in Green’s (2002) work, where she states that as more women work in net-based professional settings, men are creating niche areas of internet-focused activity such as security. Green (2002, p. 188) describes such areas as “the ‘masculinised’ technocultural domain: which command high fees”. As a result, women are more likely to work in less high-profile areas. This kind of occupational segregation can be observed in work carried out by Kotamraju (2003) in the US, where she discovered that even though web design employs both men and women, they work in different areas, noting the division between graphic designers, who did design and layouts, and programmers, who developed software. The graphic designers, who were mostly women, were poorly paid, while the software development team was on higher rates, despite both groups being considered web designers and depending on each other to achieve shared company goals. Hafkin (2006) confirms this finding, revealing that there are far fewer women who are systems analysts and programmers, and even fewer working as software and hardware engineers.

While the link between the performance of chatbots and their assigned genders remains unclear, there is some evidence that developers often imbue conversational agents with specific ideas and feelings flowing from their personal perceptions of how their creations should work. In the case of Cortana, Microsoft UX Lead Jonathan Foster had this to say: “we continue to endow her with make-believe feelings, opinions, challenges, likes and dislikes, even sensitivities and hopes. Smoke and mirrors, sure, but we dig in knowing that this imaginary world is invoked by real people who want detail and specificity” (Foster, 2019). But who are the real people who want detail and specificity? More likely than not, it is the creators and developers themselves, as Foster (2019) pointed out. Therefore, as the UNESCO (2019) report indicates, designers and developers are overwhelmingly male and embed their own values and sense of what a female chatbot should sound or look like. As such, there is a strong argument for examining AI beyond what developers and designers may perceive as its neutrality, by looking more closely at its negative impact on women.

The gendering of chatbots can have the unintended impact of entrenching and reflecting unfounded biases about the capabilities and abilities of both genders. Such a situation could occur when male-gendered bots are perceived to handle requests more efficiently than female-gendered ones. This has been noted in perceptions of United Bank for Africa’s Leo, Nigeria’s only male-gendered banking chatbot, which has severally been tagged the smartest banking chatbot in Nigeria (Moses-Ashike, 2021; Nweze, 2021). It could also be the case where male-gendered chatbots are deployed in typically male-dominated settings. Complications posed by the former can be likened to what feminist researchers have tagged the glass escalator, a term coined to describe “the advantages that men receive in the so-called women's professions (nursing, teaching, librarianship, and social work), including the assumption that they are better suited than women for leadership positions” (Williams, 2013). The implications of this rather niche variety of technology-facilitated gender-based injustice make female representation in digital spaces worth considering.

Similarly, the deployment of male-gendered bots to perform roles typically carried out by men also embeds conventions about the types of persons most suited to those roles. This is the case with Translators without Borders’ Shehu, a chatbot designed to facilitate understanding and answer questions about COVID-19 in north-eastern Nigeria. Shehu is a multilingual bot that speaks English, Hausa and Kanuri, commonly spoken languages in the region (TWB Communications, 2021b). According to the Translators without Borders website, the word “Shehu” “is an official title for a scholar, and refers to someone learned and knowledgeable”. However, while by definition Shehu appears to be gender-neutral, and is described using the gender-neutral pronoun “it” (TWB Communications, 2021a), it is both a title and a name typically borne by male scholars and male children. Furthermore, Shehu is visually depicted as male, wearing a traditional cap typically worn by men as part of native attire. The combination of these features is arguably sufficient to determine that, in spite of the gender-neutral definition of the name and the use of gender-neutral pronouns, Shehu is in fact designed to be male. It projects the appearance of a strong, intelligent and authoritative male figure, aligning with existing socio-cultural perceptions of Islamic scholars, who are typically male and rarely female.

The deployment of gendered chatbots in African communities is also worth considering, as their use could introduce and, in some cases, further complicate gender relations in a manner akin to the impact of colonialism on indigenous women’s rights, which were eroded in colonial and postcolonial societies. Women’s rights scholars have noted that, in an attempt to undermine existing social hierarchies and structures, the activities of women’s groups were suppressed, resulting in the removal of institutions such as women’s support groups and chieftaincy titles (Alapo, 2014). Even Western education, typically appreciated for its role in emancipating women from traditional oppression, did not always have this result, as colonial education emphasised preparing and training women for domestic roles rather than leadership within society (Okome, 2002). Already, several studies have highlighted the many ways technology today is a tool of neocolonialism reminiscent of colonial extractivist activities (Iyer, 2018). AI-powered chatbots could therefore introduce and impose new forms of gendered expectations upon women. Many chatbots today are already marketed on the premise of an ever-ready, ever-available, polite assistant. For instance, First City Monument Bank’s Temi is given the following description: “Hi! I’m Temi, your personal person. I'll always have time for you any time of the day. Ready to discuss your plans be it health, travel or even future goals. The good news is, I get things done and I'll never reply to you with a ‘k’”.Footnote 10

Customer expectations also highlight the demand for a chatbot that is more responsive and always present, an expectation said to be defeated by having to deal with customer service representatives on bank websites (Abdulquadri et al., 2021). Researchers have also noted the use of online platforms to spread both novel forms of technology-facilitated violence and technologically-assisted variants of existing ones, as the divide between online and offline spaces decreases (Henry and Powell, 2015).

Recommendations

Amid the current surge in the use of AI-enabled chatbots, it is crucial to question their necessity in a country where the unemployment rate rose to 33.30% in the fourth quarter of 2020 from 27.10% in the second quarter. Techno-chauvinism, exemplified by the Silicon Valley-esque “move fast and break things” approach that has defined the uncritical development and deployment of AI systems and technology generally, has been severally implicated in the decision of Nigerian banks and other institutions to integrate chatbots into service delivery rather than simply hiring more staff. In her book, Broussard (2018) defines techno-chauvinism as the belief that technology is always the solution. In Nigeria, customer service at financial institutions is notoriously poor and often the subject of numerous social media complaints (Benson, 2018). This dismal state of affairs may be explained by hiring practices in the industry. According to the National Bureau of Statistics,Footnote 11 approximately 42% of bank employees were contract staffers in 2020, and only 95,026 persons were employed throughout the industry, down from 103,610 the previous year.Footnote 12 Judging from this, a more worthwhile response to the challenges of a growing customer base might be to channel resources towards employing and training more staff rather than the costly pursuit and development of chatbots to improve service delivery.

Corporations that have deployed gendered chatbots and other conversational agents have often justified the move by citing the need to innovate in line with customer preferences and expectations for female bots, which are associated with personality traits such as humaneness and warmth (Guo et al., 2020). A suitable response may be derived from similar arguments in Science and Technology Studies (STS), where rich criticism exists of racialisation and whiteness in AI development, as well as of demeaning depictions of people of colour in science fiction (Sparrow, 2020). Some STS scholars have suggested that one way to address such troubling depictions is to deracialise robots, designing future robots or their avatars with blue or green skin. These suggestions have been countered by research showing that the presence of racial and ethnic minorities in media is, in some cases, important for representation. In their commentary, Cave and Dihal (2021) mention that racialised bots might help build trust and increase interactions, particularly for and between marginalised groups. Applied to gendered chatbots, it is an argument worth considering that gendered chatbots are useful for representation and for imbuing a sense of care in the attitudes of engineers and developers towards the final product. However, there are serious concerns about the underlying profit motives of corporations compelled to centre customer needs and play to societal stereotypes and expectations at the expense of the dehumanisation of women. Moving forward, product development teams need to reflect on the potential cultural harm gendered bots may pose to society, and to consider adequately whether bots must be gendered at all during the development and iteration process. To achieve this, there is a need to develop guidelines for the gender-equal design of chatbots that will help engineers diminish possible gender stereotypes that could otherwise become embedded in the process.

Another option worth considering, as argued by Cave and Dihal (2021), is the use of AI to subvert stereotypes. Subverting stereotypes could be achieved by creating female bots or avatars for positions and roles in which women and even men are traditionally underrepresented. A great example of a male bot performing extremely well in a female-dominated field is United Bank for Africa’s Leo. It is possible to change or subvert the narrative about female capacity by depicting female-gendered bots breaking stereotypes regarding their capabilities without further entrenching gender biases.

For women to be active participants in AI, and to have a sense of its social and economic potential, much lies in the inclusion of their experiences and needs in technology policies (UNICTTF, 2002; Jorge, 2006). Jorge (2006, p. 74) identifies “gender-specific projects and programs, regulations that facilitate affordable access to women and the poor, establishment of universal access programs targeting women, licensing regimes that favour companies with gender-equality policies, and programs that consider women’s needs and realities” as crucial considerations in technology policies if women are to be part of the technology development process. The problem, as Jorge concedes, is that although policies may mention gender-equality concerns, in most cases they are not followed through at the regulation and implementation stage and thus remain merely desirable add-ons.

On the part of the government, there is a need to develop gender-inclusive policies which prioritise inclusiveness in product and service delivery. On the part of banks, this would mean reflecting inclusiveness and respect for female employees and customers, reaffirming their commitment to guidelines such as the Nigerian Sustainable Banking Principles 2012 (CBN Nigeria, 2012) and including women across the various departments of these institutions. Such inclusion will require a political change in attitude, because it calls for a reflection of, and consequent change in, power relations in order for women’s needs, aspirations and interests to be realised. For this to happen, women’s situation needs to be understood as a linkage between women’s human poverty, globalisation and gender inequality. Chatbots, therefore, need to be designed, developed, implemented and used in a gender-sensitive way that sees women as equal, not categorised as less than or other to men.

Lastly, there is a need for critical discussion of gendered chatbots and societal perceptions. This requires ongoing interaction between researchers, practitioners, developers and users to address pertinent questions such as promoting diversity amongst chatbot developers, identifying gender bias in chatbot development, avoiding “female-by-default” chatbot designs and attending to organisations’ ethical responsibilities.

Conclusion

Resolving technology-facilitated violence perpetuated and engendered through the use of gendered chatbots requires combined effort from various stakeholders, including government and private industry: a multi-pronged approach comprising awareness campaigns, training, research and policy development, with close attention paid to the implications of chatbot development.

Furthermore, solutions cannot be implemented without igniting conversations about technology and Artificial Intelligence ethics in the technology space. To change the narrative, greater discussion is needed about how the dearth of critical Nigerian technology studies and research on societal biases and stereotypes, together with the huge gap in the number of women working in this space, has occasioned the misuse of technology. These conversations will create environments where product and engineering teams can work with researchers and individuals in the third sector to develop more humane, gender-responsive and respectful technology.