Conceiving a questionnaire raises many issues of methodology and content. My purpose was to highlight the relation between sex and gender, as biological, cultural and symbolic frames, and the development of technological futures. After attending lessons and developing a dialogue with the students I was going to interview, I realized that most of them were not familiar with Gender Studies or Feminism. I consulted with Professor Warwick; we agreed that the best results would follow from formulating the questionnaire in the most direct and accessible way. Although aware of the postmodern and queer criticism of the traditional female/male binary, the questionnaire employs it as a cultural and symbolic reference, which should in no way be understood in an essentialist manner. I would also like to note that race and ethnicity were directly addressed in one question only; a much deeper investigation is still needed in this particular respect. Here, I wish to clarify that I will not offer a sociological analysis of this survey. Instead, this article relies on the empirical data in order to develop a cultural discursive platform to reflect upon “the seeds” of the futures which are in the present, to go back to Masini. Of the four approaches crucial to foresight, as outlined by Sohail Inayatullah, my approach takes up the second and the third, that is, the interpretative and the critical approaches, and will not delve into the first (predictive) or the fourth (participatory action).
The questionnaire comprised eleven questions, administered to first-year students, third-year students and Ph.D. candidates, and answered by more than one hundred interviewees at the Department of Cybernetics, University of Reading (England). As displayed in Figs. 1 and 2, the respondents were mostly male, reflecting the current gender ratio of the students of the Department, as well as the predominant gender of the students enrolled since the beginning of the Program in 2004. The average age was in the early twenties. The prevalent ethnicity was English Caucasian, but a considerable number of students had different ethnic and national backgrounds. Note that, here, I will only focus on the results related to seven of the eleven questions, in order to concentrate on the crucial topics which surfaced. However, I am including the complete list below for scientific transparency. Minor differences in wording applied depending on whether the questionnaire was submitted to first-year students, third-year students or Ph.D. candidates.
When you think of a cyborg, do you think in terms of he/she/it/none?
When you think of a robot, do you think in terms of he/she/it/none?
Do you think gender has any role in the production of AI?*
Do you think there is any difference if a robot is conceived by a male or by a female scientist?*
Do you think of gender as a significant category in the future?*
Do you think that the new interaction between humans and AI will change the gender balance?*
Do you think that one of the two biological sexes will be more advantaged by the creation of AI?*
Would you consider it relevant to address gender in any of the academic courses related to AI?*
Can you think of any experiment in AI where the gender difference would be valuable?
Do you think concepts such as race and ethnicity will be significant in the development of AI?*
Why are you interested in Artificial Intelligence?
*Questions 3/4/5/6/7/8/10 were followed by the prompt “Can you briefly explain why?”, to provide qualitative as well as quantitative data. This is the reason why the next section, based on questions 1 and 2, does not present open answers. For all the other sections, I will quote the comments which were most common or most original, in order to maximize the understanding and use of the results to reflect upon the “seeds” of the futures.
Cyborgs and robots
While posing these two questions, I wished to unveil the gendered terms in which the students were thinking of their projects. The results of the questionnaire placed a clear emphasis on male characters: while the cyborg was thought of as neutral or male by the large majority, out of more than one hundred interviewees, no one thought of robots in feminine terms, as we can see in Figs. 3 and 4. The historical and cultural dimension of technology is a crucial issue, when it comes to a proper understanding of such an unbalanced result. Science and technology are not only performed, they are first imagined. In Albert Einstein’s words: “Imagination is more important than knowledge. Knowledge is limited. Imagination encircles the world”. Envisaging the future does not create the future per se, but it may influence the way people perceive it, and ultimately perform in the actual constitution of reality. In the words of Masini:
“Visions are linked to people who carry the seeds of change, and are not mere abstraction. The ability to nurture the seeds of change and develop visions is even more important than the capacity for future analysis”.
Imagination is not separated from cultural, social and political contexts, although it can transcend them. Alison Adam, in her extensive work “Artificial Knowing” (1998), provides a sustained critique of AI, arguing that “the knowing of women (…) is left out of AI’s thinking machines”. If the genealogy of knowledge silently informing AI is reduced to a male legacy, social exclusivism and biological essentialism may be re-inscribed in its ontology, with the consequent risk that the difference characterizing robots may be absorbed into human-centric practices of assimilation; in parallel, it may turn into a stigma for new forms of discrimination based on how far such a difference can be placed from the human norm. Posthumanism, the Philosophy of Sexual Difference, Feminist Epistemology, Subaltern Studies and Intersectionality, among other critical frames, offer crucial insights on how to develop empathic approaches in the interaction with different forms of known and hypothetical entities. Such standpoints, arising from the “others” of the traditional subject of the Western hegemonic discourse, deconstruct the theoretical necessity of the symbolic other/the mirror/the speculum, offering crucial hermeneutical tools in dealing with the singularian multiplication of onto-epistemological differences.
Feminist epistemology and AI
In the Nineties, the feminist debate on science produced outstanding approaches, labelled under the encompassing term of Feminist Epistemology. Standpoint Theory [25, 27], which arose amongst theorists such as Dorothy Smith, Donna Haraway, Sandra Harding and Patricia Hill Collins, emphasizes the starting point of knowledge production. Each human being views the world from a specific standpoint, which is informed by their embodiment, social and cultural structures, religious beliefs, and location in space and time, among other factors. Within this frame, the pursuit of disembodied neutral objectivity, traditionally claimed by scientific practice, is seen as a rhetorical move which has historically benefited those who claimed it. Technology and science are not free from sexist, racist and Eurocentric biases; their social construction is embedded in their methods and practice. Objectivity, on the other hand, is situated and embodied; in Haraway’s words: “Feminist objectivity means quite simply situated knowledges”. Since marginalized and/or oppressed individuals and groups must learn the views of those who belong to the hegemony, while those located at the center of the hegemonic discourse are not required to learn about the margins, the former can be considered bicultural, and their perspectives may be seen as more objective. This specific claim developed into the notion of “strong objectivity”. Feminist Epistemology sets the constitutive frame for the development of posthuman epistemological approaches. The formulation of questions 3 and 4 was informed by these theories. Before proceeding further, I would like to remind the reader that, from question 3 onwards, the questionnaire also included open answers; some of these will be quoted below.
The results were mixed, displaying a variety of perspectives, as shown in Figs. 5 and 6. Some of the reasons given by respondents as to why they answered “Yes” are: “More males seem interested in AI” and “Robots made by females will probably look nicer”. The first answer exposes a crucial aspect which has already been addressed in this article. The second emphasizes design as one of the markers of the gender difference in technology. This viewpoint, which is very common, has received a number of criticisms from feminist thinkers. Linda L. Layne, for instance, presents a specific example to make her point: when some manufacturers realized that they had designed their phones for men, and not for people, they simply thought about altering the design. Borrowing Genevieve Bell’s colorful definition, Layne refers to this as the “shrink it and pink it” approach: when it comes to including gender in new technology, the first impulse is simply to change the colors to more vivid ones. On one side, such an attitude can be perceived as a reduction and an assimilation; on the other, it is important to note that design is crucial in the reception of technology by users - think about the centrality of notions such as accessibility and usability in the making of technology - and that a color change is not a neutral passage when assessed within the frame of psychological and socio-symbolic dynamics.
Another answer to question 3 was: “When machines become more autonomous and can more clearly define their identity, gender might be important because society might find it easier to accept them”. Such a reflection emphasizes gender identity as a social code which will outlast its biological legacies. Let me explain this further. Gender has historically been constructed around sexual difference; yet once no biological or sexual motives are connected to the genders of robots, gender proceeds purely in its hermeneutical vestiges. In other terms: even if sex has no biological or physiological significance for robots, gender - its cultural apotheosis - will still be valuable for humans (at least in the near future), in order to relate more easily to our robotic significant others. In their series of experiments, Clifford Nass and Youngme Moon have illustrated how people tend to relate to computers in the same way they would relate to other humans, including keeping gender stereotypes and biases untouched when the machine is given a female or a male voice. To put humans at ease with robots, roboticists apply features which have no function other than reception. For instance, the simulation of emotion through various facial expressions, vocalizations, and movements by the robot Kismet was performed for the sole purpose of engaging the human audience. The range of affects involved in human/robotic interactions is a subject of ongoing research in different fields: from Robopsychology, a specific form of psychology applied to robots, to Affective Computing, the branch of computer science focused on the development of artificial emotions. Philosophically, these fields of enquiry are related to the contemporary interest in the Affective Turn, which, developed out of Spinozian reminiscences, focuses on how affects affect the social, political, economic and cultural realms, and their affective relations.
Let’s now focus on the respondents who answered “No” to this question, who offered a variety of interesting insights. For instance: “I don’t think AI is exclusively the pursuit of replicating human intelligence and therefore is free of the boundaries of gender difference”. AI is another type of intelligence, and it should not be reduced to the human range. Kevin Warwick has elaborated at length on this aspect; in his view: “We need a viewpoint on AI that is much less anthropomorphic than the classical AI”. To clarify what Warwick means by this, we have to recall the human-centrism of classical AI, for which the final prototype of intelligence is human intelligence. Another simple and direct answer was: “It can be thought of as related to a toaster: a machine needs no gender”. The ones who might still need gender attributes are the humans, in order to better interact with the machine. I would like to quote one last “No” response to question 3: “No AI would ever be able to produce sperm nor knit a baby in the womb”. This observation leads to a reflection on the sexual interaction between humans and robots. David Levy, for instance, thinks that humans will be marrying robots in the near future. The fact that no biological reproduction will result from such an exchange may be seen as unproblematic by many: already at present, numerous human couples cannot, or decide not to, procreate.
This is one of the answers responding to “Maybe”: “I feel more women should be involved in the development of AI tools. I feel men in AI are obsessed with ‘creation’, whereas, because women give birth, women in AI are more concerned with building effective tools which enhance humans”. This perspective offers an interesting twist on common biases about female scientists. Their ability to procreate is not seen as an obstacle which might cause them to give priority to building a family instead of pursuing scientific research, as a widespread prejudice holds. On the contrary, such a capacity is presented as an epistemological advantage, which may allow women to focus on creating “effective tools which enhance humans”, rather than trying to guarantee themselves a symbolic progeny through their research. This reflection implicitly refers to Moravec’s “Mind Children”, in which he states:
“Unleashed from the plodding pace of biological evolution, the children of our minds will be free to grow to confront immense and fundamental challenges in the larger universe. We humans will benefit for a time from their labors, but sooner or later, like natural children, they will seek their own fortunes while we, their aged parents, silently fade away”. 
Such an oedipal view, sustained by the dualism “us/them”, fails to include concepts such as empathy or care, which characterize the parent/child relationship in the history of affection. Some feminist theorists have elaborated on this recurring metaphor. Adam, for instance, remarks on the notion of “playing god in the creation stories of the artificial A-Life worlds”. From a psychoanalytical perspective, it can be suggested that a womb envy may be motivating this type of researcher.
Question 4 received a slight predominance of “No”, followed by “Maybe”, and lastly by “Yes”. Among the answers motivating the “Yes”, one of the respondents wrote: “A robotic fridge that targets people and throws beer to them is far more likely to be a male invention. So gender can affect the purpose of a robot”. Even though this example might seem trivial, I would like to briefly reflect on it. The relation between inventions and inventors is not easily predictable, but it is still sustained by context and experience. Layne, for instance, remarks on how “the life experience of a designer informs every aspect of design, including problem identification and selection”; consequently, “it is more likely that feminist technologies will be designed by women” (ibidem). Before moving to the next question, I will quote two more answers, one formulated on the “No”: “People like to revolve around standardized robots”; one on the “Maybe”: “Depends if the scientist sees differences in gender roles. This difference may unknowingly come out in their work”. While the former reflection underlines the importance of establishing a common code which humans can employ to interact with different kinds of robots, the latter stresses the urgency for scientists to situate themselves, in order to be aware of the limitations that their standpoints might bear.
Postgenderism refers to a hypothetical phase of the future during which human sexual difference might be voluntarily overcome through the application of advanced biotechnologies. Although the term first appeared in “A Cyborg Manifesto” (1985), Donna Haraway has stated: “I have no patience with the term ‘post-gender’. I have never liked it”, as she explains:
“Gender is a verb, not a noun. Gender is always about the production of subjects in relation to other subjects, and in relation to artifacts. (…) Things need not be this way, and in this particular sense (…) I approve of the term ‘post-gender’. But this is not ‘post-gender’ in a utopian, beyond-masculine-and-feminine sense, which it is often taken to mean”.
I am offering a brief genealogy of the term because, although its semantics might suit the reflections which led me to conceive question 5, its pragmatics do not comply with them; in fact, the current narratives developing the term mostly fall into a techno-reductionism which does not take into account the cultural and social ramifications of gender identity. In the future, gender will most likely evolve into something different, and thus create a “post”, which does not imply obliterations, assimilations or neutralizations. Such an evolution might well produce a multiplication of genders, not necessarily related to the feminine and masculine archetypes. The answers given by the students were mixed, reflecting the number of possibilities opened by such a question.
One of the responses given to motivate the “Yes” was: “It will remain as significant as it has always been, but individuals will have more choices as to whether they want to be identified as male or female”. This answer points out a constitutive aspect of virtual reality. The possibilities related to experimenting with different digital identities, and specifically, to gender-role playing, have been widely discussed by Cyberfeminism since the Nineties, highlighting both their potentials and their limits. For instance, in her book “The War of Desire and Technology at the Close of the Mechanical Age” (1995), Sandy Stone elaborated on the case of “Julie”, a man who created a well-respected female identity online: the negative reception his “true” identity was met with by other online participants demonstrated the gap between social expectations and the possibilities inscribed within the virtual realm. More generally, on the relation between identity and technology, it is interesting to observe the development of the thought of Sherry Turkle, one of the pioneers focusing on the sociology and psychology of the growing impact of virtuality on the constitution of human identity. Her reflection evolved from the enthusiastic “The Second Self: Computers and the Human Spirit” (1984), in which she pointed out how computers cannot be seen as external tools, but are part of the social and personal life of their users, through “Life on the Screen: Identity in the Age of the Internet” (1995), in which she argued that computers affect the ways humans see themselves as humans, to “Alone Together: Why We Expect More from Technology and Less from Each Other” (2011), in which she argues that social media represent more of an illusion of companionship than authentic communication.
Back to our questionnaire, let’s present two more answers given to motivate the “Yes”: “As logic and emotion develop in machine learning I believe gender will have a stronger influence”, and “The ‘gender’ of an AI would affect how humans interact with it and thus it would become significant”. The role of gender is reaffirmed both for machines, in their process of identity formation, and for humans, in their interaction with the machines (Figs. 7 and 8).
Consider some of the following quotations from the respondents who answered “No”: “I would hope that over time, sexism and gender stereotypes will disappear”; “As it becomes more and more common to design ourselves (think what plastic surgery will be like in 50 years) or to abandon our original bodies entirely (mental uploading etc.), gender will become obsolete”. The term “obsolete” recurs in posthumanist and transhumanist literature, and needs a brief genealogical introduction. The first person to employ it in such contexts was the Australian artist Stelarc, who notably stated on various occasions: “the body is obsolete”. In his text “From Psycho-Body to Cyber-Systems: Images as Post-Human Entities” (1998), he explains: “It is time to question whether a bipedal, breathing body with binocular vision and a 1400 cc brain is an adequate biological form”. He has gone so far as to propose a “Third Life” [36, 61], where the Second Life formula of biological bodies extending their potentials through avatars will be reversed: in “Third Life”, avatars will be performing in the physical realm through various biological bodies. Warwick himself has echoed Stelarc, referring to the possibility of developing a technology which will make telepathy possible: “Speech, as we know it, may well become obsolete”. I will conclude this section by mentioning one of the “Maybe” responses: “Technology will eventually level the gender difference with regard to abilities and chances, but opinions need to change first”. Technology is a constitutive aspect of the human: its achievements are not separated from the social and cultural contexts in which they are generated and employed.
When I formulated this question, I was intrigued to learn what the respondents thought in regard to the advantages brought about by their research in gender terms. This is an aspect which is hard to foresee, as Layne remarks: “Some feminist technologies are feminist by accident; that is, the benefit for women is an unintended consequence”. The most common answer submitted was: “I don’t know”, followed by “Male”, and then “Female”. The following reason was offered by one respondent who answered “Male”: “Female's tasks usually have to have a flexible approach and hence are difficult to ‘automate’ ”. The same point can also apply to the opposite view. For instance, Genevieve Bell, while working as an anthropologist for Intel, recalls her surprise when, doing research on early adopters of Wi-Fi and wireless technology, she discovered that women were in fact the early adopters. She identified the reason specifically in such a flexible approach, and in the fact that women’s lives are generally characterized by larger amounts of multi-tasking. Among the other answers given to this question: “There are more male engineers working on this field” and “Most major breakthroughs are supported by military funding: most armed forces are made up primarily of males”. The latter observation emphasizes a crucial aspect not yet touched upon. Military funding has had a key role in scientific research since the early 20th century, starting with World War I and increasing massively with World War II. Computer science was almost entirely funded by the military in the first decades of its development. As of today, AI programs are still largely funded by defense money, which has contributed, for instance, to the widely expanded military use of unmanned aerial vehicles (UAVs, commonly known as “drones”) in the last decade, along with controversies about the growing number of civilian casualties caused by them.
Following are some of the reasons given by the people who answered “Female”: “Women live longer than men and so will need to be cared for more at old age”; “Females have higher incidence of Alzheimer disease”. Both answers resonate with the fact that much research is currently being invested in developing robots capable of assisting with activities of daily living. For instance, Pearl was developed at Carnegie Mellon University in 2004 as a nursebot that could help the elderly at home. From a gender perspective, it is worth noting that Pearl was given a female persona, and that part of the scientific challenge was “studying people’s responses to a robot’s perceived gender by changing Pearl’s lips and voice”. The role played by aesthetics was crucial in developing Pearl, and it may well be seen as decisive for any robot built for social purposes. Another answer to question 7 was: “Robots with AI can do all of the housework which is predominantly done by women”. House-bots have actually proven to be harder to develop than expected. One of the reasons commonly given is that housework is more resistant to automation because it is characterized by constant interaction with different objects of unpredictable shapes; on the contrary, the assembly line in a factory, for instance, consists of repetitive work accomplished with the same type of objects. From a feminist perspective, such a slow advance may be perceived as the result of a lack of interest in developing technologies which would comply with tasks traditionally done by women. Nowadays, the increasing number of single men and of the elderly population in the Western world has made such a commitment a priority, with successful results such as Roomba, the autonomous robotic vacuum cleaner commercialized by iRobot since 2002.
Races and ethnicities
There is no gender separated from race, ethnicity, age, sexual orientation, and many other social and individual differential categories, as the intersectional approach has pointed out. Since I did not have the opportunity to elaborate on this aspect throughout the questionnaire, I decided to pose one question specifically on the subject of race and ethnicity. A problem I immediately faced was scientific terminology. In Europe the term “race” has not been reappropriated the way it has been within the US academic debates of the last decades, where the social construction of the term is a given which does not have to be remarked upon each and every time. Because my research was pursued at the University of Reading (England), I decided to include in question 10 both notions of “race” and “ethnicity” - the latter is often employed in European political discourse to avoid racist connotations, thus risking, on the other hand, silencing the issue of racism itself. I would also like to stress the fact that, within a posthumanist frame, race and its intersections with gender, class, and other categories have yet to be fully addressed (Fig. 9).
The responses given by the students were mixed. These are some of the answers motivating the “Yes”: “Advanced AI (one that could beat the Turing Test) will need to have some degree of culture associated with ethnicity”; “The assumed personality of the AI will affect its reception by certain social groups”. As in the case of gender, race is perceived as significant in its hermeneutical role. Humans relate to AI through human knowledge, which is structured through categories and beliefs. As Michael Omi and Howard Winant have pointed out:
“Everybody learns some combination, some version, of the rules of racial classification, and of her own racial identity, often without obvious teaching or conscious inculcation. (…) Race becomes ‘common sense’ - a way of comprehending, explaining, and acting in the world”. 
Far from being immune to these unwritten laws, science has actually played an active part in directing and legitimizing them: for instance, in the 19th and 20th centuries, the scientific claim of racial superiority was popularized by what would later be defined as Social Darwinism [22, 31]. Some other “Yes” answers remarked on the risk of ethnic and economic disparities being perpetuated: “The robot body will also be provided with voices and accents which will probably be American” and “The subjects of countries (the richest one) will get first access to these technologies”. The limits of technology in terms of accessibility have been pointed out by postcolonial and posthumanist theorists. Katherine Hayles, for instance, notes how “the techno-ecstasies found in various magazines” refer to “the transformation into the posthuman as if it were a universal human condition when in fact it affects only a small fraction of the world’s population”. It is also important to stress that the ethnic features given to the robots (for instance, “voices and accents which will probably be American”, which I would rephrase as “white American”) represent a form of neo-colonization that should not be underestimated.
The following answers were articulated on the “Maybe”: “Human-like robots will look like the country they have been created, e.g. in Japan they look and speak Japanese”; “Intelligence may be defined and seen differently depending on race and culture. Hence when AI is developed, the way of understanding it will be very different”. Humans relate to AI through human categories of comprehension, but these same categories may differ, depending on cultures, nationalities, and social, political and religious backgrounds. For instance, in 2010 Japan hosted the first wedding conducted by a robot priest. Naho Kitano, in the article “Animism, Rinri, Modernization: the Base of Japanese Robotics” (2007), associates such open-mindedness about the spiritual relevance of robots with the animist component of Shintoism. As early as 1974, Masahiro Mori, one of the Japanese pioneers of Robotics, presented robots as spiritual beings eligible for attaining buddhahood. Cultural beliefs play a crucial role in the reception and development of advanced AI, so that, while in the West robots are portrayed as the new “other” which might rebel and try to take over the world, like the golem in Jewish folklore or Mary Shelley’s Frankenstein, in Japan they partake in the spiritual quest. Some of the answers formulated on the “No” were: “Market must be international! They won’t spend fortunes with any ethnic limitations”, and “Race and Ethnicity are very abstract concepts. There have always been males and females. Borders and religions always change”. The former response underlines the centrality of economic profit in scientific developments. The latter points out the fact that race and ethnicity are not fixed notions, but are always changing, resonating with Omi and Winant’s view of race as a fluid and dynamic social construct. At the same time, this answer presents gender in a static way, while the concepts of “female” and “male” are constantly performed and re-enacted.
Such results highlight the need for a deeper investigation into the topics of race and ethnicity and their intersectional significations in the development of technological futures.