
Anticipating Sex Robots: A Critique of the Sociotechnical Vanguard Vision of Sex Robots as ‘Good Companions’


Abstract

A number of companies have started to develop humanoid robots that (1) bear some physical resemblance to human beings, (2) have some ability to initiate movements (e.g. blinking, head-turning, gyration) and (3) possess some AI functionalities enabling quasi-intelligent environment-responsiveness and linguistic expression. The robots I am speaking of are sex robots. A promise frequently voiced by sex-robot developers (and some academics) is that sex robots will be “good companions” who can enrich and transform the romantic lives of human persons, particularly those who – for various reasons – have trouble entering into traditional love relationships with other humans. Curbing this technological enthusiasm, many philosophers have offered more critical anticipations of sex robots (sex-robot-anticipation hereafter) and the idea that they can, will, or should become good companions to human users. While these critical sex-robot-anticipations alert us to some of the potential harms that may follow from a proliferation of sex robots into society, the overarching aim of this chapter is to show that, by and large, these anticipations fall short. Specifically, they continue to frame the anticipation of sex robots around the question of their potential as good companions. In doing so, I argue that many sex-robot-anticipations at best marginalise key ethical questions pertaining to our potential future with sex robots. At their worst, these sex-robot-anticipations inadvertently contribute to the potential realisation of precisely the technology they are worried about. In the process of critically engaging with much of today’s philosophical sex-robot-anticipation, I will introduce two criteria I take to be of central importance for good sex-robot-anticipation: what I call ‘reflective anticipation’ and ‘technological groundedness.’

This work is part of the research programme Ethics of Socially Disruptive Technologies, which is funded by the Gravitation programme of the Dutch Ministry of Education, Culture, and Science and the Netherlands Organization for Scientific Research (NWO grant number 024.004.031).


Notes

  1.

    Some have suggested that acts of anticipation should be replaced with other ways of coping with the (unforeseeable) consequences of emerging technologies (e.g. Van de Poel, 2016).

  2.

    My focus on technological functionalities looks primarily at how digital technologies such as deep-learning AI, Bluetooth and the Internet of Things enable the robot to behave as it does. What I am largely leaving out is a discussion of the technical specifics of sex robots at the hardware/design level (though I touch on it implicitly in Sect. 5, when I mention the financial costs associated with getting a robot to approximate even just some of the bodily dimensions characteristic of human action and interaction). For feminist and phenomenological examinations of sex robots at the level of how they are designed at the hardware level see, for instance, Bergen (2020), Danaher (2019) and Devlin (2015). I want to thank an anonymous reviewer for encouraging me to make explicit my focus on the technology in its software dimension as well as the work done by others at the hardware level.

  3.

    In this sense, current sex-robot anticipations are departing from the “engineering-oriented” phase in philosophy of technology’s empirical turn, which, in Philip Brey’s words, “argued that the trouble with philosophy of technology was that it was not really about technology, and that its concern with social consequences made it forget about technology itself. … Philosophy of technology should endeavor to carefully describe and analyze the practices and products of engineering and in this way arrive at empirically informed, descriptively adequate philosophical theories of technology and engineering” (2010, 40). I want to thank an anonymous reviewer for encouraging me to situate my argument in relation to the empirical turn.

  4.

    As an anonymous reviewer helpfully pointed out, there is of course a conceptual, legal, and practical distinction between privacy (often understood as referring to private life) and data protection. It is beyond the scope of this paper to delve into those distinctions. What I do hope to bring out (Sect. 5) is that in the context of sex-robot usage, the link between sex-robots as data-mongering systems and sex-robots as systems that enter the most intimate spheres of a person’s private life is what makes them uniquely attractive for marketing purposes.

  5.

    As I discuss, some will dispute—for good reasons—that sex robots are expected to have a large socioeconomic impact. In Sect. 5, I offer a counterview to this stance.

  6.

    See https://realbotix.com/

  7.

    Admittedly, Levy discusses AI and other technological developments underpinning sex robot development in detail in his earlier book Robots Unlimited: Life in a Virtual Age (2005). But it matters that these technical specifics are largely absent in the book that has been so influential in setting the tone for how we talk about sex robots.

  8.

    Of course, people recognize that sex robots will be systems that depend on a variety of technologies and technological systems for their functioning. Danaher notes, for instance, that Realbotix’s “RealDoll’s AI will be cloud-based and will learn and adapt to its user’s preferences. This suggests a … significant and serious engagement with the latest AI technologies” (Danaher, 2017). But these technologies are typically merely touched on. My point here has been that by dwelling on them in a more sustained way, our perception of sex robots—the sorts of things they are and the ethical implications of developing them and embedding them into the lives of often particularly vulnerable people—will likely change.

  9.

    https://vpnoverview.com/privacy/devices/bluetooth/#:~:text=In%20most%20cases%2C%20Bluetooth%20is,have%20access%20to%20your%20information.&text=Bluetooth%20could%20disclose%20a%20great,phone%2C%20laptop%2C%20or%20computer

  10.

    For instance, the Foundation for Responsible Robotics [FRR] maintains that Matt McMullen, CEO of Realbotix, “made a persuasive argument [to them] for the therapeutic use of robots and dolls for a certain sector of the population” (22). When asked by the FRR whether “we will see the prices become more affordable for sex robots?” McMullen’s answer is brief: “Time will tell on this, but we are hoping that the hardware and software we are developing will be affordable” (33). The FRR’s interview with McMullen moves on to the next question, inviting McMullen to elaborate on the technological functionalities Realbotix is focused on: “The AI is the key to all that we are working on” (32). This, to me, seems like a missed opportunity to dig deeper. A company committed to deep learning AI as “the key” is a company equally committed to the availability of the big data that is needed to make Harmony appear as a good companion to its users.

  11.

    Similar concerns can be raised about the legal mechanisms of privacy by design, and transparency. For instance, as stated on https://gdpr-info.eu/issues/privacy-by-design/, “there is still uncertainty about what “Privacy by Design” means, and how one can implement it. … Legislation leaves completely open which exact protective measures are to be taken.” With regard to the requirement for transparency, article 12(7) of the GDPR states that “The information to be provided to data subjects pursuant to Articles 13 and 14 may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner a meaningful overview of the intended processing.” However, as Wachter (2018) has argued, when icons are deemed sufficient to provide users with a robust sense of what they are consenting to, “the intended level of sophistication for the information provided appears to be low. It is thus questionable whether the notification duties will provide data subject with meaningful understanding of the risks of machine learning” (448). She concludes that “GDPR standards urgently require further specification and implementation into the design and deployment of IoT technologies” (448).

  12.

    A 2020 survey conducted in Australia, Germany, Japan, New Zealand, the United Kingdom (U.K.) and the United States (U.S.) found that while in abstraction people increasingly express concerns about the privacy of their personal data, in practice, their behavior suggests otherwise. See https://www.prnewswire.com/news-releases/survey-shows-consumers-very-willing-to-trade-personal-data-for-financial-benefits-301106196.html

  13.

    A 2018 report from Deloitte on predictive modeling for insurance underwriting emphasizes that “data, is not subject to the Fair Credit Reporting Act (FCRA) requirements, and does not require signature authority by the insurance applicant to use it in a model.” The report is quick to brush any ethical concerns about this aside by equating societal acceptance with ethical acceptability: “We believe society has accepted this openness, not without hesitation, because on average it provides more of what we want, less of what we do not. In addition to consumer marketing applications, predictive modeling using third- party consumer data has also been accepted for property and casualty insurance underwriting.” https://www.soa.org/globalassets/assets/library/newsletters/product-development-news/2018/june/pro-2018-iss110-stehno-guszcza.pdf

  14.

    I want to thank an anonymous reviewer for encouraging me to explicate the exact type of harm I am concerned with.

References

  • Bergen, J. P. (2020). Love(rs) in the Making: Moral Subjectivity in the Face of Sexbots. Paladyn, 11(1), 284–300. https://doi.org/10.1515/pjbr-2020-0016

  • Brey, P. (2010, Winter). Philosophy of Technology after the Empirical Turn. Techné, 14(1), 36–48.

  • Brey, P. (2012). Anticipatory Ethics for Emerging Technologies. Nanoethics, 6, 1–13.

  • Calo, R. (2011). Robots and Privacy. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot Ethics: The Ethical and Social Implications of Robotics (pp. 187–202). MIT Press.

  • Canepari, Z., Cooper, D., & Cott, E. (2015). The Uncanny Lover. Robotica video. https://www.nytimes.com/video/technology/100000003731634/the-uncanny-lover.html

  • Coursey, K., Pirzchalski, S., McMullen, M., Lindroth, G., Furuushi, Y., et al. (2019). Living with Harmony: A Personal Companion System by Realbotix™. In Y. Zhou & M. H. Fischer (Eds.), AI Love You. Springer Nature.

  • Danaher, J. (2017). Should We Be Thinking About Sex Robots? In J. Danaher & N. McArthur (Eds.), Robot Sex: Social and Ethical Implications. MIT Press.

  • Danaher, J. (2019). Building Better Sex Robots: Lessons from Feminist Pornography. In Y. Zhou & M. Fischer (Eds.), AI Love You. Springer. https://doi.org/10.1007/978-3-030-19734-6_7

  • Dennett, D. (1984). Cognitive Wheels: The Frame Problem of AI. In Minds, Machines and Evolution; reprinted (1987) in The Robot’s Dilemma (pp. 41–74). Ablex Publishing.

  • Devlin, K. (2015). In Defense of Sex Robots: Why Trying to Ban Sex Robots Is Wrong. Retrieved March 13, 2021, from https://www.gold.ac.uk/news/kate-devlin-the-conversation-sex-robots/

  • Di Nucci, E. (forthcoming). Sexual Rights, Disability and Sex Robots. In J. Danaher & N. McArthur (Eds.), Sex Robots. MIT Press.

  • Ess, C. (2015). What’s Love Got to Do with It? Robots, Sexuality, and the Arts of Being Human. In M. Nørskov (Ed.), Social Robots: Boundaries, Potential, Challenges (pp. 57–79). Ashgate.

  • Eveleth, R. (2016). The Truth about Sex Robots. BBC Future. Retrieved December 3, 2020, from https://www.bbc.com/future/article/20160209-the-truth-about-sex-robots

  • Frank, L., & Nyholm, S. (2017). From Sex Robots to Love Robots: Is Mutual Love with a Robot Possible? In J. Danaher & N. McArthur (Eds.), Robot Sex: Social and Ethical Implications (pp. 219–244). MIT Press.

  • Heider, F., & Simmel, M. (1944). An Experimental Study of Apparent Behavior. American Journal of Psychology, 57(2), 243–259.

  • Hilgartner, S. (2015). Capturing the Imaginary: Vanguards, Visions, and the Synthetic Biology Revolution. In Science and Democracy: Making Knowledge and Making Power in the Biosciences and Beyond (pp. 33–55). Routledge.

  • Jasanoff, S. (2015). Future Imperfect: Science, Technology, and the Imaginations of Modernity. In S. Jasanoff & S.-H. Kim (Eds.), Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power. University Press Scholarship Online.

  • Johnson, D. G., & Verdicchio, M. (2020). Constructing the Meaning of Humanoid Sex Robots. International Journal of Social Robotics, 12, 415–424. https://doi.org/10.1007/s12369-019-00586-z

  • Kaye, L. (2016, February 10). Challenging Sex Robots and the Brutal Dehumanisation of Women. Campaign Against Sex Robots. Archived from the original on December 4, 2016.

  • Kelleher, J. D. (2019). Deep Learning. MIT Press.

  • Kelleher, J. D., & Tierney, B. (2018). Data Science. MIT Press.

  • Levy, D. (2005). Robots Unlimited: Life in a Virtual Age. A K Peters/CRC Press. https://doi.org/10.1201/b10697

  • Levy, D. (2009). Love and Sex with Robots: The Evolution of Human-Robot Relationships. Harper Collins.

  • Lugano, G., Hudák, M., Ivančo, M., & Loveček, T. (2019). From the Mind to the Cloud: Personal Data in the Age of the Internet of Things. In Y. Zhou & M. H. Fischer (Eds.), AI Love You. Springer Nature Switzerland.

  • McClelland, T. R. (2017, Summer and Autumn). Confronting Emerging New Technology: The Case of the Sexbots. The Journal of Mind and Behavior, 38(3), 247–270.

  • Nye, D. E. (2006). Technology Matters: Questions to Live With. MIT Press.

  • Nyholm, S. (2020). Humans and Robots: Ethics, Agency, and Anthropomorphism. Rowman & Littlefield.

  • Scheutz, M. (2011). The Inherent Dangers of Unidirectional Emotional Bonds Between Humans and Social Robots. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot Ethics: The Ethical and Social Implications of Robotics (pp. 187–202). MIT Press.

  • Sharkey, N., van Wynsberghe, A., Robbins, S., & Hancock, E. (2017). Our Sexual Future with Robots: A Foundation for Responsible Robotics Consultation Report. Retrieved March 13, 2021, from https://responsible-robotics-myxf6pn3xr.netdna-ssl.com/wp-content/uploads/2017/11/FRR-Consultation-Report-Our-Sexual-Future-with-robots-1-1.pdf

  • Sullins, J. P. (2012). Robots, Love, and Sex: The Ethics of Building a Love Machine. IEEE Transactions on Affective Computing, 3(4), 398–409. https://doi.org/10.1109/T-AFFC.2012.31

  • Tibbals, C. (2016). Sex Robots Misquoting & Reason #74,193 I Only Do Written Interviews | Dr. Chauntelle Tibbals. Retrieved September 15, 2020.

  • Turkle, S. (2004). Whither Psychoanalysis in the Computer Culture? Psychoanalytic Psychology, 21(1), 16–30.

  • Van de Poel, I. (2016). An Ethical Framework for Evaluating Experimental Technology. Science and Engineering Ethics, 22(3), 667–686.

  • Van de Poel, I. (2020). Three Philosophical Perspectives on the Relation Between Technology and Society, and How They Affect the Current Debate About Artificial Intelligence. Human Affairs, 30, 499–511.

  • Van de Poel, I., & Royakkers, L. (2011). Ethics, Technology, and Engineering: An Introduction. Wiley-Blackwell. http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=510116

  • Van Grunsven, J. (2020). Perceiving ‘Other’ Minds: Autism, 4E Cognition, and the Idea of Neurodiversity. The Journal of Consciousness Studies, 27(7–8), 115–143.

  • Van Grunsven, J., & van Wynsberghe, A. (2019). A Semblance of Aliveness: How the Peculiar Embodiment of Sex Robots Will Matter. Techné: Research in Philosophy and Technology, 23(3), 290–317. https://doi.org/10.5840/techne20191125105

  • Wachter, S. (2018). Normative Challenges of Identification in the Internet of Things: Privacy, Profiling, Discrimination, and the GDPR. Computer Law & Security Review, 34, 436–449.

  • Weinberg, A. M. (1991). Can Technology Replace Social Engineering? In W. B. Thompson (Ed.), Controlling Technology: Contemporary Issues (pp. 41–48). Prometheus Books.

  • Whitby, B. (2011). Do You Want a Robot Lover? In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot Ethics: The Ethical and Social Implications of Robotics (pp. 233–249). MIT Press. ISBN 9780262016667.


Author information

Correspondence to Janna Van Grunsven.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG


Cite this chapter

Van Grunsven, J. (2022). Anticipating Sex Robots: A Critique of the Sociotechnical Vanguard Vision of Sex Robots as ‘Good Companions’. In: Terrone, E., Tripodi, V. (eds) Being and Value in Technology. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-88793-3_4
