Abstract
We propose that virtue ethics can be used to address ethical issues central to discussions about sex robots. In particular, we argue that virtue ethics is well equipped to focus on the implications of sex robots for human moral character. Our evaluation develops in four steps. First, we present virtue ethics as a suitable framework for the evaluation of human–robot relationships. Second, we demonstrate the advantages of our virtue-ethical account of sex robots by comparing it to current instrumentalist approaches, and show how the former better captures the reciprocal interaction between robots and their users. Third, we examine how a virtue-ethical analysis of intimate human–robot relationships could inspire the design of robots that support the cultivation of virtues. We suggest that a sex robot equipped with a consent module could support the cultivation of compassion when used in supervised, therapeutic scenarios. Fourth, we discuss the ethical implications of our analysis for user autonomy and responsibility.
Notes
Other influential virtue ethical traditions originated with, for example, Confucius or Buddhism. For reasons of space, we shall restrict ourselves to a (neo-)Aristotelian account of virtue, but we suspect that the investigation of other virtue traditions could yield an interesting intercultural approach to the ethics of social robotics. See also [51].
Isaac Asimov’s famous laws of robotics, often cited as an illustration in the AI ethics literature, are modelled after deontological formulations of how one ought to act. They brilliantly showcase the inherent tension between deontological robotic directives and the potentially disastrous consequences that strict adherence to them might have.
It is worth noting that on Sparrow’s account one will have to bite the bullet and say that rape-play by consenting adults is morally wrong as well. Not everyone will be willing to accept this implication.
Obviously, the consent provided by a robot does not amount to legally binding consent, just as the rape of a robot would not constitute legal rape, for the simple reason that a robot is neither a legal person nor a sentient being. Hence, we are here discussing the implications of a robot behaving in a certain way, without necessarily implying the existence of human-like cognitive or emotional states, or of identical legal status.
Sparrow [47] finds it “much less plausible that sustaining kind and loving relationships with robots can be sufficient to make us virtuous” (p. 473). He acknowledges, however, that such a claim needs to be supported by an argument as to why virtues are to be held against a standard different from vices and that this is a topic for further discussion. We do not share his intuition, though we agree with his latter point and would furthermore like to add that more empirical data on how human–robot interaction influences human behaviour is needed—which is one of the motivations for the proposal in Sect. 4 of the present paper.
This also illustrates that robot sex is not, or at least need not always be, wrong. Claiming otherwise would be as extravagant as suggesting that masturbation is always wrong.
The Swedish science-fiction television drama Äkta människor (Real humans, 2012) depicts an example of this when the relationship between Therese (Camilla Larsson) and her husband turns sour because he grows jealous of her ‘hubot’—a humanoid robot capable of exactly the functions Levy discusses. This depiction is fictional of course, but the force of the story at least casts doubt on any outright dismissal of the possibility that humans will become jealous of robots.
On the other hand, one might argue, as Sparrow does, that a non-consenting robot could facilitate (the representation of) rape scenarios even further if the human partner ignores the robot’s withholding of consent. We do not have a solution for that problem here (although, for example, a simple ‘complete close-and-shutdown’ routine might be an option), but it is a main reason why we later suggest testing this kind of human–robot interaction in a therapeutic setting first, as testing under supervision may yield new insights into how to deal with issues such as these. In any case, we are not convinced that this argument is sufficient grounds not to investigate further the potential benefits of consenting robots.
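The ‘complete close-and-shutdown’ routine mentioned in this note could, in its most minimal form, be thought of as a small state machine: once the robot withdraws consent, any further interaction attempt triggers a full shutdown rather than continued compliance. The following sketch is purely illustrative; all names (`ConsentModule`, `ConsentState`, and so on) are hypothetical and are not drawn from any existing system or from the paper itself.

```python
from enum import Enum, auto


class ConsentState(Enum):
    """Hypothetical consent states for an illustrative consent module."""
    CONSENTING = auto()
    WITHDRAWN = auto()
    SHUTDOWN = auto()


class ConsentModule:
    """Illustrative sketch of a 'complete close-and-shutdown' routine.

    Assumption (not from the source): the robot enforces withdrawn
    consent by shutting down entirely when an interaction attempt
    follows a withdrawal, rather than allowing the scenario to continue.
    """

    def __init__(self) -> None:
        self.state = ConsentState.CONSENTING

    def withdraw_consent(self) -> None:
        # The robot signals that consent is no longer given.
        self.state = ConsentState.WITHDRAWN

    def register_interaction_attempt(self) -> bool:
        # Continuing after withdrawal triggers the shutdown routine;
        # returns whether the interaction may proceed.
        if self.state is ConsentState.WITHDRAWN:
            self._close_and_shutdown()
            return False
        return self.state is ConsentState.CONSENTING

    def _close_and_shutdown(self) -> None:
        # Complete close-and-shutdown: no further interaction is possible.
        self.state = ConsentState.SHUTDOWN
```

The point of the sketch is only that such a routine makes ignoring withdrawn consent self-defeating by design; whether this is desirable, and how it interacts with the therapeutic setting proposed in Sect. 4, remains an open question.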
In the spirit of virtue ethics, one could consider Dependent Personality Disorder (DPD) to be the other extreme on the compassion spectrum [6]:
They are willing to submit to what others want, even if the demands are unreasonable. Their need to maintain an important bond will often result in imbalanced or distorted relationships. They may make extraordinary self-sacrifices or tolerate verbal, physical, or sexual abuse.
It would be interesting to investigate how love and sex robots could be relevant for training and therapy for members of this group as well.
References
Abbey A (1991) Acquaintance rape and alcohol consumption on college campuses: how are they linked? J Am Coll Health 39(4):165–169. https://doi.org/10.1080/07448481.1991.9936229
Abbey A (1991) Misperception as an antecedent of acquaintance rape: a consequence of ambiguity in communication between men and women. In: Parrot A, Bechhofer L (eds) Acquaintance rape: the hidden crime. Academic Press, New York, pp 96–111
Abney K (2012) Robotics, ethical theory, and metaethics: a guide for the perplexed. In: Lin P, Abney K, Bekey GA (eds) Robot ethics: the ethical and social implications of robotics. MIT Press, Cambridge, pp 35–52
Adams-Curtis LE, Forbes GB (2004) College women’s experiences of sexual coercion. Trauma Violence Abus 5(2):91–122. https://doi.org/10.1177/1524838003262331
Allen C, Varner G, Zinser J (2000) Prolegomena to any future artificial moral agent. J Exp Theor Artif Intell 12(3):251–261. https://doi.org/10.1080/09528130050111428
American Psychiatric Association (2013) Personality disorders. In: Diagnostic and statistical manual of mental disorders, 5th edn. American Psychiatric Association, Philadelphia. https://doi.org/10.1176/appi.books.9780890425596.dsm18
Article 36: Killing by machine: key issues for understanding meaningful human control (2015). http://www.article36.org/autonomous-weapons/killing-by-machine-key-issues-for-understanding-meaningful-human-control/
Banyard VL, Ward S, Cohn ES, Plante EG, Moorhead C, Walsh W (2007) Unwanted sexual contact on campus: a comparison of women’s and men’s experiences. Violence Vict 22(1):52–70
Bedaf S, Draper H, Gelderblom GJ, Sorell T, de Witte L (2016) Can a service robot which supports independent living of older people disobey a command? The views of older people, informal carers and professional caregivers on the acceptability of robots. Int J Soc Robot 8(3):409–420. https://doi.org/10.1007/s12369-016-0336-0
Bender DS (2012) Mirror, mirror on the wall: reflecting on narcissism. J Clin Psychol 68(8):877–885. https://doi.org/10.1002/jclp.21892
Björling EA, Rose E, Davidson A, Ren R, Wong D (2019) Can we keep him forever? Teens’ engagement and desire for emotional connection with a social robot. Int J Soc Robot. https://doi.org/10.1007/s12369-019-00539-6
Borges AM, Banyard VL, Moynihan MM (2008) Clarifying consent: primary prevention of sexual assault on a college campus. J Prev Interv Community 36(1–2):75–88. https://doi.org/10.1080/10852350802022324
Breazeal CL (ed) (2002) Designing sociable robots. MIT Press, Cambridge, MA
Byers ES, Heinlein L (1989) Predicting initiations and refusals of sexual activities in married and cohabiting couples. J Sex Res 26:210–231
Cappuccio ML, Peeters A, McDonald W (2019) Sympathy for Dolores: moral consideration for robots based on virtue and recognition. Philos Technol. https://doi.org/10.1007/s13347-019-0341-y
Clark A (2007) Soft selves and ecological control. In: Ross D, Spurrett D, Kincaid H, Stephens GL (eds) Distributed cognition and the will: individual volition and social context. MIT Press, New York, pp 101–122
Coeckelbergh M (2012) Growing moral relations: critique of moral status ascription. Palgrave, Basingstoke
Danaher J, McArthur N (eds) (2017) Robot sex. Social and ethical implications. MIT Press, Cambridge
Danielson P (ed) (1992) Artificial morality: virtuous robots for virtual games. Routledge, London
Deng B (2015) Machine ethics: the robot’s dilemma. Nature 523(7558):24–26. https://doi.org/10.1038/523024a
Dhawan N, Kunik ME, Oldham J, Coverdale J (2010) Prevalence and treatment of narcissistic personality disorder in the community: a systematic review. Compr Psychiatry 51(4):333–339. https://doi.org/10.1016/j.comppsych.2009.09.003
Di Paolo EA, Buhrmann T, Barandiaran XE (2017) Sensorimotor life: an enactive proposal. Oxford University Press, Oxford. https://doi.org/10.1093/acprof:oso/9780198786849.001.0001
Fernandes K, Cardoso JS, Astrup BS (2018) A deep learning approach for the forensic evaluation of sexual assault. Pattern Anal Appl 21(3):629–640. https://doi.org/10.1007/s10044-018-0694-3
Fischer JM, Ravizza M (1998) Responsibility and control: a theory of moral responsibility. Cambridge University Press, Cambridge
Floridi L, Sanders JW (2004) On the morality of artificial agents. Mind Mach 14(3):349–379. https://doi.org/10.1023/B:MIND.0000035461.63578.9d
Fröding BEE (2011) Cognitive enhancement, virtue ethics and the good life. Neuroethics 4(3):223–234. https://doi.org/10.1007/s12152-010-9092-2
Gips J (1995) Towards the ethical robot. In: Ford KM (ed) Android epistemology. MIT Press, Cambridge, pp 243–252
Goetz JL, Keltner D, Simon-Thomas E (2010) Compassion: an evolutionary analysis and empirical review. Psychol Bull 136(3):351–374. https://doi.org/10.1037/a0018807
Güçlütürk Y, Güçlü U, Baró X, Escalante HJ, Guyon I, Escalera S, van Gerven MAJ, van Lier R (2017) Multimodal first impression analysis with deep residual networks. IEEE Trans Affect Comput. https://doi.org/10.1109/TAFFC.2017.2751469
Humphreys TP (2004) Understanding sexual consent: an empirical investigation of the normative script for young heterosexual adults. In: Cowling M, Reynolds P (eds) Making sense of sexual consent. Ashgate, Farnham
Janssen JH, Tacken P, de Vries JGJ, van den Broek EL, Westerink JH, Haselager P, IJsselsteijn WA (2013) Machines outperform laypersons in recognizing emotions elicited by autobiographical recollection. Hum Comput Interact 28(6):479–517. https://doi.org/10.1080/07370024.2012.755421
Levy D (2007) Intimate relationships with artificial partners. Maastricht University, Maastricht
Levy D (2007) Love and sex with robots: the evolution of human–robot relationships. Harper Perennial, New York
Levy D (2012) The ethics of robot prostitutes. In: Lin P, Abney K, Bekey GA (eds) Robot ethics: the ethical and social implications of robotics. MIT Press, Cambridge, MA, pp 223–232
Lim GY, Roloff ME (1999) Attributing sexual consent. J Appl Commun Res 27(1):1–23. https://doi.org/10.1080/00909889909365521
Miranda JA, Canabal MF, Portela García M, Lopez-Ongil C (2018) Embedded emotion recognition: autonomous multimodal affective internet of things. In: Palumbo F, Pilato C, Pulina L, Sau C (eds) Proceedings of the cyber-physical systems workshop 2018, vol 2208. Alghero, Italy, pp 22–29
Richardson K (2016) Sex robot matters: slavery, the prostituted, and the rights of machines. IEEE Technol Soc Mag 35(2):46–53. https://doi.org/10.1109/MTS.2016.2554421
Rituerto-González E, Mínguez-Sánchez A, Gallardo-Antolín A, Peláez-Moreno C (2019) Data augmentation for speaker identification under stress conditions to combat gender-based violence. Appl Sci 9(11):2298. https://doi.org/10.3390/app9112298
Scheutz M (2012) The inherent dangers of unidirectional emotional bonds between humans and social robots. In: Lin P, Abney K, Bekey GA (eds) Robot ethics: the ethical and social implications of robotics. MIT Press, Cambridge, pp 205–222
Scheutz M, Arnold T (2017) Intimacy, bonding, and sex robots: examining empirical results and exploring ethical ramifications. In: Danaher J, McArthur N (eds) Robot sex. Social and ethical implications. MIT Press, Cambridge, pp 247–260
Sharkey N (2008) The ethical frontiers of robotics. Science 322(5909):1800–1801. https://doi.org/10.1126/science.1164582
Sharkey N, van Wynsberghe A, Robbins S, Hancock E (eds) (2017) Our sexual future with robots: a foundation for responsible robotics consultation report. Foundation for Responsible Robotics
Shirado H, Christakis N (2017) Locally noisy autonomous agents improve global human coordination in network experiments. Nature 545(7654):370–374. https://doi.org/10.1038/nature22332
Santoni de Sio F, van den Hoven J (2018) Meaningful human control over autonomous systems: a philosophical account. Front Robot AI 5:1–14. https://doi.org/10.3389/frobt.2018.00015
Sparrow R (2002) The march of the robot dogs. Ethics Inf Technol 4(4):305–318. https://doi.org/10.1023/A:1021386708994
Sparrow R (2016) Kicking a robot dog. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI), IEEE, p 229, https://doi.org/10.1109/HRI.2016.7451756
Sparrow R (2017) Robots, rape, and representation. Int J Soc Robot 9(4):465–477. https://doi.org/10.1007/s12369-017-0413-z
Strikwerda L (2017) Legal and moral implications of child sex robots. In: Danaher J, McArthur N (eds) Robot sex. Social and ethical implications. MIT Press, Cambridge, pp 133–152
Tonkens R (2012) Out of character: on the creation of virtuous machines. Ethics Inf Technol 14(2):137–149. https://doi.org/10.1007/s10676-012-9290-1
Traeger M, Sebo S, Jung M, Scassellati B, Christakis N (2019) Vulnerable robots positively shape human conversational dynamics in a human–robot team. Presented at Center for Empirical Research on Stratification and Inequality Spring 2019 Workshop at Yale University on January 31 (Unpublished manuscript)
Vallor S (2016) Technology and the virtues: a philosophical guide to a future worth wanting. Oxford University Press, Oxford
Varela FJ, Thompson E, Rosch E (1991) The embodied mind: cognitive science and human experience. MIT Press, Cambridge
Verbeek PP (2011) Moralizing technology: understanding and designing the morality of things. University of Chicago Press, Chicago
Wallach W, Allen C (2009) Moral machines: teaching robots right from wrong. Oxford University Press, Oxford
Winfield AFT, Blum C, Liu W (2014) Towards an ethical robot: internal models, consequences and ethical action selection. In: Mistry M, Leonardis A, Witkowski M, Melhuish C (eds) Advances in autonomous robotics systems. Lecture notes in computer science, vol 8717. Springer, New York, pp 85–96. https://doi.org/10.1007/978-3-319-10401-0_8
Yamaji Y, Miyake T, Yoshiike Y, De Silva PRS, Okada M (2011) STB: child-dependent sociable trash box. Int J Soc Robot 3(4):359–370. https://doi.org/10.1007/s12369-011-0114-y
Acknowledgements
We dedicate this paper to our late colleague, teacher, and friend Louis Vuurpijl, who, with infectious enthusiasm, guided many students in their first steps into the field of robotics. Many thanks to Nick Brancazio, Miguel Segundo-Ortin, and several anonymous reviewers for their feedback on a previous draft of this paper.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Cite this article
Peeters, A., Haselager, P. Designing Virtuous Sex Robots. Int J of Soc Robotics 13, 55–66 (2021). https://doi.org/10.1007/s12369-019-00592-1