
Nudge Nudge, Wink Wink: Sex Robots as Social Influencers

  • Chapter

Sex Robots

Part of the book series: Philosophical Studies in Contemporary Culture (PSCC, volume 28)

Abstract

It is likely that sex robots will exist in the near future, making the effect they might have on human relationships a pressing concern. In this future world, we can imagine sex robots shaping our personal and social relationships through their unique access to, and potential for influencing, our most intimate of behaviours. We investigate whether they might be employed to influence social behaviours in a positive way. The paper begins with an account of the state of the art, acknowledges powerful feminist criticisms that have been made of sex robots, and evaluates suggestions that it might be possible to design sex robots which do not raise these concerns and which might even work to influence social behaviours in a positive way. It then outlines a number of ways that sex robots might be used to educate, “nudge”, and influence people in positive ways. It defends the idea that it would be ethical to use sex robots to promote socially positive behaviours — behaviours that benefit others and improve social cohesion, such as fostering respect and empathy for persons — but not to promote commercial products for parochial interests. We argue that the former project could advance individual and social welfare, while preserving personal autonomy — a minimum requirement of which is the ability to make informed decisions — whereas the latter depends on a lack of transparency and democratic (public) control for its success, targets the vulnerability of the user to achieve its ends, and reinforces the problematic symbolism of negatively gendered sexuality. If sex robot design and application meet public requirements of transparency — enabling informed consent and reflective decision making — and democratic oversight — promoting accountability and the sharing of power with the public — it is conceivable that sex robots might assist, rather than harm, society.


Notes

  1.

    https://campaignagainstsexrobots.org/

  2.

    See Kirby, Forlizzi, and Simmons (2010, p. 323) for an explanation of this concept.

  3.

    See, for example, Staub (1978); Bierhoff (2002).

  4.

    Schmidt (2017) speaks at length of the importance of democratic control and transparency to personal autonomy. He also elaborates on why these two conditions are necessary if nudges are to be morally defensible against the charge that they exert uncontrolled influence over agents.

  5.

    We follow the default in the literature and expect sex robots will predominantly be presented as female and their users will be men. This reflects the current state of the market, which is primarily aimed at meeting the expectations of heterosexual males. Preliminary research also reports that men are more likely to consider the use of sex robots as appropriate (Scheutz and Arnold 2016, 2017).

  6.

    Currently, despite the hype, the “sex robots” commercially available are little more than “tricked-out” sex dolls. TrueCompanion.com have previously claimed to sell sex robots that can actively participate in conversation and respond to your touch. See http://www.truecompanion.com

  7.

    Others, including David Levy (2008), Frank and Nyholm (2017), and Danaher (2017b) provide a comparable list of sex robot features, including being functionally autonomous, adaptive to their environment through artificial intelligence, and capable of learning. They also consider sex robots with this range of capabilities a near-future reality.

  8.

    Kate Darling (2016, pp. 214–215) defines a social robot as “a physically embodied, autonomous [functionally] agent that communicates and interacts with humans on a social level.”

  9.

    Coeckelbergh (2010) builds on this position and outlines how sex robots might use “vulnerability mirroring” to encourage the forming of human-robot bonds.

  10.

    See Frank and Nyholm (2017) on the literature dealing with the ethics of sex robots.

  11.

    See https://campaignagainstsexrobots.org/

  12.

    See David Levy (2008).

  13.

    Frank and Nyholm (2017) extend this argument and connect Richardson’s concern to the further problem of consent.

  14.

    While Sparrow is focused on the representation of women, he also acknowledges that similarly representing men would be morally problematic.

  15.

    See Borenstein and Arkin (2017, p. 500), who detail these capabilities.

  16.

    Emerging communications technologies have become an important site for social interaction and the maintenance of social relationships (Prot et al. 2015; Rideout, Foehr, and Roberts 2010). In fact, Nicole Bluett-Boyd and her colleagues (2013, p. ix) assert that they fulfil an essential function in socialisation, “creating a space for the exploration and construction of the social self”. On socialisation, see Grusec and Hastings (2015, p. xi); Maccoby (2015, p. 3).

  17.

    There is empirical evidence that people do recognise a range of emotional expressions in robots, and that this shapes their subsequent behaviour (Kirby, Forlizzi, and Simmons 2010).

  18.

    Affective computing is “computing that relates to, arises from, or deliberately influences emotion or other affective phenomena … [and] includes giving a computer the ability … to respond intelligently to human emotion …” (Picard 1997, p. 3).

  19.

    Under experimental conditions, Ham and Midden (2014) observed robots utilising affect to alter energy consumption patterns in humans, while Moshkina (2012) reports on improved compliance in emergency scenarios.

  20.

    Unless, of course, your partner is your therapist. Informal sex therapy with one’s intimate partner will remain available, but sex robots could provide an alternative that some people will find more appealing than therapy sessions exclusively with a partner or with a qualified sex therapist.

  21.

    For an example from the “persuasive technology” literature that utilises apps, see Toscos et al. (2006).

  22.

    The pop-cultural example of the “Monica the Sex Teacher” episode of the television show Friends, where Chandler is taught how to satisfy his partner sexually through a system of numbered erogenous zones (1–7) and patterns of attention (1–2–1–2–3–5…), is surprisingly revelatory for current purposes.

  23.

    Scheutz and Arnold (2016) suggest their methodological survey of public thinking about sex and robots provides an empirical examination of conceptions of sex robots that is lacking in existing scholarship on this topic. An example of the questions asked is: “Would it be appropriate to use sex robots to demonstrate forms of sexual harassment for training and prevention?” Such questions were posed to 100 participants in the survey, who were U.S. subjects recruited through Amazon Mechanical Turk, and were categorised based on the independent variables of age and gender. The original findings of the 2016 survey were confirmed by a 2017 survey (Scheutz and Arnold 2017) with 198 similarly sourced participants. A limitation of this study, as Scheutz and Arnold (2016, p. 357) acknowledge, is that “actual sexual interaction with a robot” may alter attitudes toward appropriate characteristics and applications of future sex robots.

  24.

    The two caveats are understood by Thaler and Sunstein (2008) to preserve liberty, thereby effectively tethering the use of nudges to a form of “libertarian paternalism” and beneficence (see Nagatsu 2015).

  25.

    A classic example of nudging is product placement in retail settings, where the positioning of goods influences consumer choices (Borenstein and Arkin 2017; Sunstein 2015). However, the use of nudging in marketing and as a retail strategy is contentious, as these applications risk undermining personal welfare, dignity, and autonomy (see Sunstein 2015, p. 417).

  26.

    See Langner, Hennigs, and Wiedmann (2013) for further elaboration of the social category of influencer. Note that when speaking of “social influencers”, we mean agents that deliberately affect the behaviour of a target audience.

  27.

    This is an obvious alternative to the dilemma of consent discussed in Sect. 4.1, but runs contrary to the commercial imperatives derived from the market for sex robots.

  28.

    It would be morally problematic to argue, in relation to human–human sexual interactions, that by withholding consent women expose themselves to an increased risk of being raped, and that they should therefore be always-consenting. We adopt a similar position when thinking about the design of sex robots.

  29.

    The “yes” model requires affirmative consent, and does not support the interpretation that in the absence of a “no” sex is consensual, while the “negotiation model” requires a communicative exchange between partners that indicates their shared interest in having sex (Frank and Nyholm 2017, p. 318; see also Anderson 2005). It is also important to understand that while consent is necessary for ethical sex, it is not sufficient (see Frank and Nyholm 2017, p. 319). One of the authors (Sparrow) remains sceptical that widespread adoption of sex robots under existing social conditions would be compatible with genuine social equality between the sexes.

  30.

    Marketeers already exploit our propensity to anthropomorphise objects in our environment to persuade buyers (Złotowski et al. 2015, p. 347), and the use of robots for this purpose is an obvious extension of the practice.

  31.

    An example of the conflict between the commercial imperatives of marketeers and the social good is the public health concern attending the advertising of gambling products in environments, such as sporting events, that create an illusion of banality and of social acceptance of the product. See Thomas et al. (2016).

  32.

    See Cialdini and Goldstein (2004, pp. 598–599) for an explication of these aspects of social influencing.

  33.

    Another relevant question for our investigation, which we cannot hope to address within the confines of this chapter, is whether “the act that is supposed to result from the nudge [is] likely to be beneficial to the person?” (Borenstein and Arkin 2017). This condition relates to the concept of beneficence introduced earlier, which is central to the socialisation scenarios but not to market influencing.

  34.

    See Borenstein and Arkin (2016, p. 36), who further develop this point. Robotic interventions of this type are a special case of the more general concern with the ethics of nudging, which centres on third-party intervention and paternalism undermining personal independence (see Sunstein 2015; Schmidt 2017). Similar worries are relevant to the deployment of market influencers, who are also managed by third-party interests.

  35.

    See Borenstein and Arkin (2016, 2017), who discuss this in more detail.

References

  • Adoun, M., A.S. Djossa, M.P. Gagnon, G. Godin, N. Tremblay, M.M. Njoya, S. Ratté, H. Gagnon, J. Côté, J. Miranda, and B.A. Ly. 2017. Information and communication technologies (ICT) for promoting sexual and reproductive health (SRH) and preventing HIV infection in adolescents and young adults. The Cochrane Database of Systematic Reviews 2017 (2).

  • Bierhoff, H.W. 2002. Prosocial behaviour. Hove: Psychology Press.

  • Bluett-Boyd, N., B. Fileborn, A. Quadara, and A.D. Moore. 2013. The role of emerging communication technologies in experiences of sexual violence: A new legal frontier? Journal of the Home Economics Institute of Australia 20 (2): 25.

  • Borenstein, J., and R. Arkin. 2016. Robotic nudges: The ethics of engineering a more socially just human being. Science and Engineering Ethics 22 (1): 31–46.

  • Borenstein, J., and R.C. Arkin. 2017. Nudging for good: Robots and the ethical appropriateness of nurturing empathy and charitable behavior. AI & Society 32 (4): 499–507.

  • Brooks, A.G., and R.C. Arkin. 2007. Behavioral overlays for non-verbal communication expression on a humanoid robot. Autonomous Robots 22 (1): 55–74.

  • Chu, S.K.W., A.C.M. Kwan, R. Reynolds, R.R. Mellecker, F. Tam, G. Lee, A. Hong, and C.Y. Leung. 2015. Promoting sex education among teenagers through an interactive game: Reasons for success and implications. Games for Health Journal 4 (3): 168–174.

  • Cialdini, R.B., and N.J. Goldstein. 2004. Social influence: Compliance and conformity. Annual Review of Psychology 55: 591–621.

  • Cialdini, R.B., and V. Griskevicius. 2010. Social influence. In Advanced social psychology, eds. R.F. Baumeister and E.J. Finkel, 385–417. New York: Oxford University Press.

  • Coeckelbergh, M. 2010. Artificial companions: Empathy and vulnerability mirroring in human-robot relations. Studies in Ethics, Law, and Technology 4 (3): 2.

  • Danaher, J. 2017a. Robotic rape and robotic child sexual abuse: Should they be criminalised? Criminal Law and Philosophy 11 (1): 71–95.

  • ———. 2017b. The symbolic-consequences argument in the sex robot debate. In Robot sex: Social and ethical implications, eds. J. Danaher and N. McArthur, 123–154. Cambridge, MA: MIT Press.

  • ———. 2019. Building better sex robots: Lessons from feminist pornography. In AI love you, eds. Y. Zhou and M. Fischer, 133–147. Dordrecht: Springer.

  • Darling, K. 2016. Extending legal protection to social robots: The effects of anthropomorphism, empathy, and violent behavior towards robotic objects. In Robot law, eds. R. Calo, A.M. Froomkin, and I. Kerr. Cheltenham and Northampton: Edward Elgar Publishing.

  • Devlin, K. 2015. In defence of sex machines: Why trying to ban sex robots is wrong. The Conversation. [Online] Available: https://theconversation.com/in-defence-of-sex-machines-why-trying-to-ban-sex-robots-is-wrong-47641

  • Dicheva, D., C. Dichev, G. Agre, and G. Angelova. 2015. Gamification in education: A systematic mapping study. Educational Technology & Society 18 (3): 75–88.

  • Dreyfus, H. 2004. Nihilism on the information highway: Anonymity versus commitment in the present age. In Community in the digital age: Philosophy and practice, eds. A. Feenberg and D. Barney. Maryland: Rowman & Littlefield.

  • Duffy, B.R. 2003. Anthropomorphism and the social robot. Robotics and Autonomous Systems 42 (3–4): 177–190.

  • Fogg, B.J. 2003. Persuasive technology: Using computers to change what we think and do. San Francisco: Morgan Kaufmann Publishers.

  • Frank, L., and S. Nyholm. 2017. Robot sex and consent: Is consent to sex between a robot and a human conceivable, possible, and desirable? Artificial Intelligence and Law 25 (3): 305–323.

  • Fussell, S.R., S. Kiesler, L.D. Setlock, and V. Yew. 2008. How people anthropomorphize robots. In 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

  • Grusec, J.E., and P.D. Hastings. 2015. Handbook of socialization: Theory and research. 2nd ed. New York/London: The Guilford Press.

  • Gutiu, S. 2016. The roboticization of consent. In Robot law, eds. R. Calo, A.M. Froomkin, and I. Kerr. Cheltenham and Northampton: Edward Elgar Publishing.

  • Ham, J., and C.J.H. Midden. 2014. A persuasive robot to stimulate energy conservation: The influence of positive and negative social feedback and task similarity on energy-consumption behavior. International Journal of Social Robotics 6 (2): 163–171.

  • Hansen, P.G., and A.M. Jespersen. 2013. Nudge and the manipulation of choice: A framework for the responsible use of the nudge approach to behaviour change in public policy. European Journal of Risk Regulation 4 (1): 3–28.

  • Kirby, R., J. Forlizzi, and R. Simmons. 2010. Affective social robots. Robotics and Autonomous Systems 58 (3): 322–332.

  • Langner, S., N. Hennigs, and K.-P. Wiedmann. 2013. Social persuasion: Targeting social identities through social influencers. Journal of Consumer Marketing 30 (1): 31–49.

  • Levy, D. 2008. Love and sex with robots: The evolution of human-robot relationships. New York: Harper Perennial.

  • Maccoby, E.E. 2015. Historical overview of socialization research and theory. In Handbook of socialization: Theory and research, eds. J.E. Grusec and P.D. Hastings. New York/London: The Guilford Press.

  • Moshkina, L. 2012. Improving request compliance through robot affect. In Proceedings of the twenty-sixth AAAI conference on artificial intelligence. [Online] Available: https://www.aaai.org/ocs/index.php/AAAI/AAAI12/paper/view/5085/5368

  • Nagatsu, M. 2015. Social nudges: Their mechanisms and justification. Review of Philosophy and Psychology 6 (3): 481–494.

  • Noar, S.M., H.G. Black, and L.B. Pierce. 2009. Efficacy of computer technology-based HIV prevention interventions: A meta-analysis. AIDS 23 (1): 107–115.

  • Picard, R.W. 1997. Affective computing. Cambridge, MA: MIT Press.

  • Prot, S., C.A. Anderson, D.A. Gentile, W. Warburton, M. Saleem, C.L. Groves, and S.C. Brown. 2015. Media as agents of socialisation. In Handbook of socialization: Theory and research, eds. J.E. Grusec and P.D. Hastings. New York/London: The Guilford Press.

  • Richardson, K. 2015. The asymmetrical “relationship”: Parallels between prostitution and the development of sex robots. ACM SIGCAS Computers and Society 45 (3): 290–293.

  • Rideout, V.J., U.G. Foehr, and D.F. Roberts. 2010. Generation M2: Media in the lives of 8- to 18-year-olds. Menlo Park: Henry J. Kaiser Family Foundation.

  • Scheutz, M., and T. Arnold. 2016. Are we ready for sex robots? In Proceedings of the eleventh ACM/IEEE international conference on human-robot interaction, 351–358. [Online] Available: https://dl.acm.org/doi/10.5555/2906831.2906891

  • ———. 2017. Intimacy, bonding, and sex robots: Examining empirical results and exploring ethical ramifications. In Robot sex: Social and ethical implications, eds. J. Danaher and N. McArthur. Cambridge, MA: The MIT Press.

  • Schmidt, A.T. 2017. The power to nudge. American Political Science Review 111 (2): 404–417.

  • Schubert, C. 2017. Exploring the (behavioural) political economy of nudging. Journal of Institutional Economics 13 (3): 499–522.

  • Sparrow, R. 2017. Robots, rape, and representation. International Journal of Social Robotics 9 (4): 465–477.

  • Staub, E. 1978. Positive social behavior and morality: Social and personal influences. New York: Academic Press.

  • Sunstein, C.R. 2015. The ethics of nudging. Yale Journal on Regulation 32 (2): 413–450.

  • Tannenbaum, D., C.R. Fox, and T. Rogers. 2017. On the misplaced politics of behavioural policy interventions. Nature Human Behaviour 1 (7): 0130.

  • Thaler, R.H., and C.R. Sunstein. 2008. Nudge: Improving decisions about health, wealth, and happiness. New Haven: Yale University Press.

  • Thomas, S., H. Pitt, A. Bestman, M. Randle, M. Daube, and S. Pettigrew. 2016. Child and parent recall of gambling sponsorship in Australian sport. Melbourne: Victorian Responsible Gambling Foundation.

  • Toscos, T., A. Faber, Shunying A., and M.P. Gandhi. 2006. Chick clique: Persuasive technology to motivate teenage girls to exercise. In CHI 2006, Montreal, Canada, April 22–27, 2006.

  • Turkle, S. 2017. Alone together: Why we expect more from technology and less from each other. London: Hachette.

  • Złotowski, J., D. Proudfoot, K. Yogeeswaran, and C. Bartneck. 2015. Anthropomorphism: Opportunities and challenges in human–robot interaction. International Journal of Social Robotics 7 (3): 347–360.


Author information


Corresponding author

Correspondence to Mark Howard.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Howard, M., Sparrow, R. (2021). Nudge Nudge, Wink Wink: Sex Robots as Social Influencers. In: Fan, R., Cherry, M.J. (eds) Sex Robots. Philosophical Studies in Contemporary Culture, vol 28. Springer, Cham. https://doi.org/10.1007/978-3-030-82280-4_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-82280-4_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-82279-8

  • Online ISBN: 978-3-030-82280-4

  • eBook Packages: Social Sciences (R0)
