
Should We Develop Empathy for Social Robots?

Chapter in the volume Sex Robots

Part of the book series: Philosophical Studies in Contemporary Culture (PSCC, volume 28)

Abstract

Artificial intelligence (AI) technology and human-like robots raise key questions about the appropriate relationships between humans and such social robots. These artificial, AI-infused creations are programmed to mimic and manipulate human emotions so as to provide affective companionship for humans. How should we treat them: as mere machines, as fellow human beings, or as some third thing set between the two? This chapter explores the ethical implications of affective, empathic connections with robots. Is it possible to create a human-robot moral community? Section one offers methodological reflections on the empathy that humans often feel for robots. Empathy does not begin and end with human beings: persons often have such feelings for pets, for example, and in the future human empathy will very probably extend to embrace highly intelligent, seemingly empathetic robots. Against this background, one urgent issue arises: morally speaking, should we develop empathy for social robots? This chapter critically explores different ways that humans might empathize with robot companions, examining the social and moral influence of various empathetic relationships on the human agents involved. I will argue from a Confucian point of view that, given the dramatic difference between human emotions and programmed robot emotions, we should be cautious of attempts to build love machines that would distract us from the more valuable pursuit of empathy and sympathy with our fellow humans.


Notes

  1.

    Both of the thought experiments are designed to test how intelligent an artificial system is rather than how intelligent it appears (Coeckelbergh 2009).

  2.

    In the context of contemporary ethics, the term “sympathy” is usually reserved for social and moral emotions such as compassion. Sympathy for someone must involve caring for (and about) him, while “empathy” in the broad sense merely denotes a mechanism of feeling with the other.

  3.

    To some extent, the human capacity to empathize with robots is an established finding, confirmed in many experiments.

  4.

    “Ek-sist” is a phenomenological term for “exist”, emphasizing the unique way of human existence, a unique kind of being in which we stand out among the beings with which we deal. As Heidegger (1991, p. 99) puts it, an emotion is like an atmosphere, “by force of which and in accordance with which we are always already lifted beyond ourselves into beings as a whole, which in this or that way matters or not matters to us.”

  5.

    I borrow the phrase “poor in world” from Heidegger’s works. Heidegger originally uses it to describe a mode of being ascribed to animals: an animal behaves towards objects and others, but has no knowledge of its relation to the world. His arguments can likewise be applied to robots (see, for example, Coeckelbergh 2011).

  6.

    In this essay, I will use “sympathetic empathy” for short. The distinction between projective empathy and sympathetic empathy also interests many phenomenologists. For example, Husserl draws a line between what he calls “genuine sympathy” and empathy as a suggestive infusion of feeling in the Humean sense. He writes: “Pity does not mean suffering from the same thing as the Other [does], but [it means] pitying him, suffering from the fact that he is suffering and because he is suffering”. To illustrate this point, Husserl gives an example: “When I pity the Other due to the death of his father, I do not directly suffer from the death of his father, but from the fact that he has lost his father” (Husserl 2004, p. 194). We are thereby affected and unsettled by the situation as it is for the other, to the extent that our feeling contains a tendency to act on her behalf, to care for her wants and needs.

  7.

    Love for real persons is always intertwined with emotions such as sympathy, compassion, tenderness, sociability, and benign concern. As Merleau-Ponty says, “to love is inevitably to enter into an undivided situation with another” (Merleau-Ponty 1964, p. 154), so that what happens to the loved one equally alarms and affects the lover. If one really loves another, he will suffer from her suffering.

References

  • Coeckelbergh, M. 2009. Personal robots, appearance, and the good: A methodological reflection on roboethics. International Journal of Social Robotics 1 (3): 217–221.

  • ———. 2011. Humans, animals, and robots: A phenomenological approach to human-robot relations. International Journal of Social Robotics 3: 197–204.

  • ———. 2018. Why care about robots? Empathy, moral standing, and the language of suffering. Journal of Philosophy & Science 20: 141–158.

  • Darwall, S. 1998. Empathy, sympathy, care. Philosophical Studies 89: 261–282.

  • Hauskeller, M. 2017. Automatic sweethearts for transhumanists. In Robot sex: Social and ethical implications, eds. John Danaher and Neil McArthur, 229–248. Cambridge: The MIT Press.

  • Heidegger, M. 1962. Being and time. Trans. John Macquarrie and Edward Robinson. New York: Harper and Row.

  • ———. 1991. Nietzsche. Trans. David Farrell Krell. New York: Harper Collins Publishers.

  • von Hildebrand, Dietrich. 2009. The nature of love. South Bend: St. Augustine’s Press.

  • Hume, D. 1999. A treatise of human nature. Beijing: China Social Sciences Publishing House.

  • Husserl, E. 2004. Einleitung in die Ethik. Vorlesungen Sommersemester 1920/1924 (Husserliana vol. XXXVII), ed. Henning Peucker. Dordrecht, Boston, London: Kluwer Academic Publishers.

  • James, W. 1909. The meaning of truth: A sequel to ‘Pragmatism’. New York: Longmans, Green and Co.

  • Kant, I. 1997. Lectures on ethics. New York: Cambridge University Press.

  • Kern, I. 2012. Mengzi (Mencius), Adam Smith and Edmund Husserl on sympathy and conscience. In Intersubjectivity and objectivity in Adam Smith and Edmund Husserl, eds. Christel Fricke and Dagfinn Føllesdal, 139–170. Dordrecht: Ontos Verlag.

  • Levy, D. 2007. Love and sex with robots: The evolution of human-robot relationships. New York: Harper.

  • Mencius. 1970. Mencius. Trans. with an introduction by D.C. Lau. New York: Penguin Books.

  • Merleau-Ponty, M. 1964. The primacy of perception and other essays on phenomenological psychology. Evanston: Northwestern University Press.

  • Pereira, A., I. Leite, S. Mascarenhas, C. Martinho, and A. Paiva. 2011. Using empathy to improve human-robot relationships. In Human-robot personal relationships, 130–138. Berlin: Springer.

  • Redstone, J. 2014. Making sense of empathy with social robots. In Social robots and the future of social relations, eds. J. Seibt et al., 171–177. Amsterdam: IOS Press.

  • Sartre, J. 1993. Being and nothingness. Trans. Hazel Barnes. New York: Washington Square Press.

  • Scheler, M. 2009. The nature of sympathy. New Brunswick and London: Transaction Publishers.

  • Scheutz, M. 2012. The inherent dangers of unidirectional emotional bonds between humans and social robots. In Robot ethics: The ethical and social implications of robotics, eds. P. Lin, K. Abney, and G.A. Bekey, 205–221. Cambridge: The MIT Press.

  • Smith, A. 1976. The theory of moral sentiments. Indianapolis: Liberty Classics.

  • Solomon, R. 2006. Emotions in phenomenology and existentialism. In A companion to phenomenology and existentialism, eds. Hubert L. Dreyfus and Mark A. Wrathall. West Sussex: Blackwell Publishing Ltd.

  • Sparrow, R., and L. Sparrow. 2006. In the hands of machines? The future of aged care. Minds and Machines 16 (2): 141–161.

  • Sullins, J. 2012. Robots, love, and sex: The ethics of building a love machine. IEEE Transactions on Affective Computing 3 (4): 398–409.

  • Turkle, S. 2011. Alone together: Why we expect more from technology and less from each other. New York: Basic Books.

  • ———. 2015. Reclaiming conversation: The power of talk in a digital age. New York: Penguin Press.

  • Xunzi. 2016. Xunzi. Beijing: Zhong Hua Book Company.


Acknowledgements

This research is supported by the National Social Science Fund of China (国家社会科学基金项目) (grant no. 20BZX127), and Fundamental Research Funds for Universities (RW190410).


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Wang, J. (2021). Should We Develop Empathy for Social Robots? In: Fan, R., Cherry, M.J. (eds) Sex Robots. Philosophical Studies in Contemporary Culture, vol 28. Springer, Cham. https://doi.org/10.1007/978-3-030-82280-4_3


  • DOI: https://doi.org/10.1007/978-3-030-82280-4_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-82279-8

  • Online ISBN: 978-3-030-82280-4

  • eBook Packages: Social Sciences (R0)
