Abstract
Artificial intelligence (AI) technology and human-like robots raise key questions about the appropriate relationship between humans and such social robots. These artificial, AI-infused creations are programmed to mimic and manipulate human emotions so as to provide affective companionship for humans. How should we treat them: as mere machines, as fellow human beings, or as some third thing set between the two? This chapter explores the ethical implications of affective, empathetic connections with robots. Is it possible to create a human-robot moral community? Section one provides methodological reflections on the empathy that humans often feel for robots. Empathy does not begin and end with human beings; persons often have such feelings for pets, for example. Very probably, human empathy will in the future extend to embrace highly intelligent, empathetic robots. Against this background, one urgent issue arises: morally speaking, should we develop empathy for social robots? This chapter critically explores different ways that humans might empathize with robot companions, examining the social and moral influence of various empathetic relationships on the human agents involved. I will argue from the Confucian point of view that, given the dramatic difference between human emotions and programmed robot emotions, we should be cautious of attempts to build love machines that would distract us from the more valuable pursuit of empathy and sympathy with our fellow human beings.
Notes
- 1.
Both of the thought experiments are designed to test how intelligent an artificial system is rather than how intelligent it appears (Coeckelbergh 2009).
- 2.
In contemporary ethics, the term “sympathy” is usually reserved for social and moral emotions such as compassion. Sympathy for someone must involve caring for (and about) him, while “empathy” in the broad sense merely denotes a mechanism of feeling what the other feels.
- 3.
To some extent, the human capacity to empathize with robots has been confirmed in many experiments.
- 4.
“Ek-sist” is a phenomenological term for “exist,” emphasizing the unique character of human existence: a unique kind of being in which we stand out among the beings with which we deal. As Heidegger (1991, p. 99) puts it, an emotion is like an atmosphere, “by force of which and in accordance with which we are always already lifted beyond ourselves into beings as a whole, which in this or that way matters or not matters to us.”
- 5.
I borrow the phrase “poor in world” from Heidegger’s works. Heidegger originally uses this phrase to describe a mode of being ascribed to animals: an animal behaves towards objects and others, but has no knowledge of its relation to the world. Likewise, Heidegger’s arguments can be applied to robots (see, for example, Coeckelbergh 2011).
- 6.
In this essay, I will use “sympathetic empathy” for short. The distinction between projective empathy and sympathetic empathy also interests many phenomenologists. For example, Husserl draws a line between what he calls “genuine sympathy” and empathy as a suggestive infusion of feeling in the Humean sense. He writes: “Pity does not mean suffering from the same thing as the Other [does], but [it means] pitying him, suffering from the fact that he is suffering and because he is suffering.” To illustrate this point, Husserl gives an example: “When I pity the Other due to the death of his father, I do not directly suffer from the death of his father, but from the fact that he has lost his father” (Husserl 2004, p. 194). We are thereby affected and unsettled by the situation as it is for the other, to the extent that this feeling contains a tendency to act on her behalf, to care for her wants and needs.
- 7.
Love for real persons is always intertwined with emotions such as sympathy, compassion, tenderness, sociability, and benign concern. As Merleau-Ponty says, “to love is inevitably to enter into an undivided situation with another” (Merleau-Ponty 1964, p. 154), so that what happens to the loved one equally alarms and affects the lover. If one really loves another, he will suffer from her suffering.
References
Coeckelbergh, M. 2009. Personal robots, appearance, and the good: A methodological reflection on roboethics. International Journal of Social Robotics 1 (3): 217–221.
———. 2011. Humans, animals, and robots: A phenomenological approach to human-robot relations. International Journal of Social Robotics 3 (2): 197–204.
———. 2018. Why care about robots? Empathy, moral standing, and the language of suffering. Kairos. Journal of Philosophy & Science 20 (1): 141–158.
Darwall, S. 1998. Empathy, sympathy, care. Philosophical Studies 89: 261–282.
Hauskeller, M. 2017. Automatic sweethearts for transhumanists. In Robot sex: Social and ethical implications, eds. John Danaher and Neil McArthur, 229–248. Cambridge: The MIT Press.
Heidegger, M. 1962. Being and time. Trans. John Macquarrie and Edward Robinson. New York: Harper and Row.
———. 1991. Nietzsche. Trans. David Farrell Krell. New York: Harper Collins Publishers.
Hildebrand, D. von. 2009. The nature of love. South Bend: St. Augustine’s Press.
Hume, D. 1999. A treatise of human nature. Beijing: China Social Sciences Publishing House.
Husserl, E. 2004. Einleitung in die Ethik: Vorlesungen Sommersemester 1920/1924 (Husserliana vol. XXXVII). Ed. Henning Peucker. Dordrecht, Boston, London: Kluwer Academic Publishers.
James, W. 1909. The meaning of truth: A sequel to ‘Pragmatism’. New York: Longmans, Green and Co.
Kant, I. 1997. Lectures on ethics. New York: Cambridge University Press.
Kern, I. 2012. Mengzi (Mencius), Adam Smith and Edmund Husserl on sympathy and conscience. In Intersubjectivity and objectivity in Adam Smith and Edmund Husserl, eds. Christel Fricke and Dagfinn Føllesdal, 139–170. Dordrecht: Ontos Verlag.
Levy, D. 2007. Love and sex with robots: The evolution of human-robot relationships. New York: Harper.
Mencius. 1970. Mencius. Trans. with an introduction by D.C. Lau. New York: Penguin Books.
Merleau-Ponty, M. 1964. The primacy of perception and other essays on phenomenological psychology. Evanston: Northwestern University Press.
Pereira, A., I. Leite, S. Mascarenhas, C. Martinho, and A. Paiva. 2011. Using empathy to improve human-robot relationships. In Human-robot personal relationships, 130–138. Berlin: Springer.
Redstone, J. 2014. Making sense of empathy with social robots. In Social robots and the future of social relations, eds. J. Seibt et al., 171–177. Amsterdam: IOS Press.
Sartre, J. 1993. Being and nothingness. Trans. Hazel Barnes. New York: Washington Square Press.
Scheler, M. 2009. The nature of sympathy. New Brunswick and London: Transaction Publishers.
Scheutz, M. 2012. The inherent dangers of unidirectional emotional bonds between humans and social robots. In Robot ethics: The ethical and social implications of robotics, eds. P. Lin, K. Abney, and G.A. Bekey, 205–221. Cambridge: The MIT Press.
Smith, A. 1976. The theory of moral sentiments. Indianapolis: Liberty Classics.
Solomon, R. 2006. Emotions in phenomenology and existentialism. In A companion to phenomenology and existentialism, eds. Hubert L. Dreyfus and Mark A. Wrathall. West Sussex: Blackwell Publishing Ltd.
Sparrow, R., and L. Sparrow. 2006. In the hands of machines? The future of aged care. Minds and Machines 16 (2): 141–161.
Sullins, J. 2012. Robots, love, and sex: The ethics of building a love machine. IEEE Transactions on Affective Computing 3 (4): 398–409.
Turkle, S. 2011. Alone together: Why we expect more from technology and less from each other. New York: Basic Books.
———. 2015. Reclaiming conversation: The power of talk in a digital age. New York: Penguin Press.
Xunzi. 2016. Xunzi. Beijing: Zhonghua Book Company.
Acknowledgements
This research is supported by the National Social Science Fund of China (国家社会科学基金项目) (grant no. 20BZX127), and Fundamental Research Funds for Universities (RW190410).
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Wang, J. (2021). Should We Develop Empathy for Social Robots? In: Fan, R., Cherry, M.J. (eds) Sex Robots. Philosophical Studies in Contemporary Culture, vol 28. Springer, Cham. https://doi.org/10.1007/978-3-030-82280-4_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-82279-8
Online ISBN: 978-3-030-82280-4
eBook Packages: Social Sciences (R0)