What it’s like to be a _____: why it’s (often) unethical to use VR as an empathy nudging tool

Abstract

In this article, we apply the literature on the ethics of choice architecture (nudges) to the realm of virtual reality (VR) to point out ethical problems with using VR for empathy-based nudging. Specifically, we argue that VR simulations aiming to enhance empathic understanding of others via perspective-taking will almost always be unethical to develop or deploy. We argue that VR-based empathy enhancement faces not only traditional ethical concerns about nudges (autonomy, welfare, transparency) but also a variant of the semantic variance problem that arises for intersectional perspective-taking. VR empathy simulations deceive and manipulate their users about their experiences. Despite their often laudable goals, such simulations confront significant ethical challenges. In light of these goals and challenges, we propose that VR designers shift from designing simulations aimed at producing empathic perspective-taking to designing simulations aimed at generating sympathy for their targets. These simulations, we claim, can avoid the most serious ethical issues associated with VR nudges, semantic variance, and intersectionality.


Notes

  1.

    Our claim can also be construed in the following way: empathy-based VR nudgers achieve their nudging effects in virtue of a form of deception that, we argue, manipulates both (non-conscious) System 1 and (conscious) System 2 processes. The perceived goods of empathy-based VR nudges almost never outweigh the unethical deception, loss of autonomy, and manipulation associated with VR EEEs, especially in light of available, and less problematic, alternatives to VR empathy-based nudging. Some authors have occasionally framed the ethics of nudges in terms of System 1 and System 2 manipulation (see Sunstein, 2015).

  2.

    Augmented reality (AR) devices also exist and can be used as nudge vehicles. Instead of simulating an entirely new three-dimensional world for users to inhabit, AR devices (like the now-defunct Google Glass) overlay information onto our normal experiences (e.g., by providing a floating directional arrow to guide you through an unfamiliar city as you explore it). While empathy-enhancing nudges have so far been developed for VR HMDs, what we say in this paper can extend to AR attempts to generate empathy as well (e.g., by changing how you perceive your own, or other people’s, racial or gender identity).

  3.

    Citations omitted for review.

  4.

    Citation omitted for review.

  5.

    Citations omitted for review.

  6.

    What we quickly learn from these results is that a simulation’s ability to generate virtually real experiences in its users is at least partially a result of the user’s psychology and background beliefs. Users who believe in time travel, ghosts, and dragons are more likely to experience simulations containing these elements as more context-real than users without these beliefs (citation omitted for review).

  7.

    That a user treats an experience as virtually real need not indicate that they will continue to act as if that experience were real after the fact. What matters, though, is that subjects who are in the middle of the experience will react to it as if it were real. As a result, context-real and perspectivally faithful simulations also share some features with context-real and perspectivally faithful dream experiences (Chalmers, 2017).

  8.

    Subjects may, of course, later deny that they “really” thought they were located in a virtual space. One advantage of relying on physiological and behavioral evidence to define virtually real experience is that we can assess the similarity of a subject’s responses (arousal, stress, panic, facial expressions, and so on) to their responses to real-world events. A subject whose palms become sweaty while exploring a virtual Grand Canyon, who carefully tiptoes over to the edge, and who refuses to step into the canyon itself could be said, on our view, to be behaving as if the experience were real (even if, when verbally prompted, they may say otherwise).

  9.

    According to Sunstein, a nudge benefits a person’s welfare when it makes them “better off, as judged by themselves” (Sunstein, 2015, p. 429). We wish to remain agnostic about how best to make sense of welfare for the sake of nudging, as it seems sensible to claim that public welfare may reasonably render a nudge permissible even if those nudged are not materially benefited (as with opt-out organ donation nudges).

  10.

    Some might argue that transparency only matters instrumentally, insofar as it enhances autonomy and its absence undermines it. However, even if this is true, understanding whether and how a nudge is transparent or obscure is still helpful in assessing its ethical status.

  11.

    It’s unlikely that transparency fully removes the nudging power of many nudges. For example, hyper-targeted political ads, intended to emotionally nudge users, may retain some of their nudging power even if the nudge’s nature is made transparent to subjects. In this sense, whether a nudge is transparent does not fully determine its permissibility. A nudge’s transparency is an important piece of the puzzle when assessing the ethics of a particular nudge, but it may be outweighed by all-things-considered judgments of the nudge’s effects on autonomy and individual/social welfare.

  12.

    That we can be nudged in this way is one reason why nudge ethicists have called for the development of conditions of competence and trust among nudge developers (Selinger & Whyte, 2010).

  13.

    For example, urinals will sometimes have a fly printed on their surface as a way to nudge users toward aiming at the ideal place on the urinal to reduce splashing. Such a nudge may, arguably, be ethically deceptive in the all-things-considered sense. Why would this be? First, such a nudge isn’t obviously an educative nudge. The printed fly is deceptive in the sense that it might fool a user into believing, at least temporarily, that a real fly is on the urinal, but the force of the nudge derives from its value in providing users somewhere to aim on an otherwise featureless surface. An astute person may realize that the fly is merely printed on the urinal and nonetheless be successfully nudged by it. Second, such a deception (minor and short-lived though it might be) is also likely to align with the user’s own goal of avoiding splash. By contrast, “lying calories” goes beyond deception and manipulates users by subverting their autonomy. We say more about deception and manipulation below when discussing the ethics of VR EEEs.

  14.

    Emphasis added. We argue that no simulation can show human users what it is like to be a cow or to feel cow feelings or have cow thoughts. We locate part of the problem with VR empathy simulations in a fundamental mistake about the nature of experience that we see repeated throughout the psychological literature on VR embodiment. Though we say more about this later, we find it difficult to parse how these researchers take human and cow experiences to be related and individuated such that a person’s experience, in VR, of walking on all fours in a virtual pasture is relevantly like an actual cow’s experience of walking naturalistically in a field. See (citation omitted for review) and (Nagel, 1974) for more on this general problem in the philosophy of mind.

  15.

    Emphasis added.

  16.

    Emphasis added.

  17.

    Some nudges might avoid the problem of semantic variance altogether because they nudge their users on the basis of physiological or non-cognitive psychological capacities widely shared regardless of a user’s context. To the degree that such nudges are cognitively impenetrable (that is, to the degree that a person’s thoughts do not affect the power of the nudge), they would avoid the problem of semantic variance. Images of diseased organs on the cover of cigarette packs, for example, may trigger evolved disgust responses that nudge users to smoke less frequently regardless of their cultural context. Whether or not any nudges, including the one just mentioned, actually fit these criteria is a matter of significant controversy.

  18.

    It is, of course, possible that a user may actually share many intersectional features with Michael Sterling (they may have been born in a similar era, grown up in similar SES surroundings, and identify as the same gender and race as the character). In those cases, perspective-taking may, indeed, be more likely to deliver truthful simulated content. However, VR empathy simulations are, as we have demonstrated, almost always targeted at populations very much unlike those being simulated (undocumented migrants from Mexico, black men, shorthorn cattle, etc.).

  19.

    Emphasis added. Although she does not use the term “structural intersectionality,” consider also Elena Ruíz’s (2017) framing of intersectionality in terms of experience: “As a descriptive term, [intersectionality] refers to the ways human identity is shaped by multiple social vectors and overlapping identity categories (such as sex, race, class) that may not be readily visible in single-axis formulations of identity, but which are taken to be integral to robustly capture the multifaceted nature of human experience” (335).

  20.

    By primitive, we mean only that mirror-neuron empathic structures appear in many mammals, including rats, and thus are likely to have evolved early relative to other more complex cerebral structures found in humans and other apes (Carrillo et al., 2019).

  21.

    Emphasis added.

  22.

    We pause here to note that the problem we’re raising for virtual reality simulations is focused specifically on those wishing to use VR as an empathy-enhancing nudge. To use VR in such a way requires creating a simulation depicting someone’s experience with the intention of having that experience be shared with those who are intersectionally different from the person being simulated. VR simulations designed to provide a generic perspective (e.g., a VR simulation of sitting in a stadium during the Olympics) wouldn’t run into any of the ethical issues we’re raising here for empathy-enhancing nudges. Similarly, it’s entirely possible for someone to design a VR simulation of their own experiences, intended for their own use. Insofar as the designer and user of a VR simulation like this are relevantly similar to one another, self-empathy of this sort would also be relatively problem-free. We say “relevantly” because there is some evidence that we can fail to empathize with our past selves, especially after undergoing transformative experiences (see Levine, 1997). For more on how structural intersectionality can impact empathy even across those who identify as members of the same race or gender, see Táíwò (2020).

  23.

    This is, we want to emphasize, an empirical matter. It may turn out that some intersectional features (race, gender, sex, etc.) are more influenced by internalized bias than others (class, nationality). Our point in producing this example is to note that those who will be subject to (or who subject themselves to) empathy-based VR nudge simulations are unlikely, for these sorts of reasons, to successfully mirror the experience of those whose bodies they virtually inhabit. This is, in one way, to reiterate the concerns we raised earlier about empathic perspective-taking and the importance of subdoxastic features. See also Ruckmann et al. (2015) for a similar study using fMRI to assess empathic contagion.

  24.

    We intend the arguments in this paper to be as agnostic as possible with respect to first-order moral theory. Nearly all normative systems make room for the value of transparency, respect for autonomy, and individual/social welfare (though for different reasons and in different ways). Our argument relies on the fact that VR empathy-enhancing simulations unnecessarily deceive and manipulate users toward (arguably) good ends. The presence of non-manipulative alternatives leaves us comfortable claiming that most first-order moral theories would converge in finding this form of nudging unethical. We thank an anonymous reviewer for helping us to clarify this point.

  25.

    Users can thus be manipulated in several ways. For example, a user can be manipulated toward forming (and/or acting on) inappropriate beliefs (e.g., that a food has substantially more calories than it really does), but they can also be manipulated if they come to form morally benign beliefs through inappropriate or deceptive means (e.g., convincing a child that Santa Claus does not exist because he was murdered in the nineteenth century). In the case of VR empathy-enhancing simulations, we believe both forms of manipulation are present.

  26.

    Insofar as designers of VR empathy-enhancing simulations accept even the weakest forms of structural intersectionality (i.e., that internalized concepts can structure the content of experience), they ought to know that their simulations cannot succeed at the task they’ve designed them to do.

  27.

    Another question, outside the scope of our argument here, is whether “lying calories” would be morally acceptable to use if we lacked a non-manipulative alternative. Any response to this issue would require a complex analysis of the value of the social goods that “lying calories” would help realize. Once we account for all of the institutions that would be involved in maintaining the opacity of “lying calories” and the effects of creating and maintaining institutions designed to paternalistically manipulate public beliefs about health, we’re skeptical that “lying calories” could be justified, but it isn’t beyond the realm of possibility. We thank an anonymous reviewer for pushing us to clarify this aspect of our argument.

  28.

    A corollary issue here is that manipulations of this sort will fail to make their subjects morally better people even if they succeed in changing their behavior. Discussing behavioral moral enhancements of this sort, John Harris (2013) writes that “…moral enhancement, properly so called, must not only make the doing of good or right actions more probable and the doing of bad ones less likely, but must also include the understanding of what constitutes right and wrong action” (172). Because subjects of VR EEEs are using emotionally laden false beliefs to change their behavior, it’s unlikely that they’re acting, as Harris suggests they must, on the right kinds of reasons.

  29.

    It might be argued, however, that such nudges are ultimately permissible. Why would that be? Because becoming more sympathetic to the plight of those experiencing homelessness or to members of other marginalized communities (1000 Cut Journey, Carne y Arena, etc.) would align with the values of the person using the simulation. That is, such nudges may be used to manipulate oneself into a position that one ultimately desires and endorses upon reflection. The literature on self-deception is vast and interesting (Bermúdez, 2000; Bortolotti & Mameli, 2012; Lynch, 2016; Mele, 1997), but it seems that such responses miss the point of the critique. In order to properly consent to the use of such simulations, and in order for such nudges to sidestep the ethical issues we raise here, their function must be transparent to the user. Our claim here is that such transparency would break the simulation’s power as a nudge. If a subject knows, in advance, that the experiences they will have are not the experience of “what it’s like” to be the person represented by the simulation, then the simulation would cease to function as an empathy-enhancing nudge. Thus, by one’s own lights, if one cares about the ethics of nudging, one ought not desire to use the simulation for empathy-enhancing reasons.

References

  1. Aardema, F., O’Connor, K., Côté, S., & Taillon, A. (2010). Virtual reality induces dissociation and lowers sense of presence in objective reality. Cyberpsychology, Behavior, and Social Networking, 13(10), 429–435


  2. Ahn, S. J., Bostick, J., Ogle, E., Nowak, K., McGillicuddy, K., & Bailenson, J. N. (2016). Experiencing nature: Embodying animals in immersive virtual environments increases inclusion of nature in self and involvement with nature. Journal of Computer-Mediated Communication. https://doi.org/10.1111/jcc4.12173


  3. Andrews, K. (2008). It’s in your nature: A pluralistic folk psychology. Synthese, 165(1), 13–29


  4. Artaud, A. (1958). The theater and its double. Grove. Trans. Mary Caroline Richards.


  5. Avenanti, A., Sirigu, A., & Aglioti, S. M. (2010). Racial bias reduces empathic sensorimotor resonance with other-race pain. Current Biology, 20(11), 1018–1022


  6. Bailenson, J. (2018). Experience on demand: What virtual reality is, how it works, and what it can do. Norton.


  7. Bermúdez, J. L. (2000). Self-deception, intentions and contradictory beliefs. Analysis, 60(4), 309–319


  8. Bernstein, S. (2020). The metaphysics of intersectionality. Philosophical Studies, 177(2), 321–335


  9. Blumenthal-Barby, J. S. (2013). Choice architecture: Improving choice while preserving liberty? In C. Coons & M. Weber (Eds.), Paternalism. Cambridge University Press

  10. Bortolotti, L., & Mameli, M. (2012). Self-deception, delusion and the boundaries of Folk Psychology. Humana Mente, 5(20), 203–221


  11. Bovens, L. (2009). The ethics of Nudge. In T. Grüne-Yanoff & S. O. Hansson (Eds.), Preference change approaches from philosophy, economics and psychology. (pp. 207–219). Springer.


  12. Carastathis, A. (2014). The concept of intersectionality in feminist theory. Philosophy Compass, 9(5), 304–314


  13. Carrillo, M., Han, Y., Migliorati, F., Liu, M., Gazzola, V., & Keysers, C. (2019). Emotional mirror neurons in the rat’s anterior cingulate cortex. Current Biology, 29(8), 1301–1312


  14. Chalmers, D. (2017). The virtual and the real. Disputatio, 9(46), 309–352


  15. Cogburn, C., Bailenson, J., Ogle, E., Tobin, A., & Nichols, T. (2018). 1000 cut journey. In SIGGRAPH '18: ACM SIGGRAPH 2018 virtual, augmented, and mixed reality (Article No. 1)

  16. Cogburn, J., & Silcox, M. (2014). Against Brain-in-a-Vatism: On the value of virtual reality. Philosophy and Technology, 27(4), 561–579


  17. Crenshaw, K. (1989). Demarginalizing the intersection of race and sex: A black feminist critique of antidiscrimination doctrine, feminist theory, and antiracist politics. University of Chicago Legal Forum, 139–167

  18. Crenshaw, K. (1991). Mapping the margins: Intersectionality, identity politics, and violence against women of color. Stanford Law Review, 43(6), 1241–1299


  19. Cummings, J., & Bailenson, J. (2016). How immersive is enough? A meta-analysis of the effect of immersive technology on user presence. Media Psychology, 19(2), 272–309


  20. Fischer, J. M., & Ravizza, M. (1998). Responsibility and control: A theory of moral responsibility. Cambridge University Press.


  21. Gasdaglis, K., & Madva, A. (2020). Intersectionality as a regulative ideal. Ergo: An Open Access Journal of Philosophy, 6(4), 1287–1330


  22. Goldie, P. (2011). Anti-empathy. In A. Coplan & P. Goldie (Eds.), Empathy: Philosophical and psychological perspectives. (pp. 318–330). Oxford University Press.


  23. Goldman, A. I., & Jordan, L. (2013). Mindreading by simulation: The roles of imagination and mirroring. In S. Baron-Cohen, M. Lombardo, & H. Tager-Flusberg (Eds.), Understanding other minds: Perspectives from developmental social neuroscience. (3rd ed., pp. 448–466). Oxford University Press.


  24. Grimsley-Vaz, E. (2018). Creator of ‘1000 Cut Journey’ uses VR to help white liberals understand racism. Moguldom.com. Retrieved from https://moguldom.com/152786/creator-of-1000-cut-journey-uses-vr-to-help-white-liberals-understand-racism/

  25. Guttentag, D. A. (2010). Virtual reality: Applications and implications for tourism. Tourism Management, 31(5), 637–651


  26. Ham, D. (2018). I am a man. Retrieved from http://iamamanvr.logicgrip.com/

  27. Hansen, P. G., & Jespersen, A. M. (2013). Nudge and the manipulation of choice. European Journal of Risk Regulation, 3, 3–28


  28. Harris, J. (2013). Ethics is for bad guys! Putting the ‘moral’ into moral enhancement. Bioethics, 27(1), 169–173


  29. Hotchkiss, S. (Jun 18, 2019). In San Jose’s Japantown, contemporary transience takes on historical weight. KQED. Retrieved from https://www.kqed.org/arts/13859833/transient-existence-artobjectgallery-san-jose-japantown

  30. Iñárritu, A. G. (2017). Carne y arena (virtually present, physically invisible). Fondazione Prada, Legendary Entertainment.


  31. Iñárritu, A. G. (2017). CARNE y ARENA (Virtually present, physically invisible). Retrieved from https://www.lacma.org/art/exhibition/alejandro-g-inarritu-carne-y-arena-virtually-present-physically-invisible

  32. Jacobs, O. & Anderson, N. (May 26, 2019). Virtual reality is reality. Psychology Today. Retrieved from https://www.psychologytoday.com/us/blog/virtual-reality/201905/virtual-reality-is-reality

  33. Levine, L. (1997). Reconstructing memory for emotions. Journal of Experimental Psychology, 126(2), 165–177


  34. Lynch, K. (2016). Willful ignorance and self-deception. Philosophical Studies, 173(2), 505–523


  35. Mele, A. R. (1997). Real self-deception. Behavioral and Brain Sciences, 20(1), 91–102


  36. Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378


  37. Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83, 435–450


  38. Noggle, R. (2020). Pressure, trickery, and a unified account of manipulation. American Philosophical Quarterly, 57(3), 241–252. https://doi.org/10.2307/48574436.

  39. Ogle, E., Asher, T., & Bailenson, J. (2018). Becoming Homeless: A Human Experience. Virtual Human Interaction Laboratory. Retrieved from http://vhil.stanford.edu/becominghomeless/

  40. Opriş, D., Pintea, S., García-Palacios, A., Botella, C., Szamosközi, Ş., & David, D. (2012). Virtual reality exposure therapy in anxiety disorders: A quantitative meta-analysis. Depression and Anxiety, 29(2), 85–93. https://doi.org/10.1002/da.20910.

  41. Parsons, T. D. (2015). Virtual reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Frontiers in Human Neuroscience, 9, 650


  42. Parsons, T. D., & Rizzo, A. A. (2008). Affective outcomes of virtual reality exposure therapy for anxiety and specific phobias: A meta-analysis. Journal of Behavior Therapy and Experimental Psychiatry, 39(3), 250–261


  43. Perez-Gomez, J. (2020). Verbal microaggressions as hyper-implicatures. The Journal of Political Philosophy. https://doi.org/10.1111/jopp.12243


  44. Pinch, T. (2010). Comment on “Nudges and cultural variance.” Knowledge, Technology & Policy, 23(3–4), 487–490


  45. Ramirez, E. (2017). Empathy and the limits of thought experiments. Metaphilosophy, 48(4), 504–526


  46. Ramirez, E. (2018). Ecological and ethical issues in virtual reality research: A call for increased scrutiny. Philosophical Psychology, 32(2), 211–233


  47. Rizzo, A., Difede, J., Rothbaum, B. O., Reger, G., Spitalnick, J., Cukor, J., & Mclay, R. (2010). Development and early evaluation of the Virtual Iraq/Afghanistan exposure therapy system for combat-related PTSD. Annals of the New York Academy of Sciences, 1208, 114–125


  48. Rizzo, A., Roy, M. J., Hartholt, A., Costanzo, M., Beth Highland, K., Jovanovic, T., Norrholm, S. D., Reist, C., Rothbaum, B., & Difede, J. (2017). Virtual reality applications for the assessment and treatment of PTSD. In S. V. Bowles & P. T. Bartone (Eds.), Handbook of military psychology. (pp. 453–471). Springer International.


  49. Ruckmann, J., Bodden, M., Jansen, A., Kircher, T., Dodel, R., & Rief, W. (2015). How pain empathy depends on ingroup/outgroup decisions: A functional magnet resonance imaging study. Psychiatry Research: Neuroimaging, 234(1), 57–65


  50. Ruíz, E. (2017). Framing intersectionality. In P. C. Taylor, L. M. Alcoff, & L. Anderson (Eds.), The Routledge companion to philosophy of race. (pp. 335–348). Routledge.


  51. Sanchez-Vives, M. V., & Slater, M. (2005). From presence to consciousness through virtual reality. Nature Reviews: Neuroscience, 6, 332–339


  52. Schubert, C. (2017). Green nudges: Do they work? Are they ethical? Ecological Economics, 132, 329–342


  53. Schüll, N. (2012). Addiction by design: Machine gambling in Las Vegas. Princeton University Press.

  54. Schwartz, A. (March 20, 2017). Confronting the “shocking” virtual-reality artwork at the Whitney Biennial. New Yorker. Retrieved from https://www.newyorker.com/culture/cultural-comment/confronting-the-shocking-virtual-reality-artwork-at-the-whitney-biennial

  55. Selinger, E., & Whyte, K. P. (2010). Competence and trust in choice architecture. Knowledge, Technology & Policy, 23(3–4), 461–482


  56. Stone, R.J. (2000). Haptic feedback: A brief history from telepresence to virtual reality. In International Workshop on Haptic Human-Computer Interaction, pp. 1–16.

  57. Sunstein, C. (2015). The ethics of nudging. Yale Journal on Regulation, 32(2), 414–450


  58. Táíwò, O. (August 2020). Being-in-the-Room privilege: Elite capture and epistemic deference. The Philosopher, 108 (4). https://www.thephilosopher1923.org/essay-taiwo

  59. Tannenbaum, D., Fox, C. R., & Rogers, T. (2017). On the misplaced politics of behavioral policy interventions. Nature Human Behaviour, 1, 130


  60. Thatcher, S. (2019). VR and the role it plays in museums. Retrieved from https://ad-hoc-museum-collective.github.io/GWU-museum-digital-practice-2019/essays/essay-9/

  61. Ward, J., & Banissy, M. J. (2015). Explaining mirror-touch synesthesia. Cognitive Neuroscience, 6(2–3), 118–133


  62. Williams, K. D. (2014). The effects of dissociation, game controllers, and 3D versus 2D on presence and enjoyment. Computers in Human Behavior, 38, 142–150.

  63. Won, A. S., Bailenson, J., & Lanier, J. (2015). Homuncular flexibility: The human ability to inhabit nonhuman avatars. Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource. https://doi.org/10.1002/9781118900772.etrds0165



Author information


Corresponding author

Correspondence to Erick Jose Ramirez.



About this article


Cite this article

Ramirez, E.J., Elliott, M. & Milam, PE. What it’s like to be a _____: why it’s (often) unethical to use VR as an empathy nudging tool. Ethics Inf Technol (2021). https://doi.org/10.1007/s10676-021-09594-y


Keywords

  • Empathy
  • Implicit bias
  • Intersectionality
  • Nudge
  • Simulation ethics
  • Virtual reality