
Drones, information technology, and distance: mapping the moral epistemology of remote fighting

  • Original Paper
  • Published in: Ethics and Information Technology

Abstract

Ethical reflection on drone fighting suggests that this practice creates not only physical distance but also moral distance: when one is far removed from one’s opponent, it becomes easier to kill. This paper discusses this thesis, frames it as a moral-epistemological problem, and explores the role of information technology in bridging and creating distance. Drawing on a broad range of conceptual and empirical resources, including ethics of robotics, psychology, phenomenology, and media reports, it is first argued that drone fighting, like other long-range fighting, creates epistemic and moral distance in so far as ‘screenfighting’ implies the disappearance of the vulnerable face and body of the opponent and thus removes moral-psychological barriers to killing. However, the paper also shows that this influence is at least weakened by current surveillance technologies, which make possible a kind of ‘empathic bridging’ by which the fighter’s opponent on the ground is re-humanized, re-faced, and re-embodied. This ‘mutation’ or unintended ‘hacking’ of the practice is a problem for drone pilots and for those who order them to kill, but revealing its moral-epistemic possibilities opens up new avenues for imagining morally better ways of technology-mediated fighting.


Notes

  1. Some people object to this and attach a specific meaning to ‘drones’, for instance ‘autonomous UAV’, but I decided to go with common usage.

  2. In this paper I assume a meta-ethical position inspired by Hume, Smith, Dewey, Nussbaum, and Rorty (among other thinkers) which holds that moral feelings and moral imagination are central to morality. The claim is that rational argument is not sufficient and that we need feelings and imagination instead of, or at least alongside, explicit and principled moral deliberation (for the latter claim see Coeckelbergh 2007). We may even agree with Rorty that the Platonic rationalist ethical project is entirely misguided and that instead of asking “Why should I be moral?” we (moral philosophers) should ask the question of how we can act morally towards strangers (Rorty 1993). One answer to this question is that the exercise of empathy, putting oneself in the other’s shoes, helps one to become more moral. In this paper I will not (further) discuss my meta-ethical position in order to create room for my discussion of the main thesis concerning technology and moral distance in relation to drone fighting. However, both the question and the ‘empathy’ answer are centrally relevant to the discussion about drone fighting presented here. My question in this paper can be regarded as a ‘follow-up’ to Rorty’s question and answer: if this is what we need to morally bridge out towards strangers, what epistemic, experiential conditions are necessary and sufficient for this empathy to get off the ground? If the main question is about who is a member of our moral community, we want to know how we decide about the border between ‘inside’ and ‘outside’, how we (can) draw ‘strangers’ into that community. In particular, in this paper I am interested in the role played by how the opponent appears to us in fighting (e.g. as a stranger, as a target, or as a human being): under what epistemic conditions does he appear as ‘one of us’?

  3. The agricultural metaphor I use here also turns up in the name of a US drone called ‘Reaper’: the name means ‘harvester’ and refers to the figure of the Grim Reaper, death personified as a man with a scythe, an agricultural hand tool for removing or harvesting plants.

  4. I use the terms ‘ethical hacking’ and ‘moral hacking’ not as synonyms for so-called white (hat) hacking, although such hacking might play a role in it. Rather, I refer to the production or happening of consequences of a technology, not intended by its designers or mainstream users, that subvert the purpose of the technology in a way that has morally good consequences. I write “production or happening” since I wish to leave open the question of which role human agency plays in this process.

References

  • Arkin, R. C. (2008). Governing lethal behavior: Embedding ethics in a hybrid deliberative/reactive robot architecture. In Proceedings of the 3rd ACM/IEEE international conference on human-robot interaction.

  • Asaro, P. M. (2008). How just could a robot war be? In P. Brey, A. Briggle, & K. Waelbers (Eds.), Current issues in computing and philosophy (pp. 50–64). Amsterdam: Ios Press.

  • Borgmann, A. (1984). Technology and the character of contemporary life: A philosophical inquiry. Chicago: University of Chicago Press.

  • Brooks, R. (2012). What’s not wrong with drones? Foreign Policy. Retrieved from http://www.foreignpolicy.com/articles/2012/09/05/whats_not_wrong_with_drones.

  • Bumiller, E. (2012). A day job waiting for a kill shot a world away. The New York Times. Retrieved from http://www.nytimes.com/2012/07/30/us/drone-pilots-waiting-for-a-kill-shot-7000-miles-away.html?pagewanted=all.

  • Carroll, R. (2012). Drone warfare: A new generation of deadly unmanned weapons. The Guardian. Retrieved from http://www.guardian.co.uk/world/2012/aug/02/drone-warfare-unmanned-weapons.

  • Coeckelbergh, M. (2007). Imagination and principles: An essay on the role of imagination in moral reasoning. New York: Palgrave Macmillan.

  • De Cervantes, M. (1605/1615). Don Quixote (J. Ormsby 1922, Trans.). Penn State Electronic Classics, 2000.

  • Dreyfus, H. L. (2001). On the internet. London/New York: Routledge.

  • Grossman, D. (1995). On killing: The psychological cost of learning to kill in war and society (rev. ed. 2009). New York/Boston/London: Little, Brown & Company.

  • Grossman, D. (2001). On killing. II: The psychological cost of learning to kill. International Journal of Emergency Mental Health, 3(3), 137–144.

  • Heidegger, M. (1977). The question concerning technology. In M. Heidegger (Ed.), The question concerning technology and other essays (W. Lovitt, Trans.). New York: Harper & Row.

  • Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford: Oxford University Press.

  • Levinas, E. (1961). Totality and infinity. Pittsburgh, Pennsylvania: Duquesne University Press.

  • Lin, P., Bekey, G., & Abney, K. (2008). Autonomous military robotics: Risk, ethics, and design. Report for the US Department of the Navy, Office of Naval Research.

  • Protevi, J. (2008). Affect, agency and responsibility: The act of killing in the age of cyborgs. Phenomenology and the Cognitive Sciences, 7(3), 405–413.

  • Rorty, R. (1993). Human rights, rationality, and sentimentality. In S. Shute & S. Hurley (Eds.), On human rights. New York: Basic Books.

  • Sharkey, N. (2012). Killing made easy: From joysticks to politics. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics (pp. 111–128). Cambridge, MA: MIT Press.

  • Singer, P. W. (2009). Military robots and the laws of war. The New Atlantis: A Journal of Technology & Society, 23(Winter issue), 28–47.

  • Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77.

  • Sparrow, R. (2009). Building a better warbot: Ethical issues in the design of unmanned systems for military applications. Science and Engineering Ethics, 15, 169–187.

  • Stewart, P. (2011). Overstretched U.S. Drone pilots face stress risk. Reuters. Retrieved from http://www.reuters.com/article/2011/12/18/us-usa-drones-stress-idUSTRE7BH0VH20111218.

  • Sullins, J. (2010). RoboWarfare: Can robots be more ethical than humans on the battlefield? Ethics and Information Technology, 12(3), 263–275.

  • Takaki, R. (1995). Hiroshima: Why America dropped the atomic bomb. New York: Little, Brown & Company.

  • Valdes, R. (2012). How the predator UAV works. HowStuffWorks. Retrieved from http://science.howstuffworks.com/predator4.htm/printable.


Acknowledgments

I would like to thank the anonymous reviewers for their interesting comments, which helped me to fine-tune my brief phenomenology of fighting and its historical evolution.

Author information

Correspondence to Mark Coeckelbergh.


About this article

Cite this article

Coeckelbergh, M. Drones, information technology, and distance: mapping the moral epistemology of remote fighting. Ethics Inf Technol 15, 87–98 (2013). https://doi.org/10.1007/s10676-013-9313-6
