Abstract
Some writers claim that ethicists involved in assessing future technologies like nanotechnology and human enhancement devote too much time to debating issues that may or may not arise, at the expense of addressing more urgent, current issues. This practice has been claimed to squander the scarce and valuable resource of ethical concern. I assess this view, and consider some alternatives to ‘speculative ethics’ that have been put forward. I argue that attempting to restrict ethical debate so as to avoid considering unacceptably speculative scenarios would not only leave scientific progress devoid of ethical guidance, but would also rule out some of our most important ethical projects. I conclude that the issue of speculation is a red herring: what is most important is not that ethicists concentrate on current issues or those that are most likely to arise, but that ethicists, scientists, and others focus on maximising what is most valuable.
Notes
Some have disputed the value of considering such hypothetical scenarios. Ludwig Wittgenstein believed that exotic thought experiments can be misleading and meaningless: ‘It is as if our concepts involve a scaffolding of facts … If you imagine certain facts otherwise … then you can no longer imagine the application of certain concepts’ ([3]: proposition 350). And Alasdair Urquhart has accused discussions in the philosophy of mind and cognitive science of a ‘fascination with far-fetched thought experiments’ ([4]: 27, cited at [5]: 34f), which illicitly derive their power from ignorance of the underlying science. For an illuminating discussion of the value of thought experiments, see [6].
I use this term broadly, to include not only academic or philosophically-trained ethicists, but people in general whose job it is to decide how best to prepare for future scientific developments. This is in line with the concerns of the authors whose views I consider here.
Others have used the term ‘speculative ethics’ to refer to other things. For example, John Maynard Keynes used it to refer to the investigation of what is good in and of itself (Keynes JM (1905) Miscellanea Ethica. Unpublished manuscript deposited in King’s College Library, University of Cambridge), and Albert William Levi, based on Henri Bergson’s distinction between knowing a thing by moving around an object and knowing a thing by entering into it, used it to refer to a method of ethics that proceeds by ‘moving around’ moral phenomena rather than attempting to penetrate them [9].
Nordmann does not attribute this particular claim to de Grey, or to anyone else. He appears to use it only to illustrate his claim that ethical deliberation can escalate into a demand for action.
He lists additional reasons why the ‘if and then’ syndrome is worrying: it casts purely speculative scenarios as more probable than they really are, or as more probable than we have reason to believe they are; and it prompts questions about ourselves and our future that are ‘unintelligible’. I shall ignore these other concerns, however. By themselves, they do not constitute a compelling case against speculation in ethics, and—for reasons I do not have space to outline here—the latter problem arguably does not exist at all.
In fact, the picture is more complex than this. Judgments about the likelihood of a given scenario’s occurring depend on a variety of factors, of which its desirability is only one. Other relevant factors include whether the activity that might lead to the scenario is voluntary, whether it is perceived as controllable, and whether it is familiar. See [14].
Occasionally Nordmann seems to do this. For example, he writes, ‘In order to resist foreshortening, considerable work is required to hold the scientific community to its own standards of honesty and clarity. Whose responsibility is it … to remind scientists … of the categorical difference between a therapeutic brain-machine interface and the vision of a thought-controlled mind-machine interface?’ ([5]: 43). However, we can interpret him to mean not that scientists should avoid speculation—that is, imagining what might be possible given the right technology—but that it is important that scientists should be clear about what is involved in achieving those scenarios that they envision.
It may be objected that Weckert and Moor are concerned with the precautionary principle, which is often said to be an alternative to cost-benefit analysis, and as such the observation that their view is at odds with cost-benefit analysis can hardly be deemed an objection. However, it is worth noting that, although advocates of the precautionary principle favour it over cost-benefit analysis precisely because it is more risk-averse, the version of the precautionary principle that Weckert and Moor favour—which considers only ‘credible’ threats—fails to consider some risks that even cost-benefit analysis would recognise. If the costs of a particular outcome are sufficiently severe, cost-benefit analysis advocates preparing for that outcome even if it is highly unlikely.
References
Parfit D (1984) Reasons and persons. 2nd edn. Oxford University Press, Oxford
Dennett DC (1978) Brainstorms: philosophical essays on mind and psychology. Bradford Books, New York
Wittgenstein L (1967) Zettel. Anscombe G, von Wright G (eds), Anscombe G (trans). Blackwell, London
Urquhart A (2004) Complexity. In: Floridi L (ed) The Blackwell guide to philosophy of computing and information. Blackwell, Malden, pp 18–27
Nordmann A (2007) If and then: a critique of speculative nanoethics. NanoEthics 1:31–46. doi:10.1007/s11569-007-0007-6
O’Neill O (1986) The power of example. Philosophy 61(235):5–29
Manson N (2002) Formulating the precautionary principle. Environ Ethics 24:263–274
Keiper A (2007) Nanoethics as a discipline? New Atlantis (Spring):55–67
Levi AW (1943) Temperament and moral theory. Ethics 53(2):128–132. doi:10.1086/290336
Sunstein CR (2004) Your money or your life. The New Republic, 11 March, 2004. Retrieved 8 August, 2008, from http://www.powells.com/review/2004_03_11
Joy B (2000) Why the future doesn’t need us. Wired 8(04):238–262
Brown JS, Duguid P (2001) Don’t count society out: a reply to Bill Joy. In: Denning PJ (ed) The invisible future. McGraw-Hill, New York, NY, pp 117–144. Retrieved 13 August, 2008, from http://people.ischool.berkeley.edu/~duguid/SLOFI/Don’t_Count.htm
Slovic P, Finucane M, Peters E, MacGregor DG (2002) The affect heuristic. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases: the psychology of intuitive judgement. Cambridge University Press, New York, NY, pp 397–420
Starr C (1969) Social benefit versus technological risk. Science 165:1232–1238. doi:10.1126/science.165.3899.1232
Crow MM, Sarewitz D (2001) Nanotechnology and societal transformation. In: Teich AH, Nelson SD, McEnaney C, Lita SJ (eds) AAAS Science and Technology policy yearbook. American Association for the Advancement of Science, Washington, D.C, pp 89–101
Weckert J, Moor J (2007) The precautionary principle in nanotechnology. In: Allhoff F, Lin P, Moor J, Weckert J (eds) Nanoethics: the ethical and social implications of nanotechnology. Wiley-Interscience, Hoboken, NJ, pp 133–146
Shaw GB (1921) Back to Methuselah. Penguin, London
Tversky A, Kahneman D (1973) Availability: a heuristic for judging frequency and probability. Cognit Psychol 5:207–232. doi:10.1016/0010-0285(73)90033-9
Boyle A (2008) Doomsday fears spark lawsuit. Cosmic Log, 27 March, 2008. Retrieved 2 July, 2008, from http://cosmiclog.msnbc.msn.com/archive/2008/03/27/823924.aspx
Bostrom N (2006) Welcome to a world of exponential change. In: Miller P, Wilsdon J (eds) Better humans? The politics of human enhancement and life extension. DEMOS, London, pp 40–50
Bostrom N (2007) Technological revolutions and the problem of prediction. In: Allhoff F, Lin P, Moor J, Weckert J (eds) Nanoethics: the ethical and social implications of nanotechnology. Wiley-Interscience, Hoboken, NJ, pp 101–118
Department of Health Antibiotic Campaign 2007–2008. Retrieved 24 July, 2008, from http://www.dh.gov.uk/en/Publichealth/Patientsafety/Antibioticresistance/DH_082512
Roache, R. Ethics, Speculation, and Values. Nanoethics 2, 317–327 (2008). https://doi.org/10.1007/s11569-008-0050-y