Abstract
The moral terrain of science, the full range of ethical considerations that are part of the scientific endeavor, has not been mapped. Without such a map, we cannot examine the responsibilities of scientists to see if the institutions of science are adequately constructed. This paper attempts such a map by describing four dimensions of the terrain: (1) the bases to which scientists are responsible (scientific reasoning, the scientific community, and the broader society); (2) the nature of the responsibility (general or role); (3) the level of responsibility (minimum demand or ideal); and (4) who bears the responsibility (the individual or the community). Such a map will be used to elucidate the recent debate over the publication of studies concerning H5N1 flu virus.
Notes
White lies about evidence are unacceptable, but whether deception in other aspects of scientific work is acceptable remains debated. For example, deception may be essential to some studies in the social sciences, and whether deceiving subjects in these cases is permissible is contested. See Baumrind (1985), Wendler (1996), Wendler and Miller (2004). Deceiving subjects in other kinds of studies is generally not permissible. In randomized controlled trials, for example, informed consent requires that we tell patients they are being randomly assigned to different study arms, which may include a placebo; efforts to make the placebo indistinguishable from the treatment are not acts of deception in this context. Whether using deception to study the placebo effect itself is acceptable is a further issue. See Miller et al. (2005). That these debates exist does not undercut the main point: role responsibilities for scientists add additional honesty burdens.
Whether scientists have more exacting role responsibilities to the broader society, such as a responsibility that their research benefit society, seems to be in flux, as evidenced by evolving codes of conduct; what those codes require varies by subdiscipline. See, e.g., Kourany (2010, pp. 111–112).
Because a license to do science would amount to a license to conduct empirical inquiry, and because it is important for the robustness of inquiry that such inquiry remain open to all comers, creating licensing barriers to doing science seems a bad idea. Nor is it clear that a requirement that only licensed scientists conduct empirical inquiry could be enforced: what would constitute proper boundaries around inquiry activities? Scientists might be licensed to work with particular materials or in particular contexts, but not to do science as such.
I will not address here whether the research should have been conducted, whether it should continue, or whether biosecurity containment at the labs where the research was done, and will continue, is adequate. These are clearly weighty issues as well. The argument that organisms as dangerous as the new strains of H5N1 require at least Biosafety Level 4 (BSL-4) containment is particularly important. See, e.g., Murillo (2012).
Some also suggested it would be unprecedented, but that is not true. See Kaiser and Moreno (2012).
It is important to note that the NSABB could not order the journals (Science and Nature both had accepted manuscripts at issue) to withhold or redact publications. But the journals agreed that if the NSABB did recommend limited publication, they would abide by the NSABB’s decision.
References
Baumrind, D. (1985). Research using intentional deception: Ethical issues revisited. American Psychologist, 40(2), 165.
Brown, M. (2012). The source and status of values in Kourany’s socially responsible science. Philosophical Studies, 163(1), 67–76.
Brumfiel, G. (2012). Good science/bad science. Nature, 484(7395), 432–434.
Cohen, J. (2012). The limits of avian flu studies in ferrets. Science, 335(6068), 512–513.
Douglas, H. (2003). The moral responsibilities of scientists (tensions between autonomy and responsibility). American Philosophical Quarterly, 59–68.
Douglas, H. (2009). Reintroducing prediction to explanation. Philosophy of Science, 76, 444–463.
Douglas, H. (2010). Engagement for progress: applied philosophy of science in context. Synthese, 177(3), 317–335.
Elliott, K. (2011). Is a little pollution good for you? incorporating societal values in environmental research. Oxford: Oxford University Press.
Enserink, M. (2011). Controversial studies give a deadly flu virus wings. Science, 334(6060), 1192–1193.
Fauci, A. S., & Collins, F. S. (2012). Benefits and risks of influenza research: Lessons learned. Science, 336(6088), 1522–1523.
Fehr, C. (2011). What is in it for me? The benefits of diversity in scientific communities. In H. E. Grasswick (Ed.), Feminist epistemology and philosophy of science (pp. 133–155). Berlin: Springer.
Fehr, C., & Plaisance, K. S. (2010). Socially relevant philosophy of science: An introduction. Synthese, 177(3), 301–316.
Grady, D. (2012). Panel says flu research is safe to publish. New York Times, March 30, 2012.
Greenfieldboyce, N. (2012). Dutch government set to reconsider restrictions on publishing bird flu study. NPR Shots Blog. http://www.npr.org/blogs/health/2012/04/20/151049741/dutch-government-set-to-reconsider-its-ban-on-publishing-bird-flu-study.
Hansen, L. A. (2013). Institution animal care and use committees need greater ethical diversity. Journal of Medical Ethics, 39(3), 188–190.
Hardimon, M. (1994). Role obligations. The Journal of Philosophy, XCI(7), 333–363.
Holbrook, J. B. (2005). Assessing the science–society relation: the case of the US National Science Foundation’s second merit review criterion. Technology in Society, 27(4), 437–451.
Holbrook, J. B. (Ed.) (2009). Special issue on the U.S. national science foundation’s broader impacts criterion. Social Epistemology, (Vol. 23, pp. 3–4).
Horner, J., & Minifie, F. D. (2011). Research ethics I: Responsible conduct of research (RCR)—Historical and contemporary issues pertaining to human and animal experimentation. Journal of Speech, Language, and Hearing Research, 54, S303–S329.
Intemann, K. (2009). Why diversity matters: Understanding and applying the diversity component of the National Science Foundation’s broader impacts criterion. Social Epistemology, 23(3–4), 249–266.
Irzik, G. (2010). Why should philosophers of science pay attention to the commercialization of academic science? EPSA Epistemology and Methodology of Science, 129–138.
Kaiser, D., & Moreno, J. (2012). Dual-use research: Self-censorship is not enough. Nature, 492(7429), 345–347.
Kitcher, P. (2004). Responsible biology. BioScience, 54(4), 331–336.
Kourany, J. A. (2010). Philosophy of science after feminism. Oxford: Oxford University Press.
Lakoff, S. A. (1980). Moral responsibility and the “Galilean imperative”. Ethics, 100–116.
Longino, H. E. (1990). Science as social knowledge: Values and objectivity in scientific inquiry. Princeton: Princeton University Press.
Longino, H. E. (2002). The fate of knowledge. Princeton: Princeton University Press.
Macrina, F. L. (2000). Scientific integrity: An introductory text with cases. American Society for Microbiology.
McKay, C. (1995). The Evolution of the institutional review board: A brief overview of its history. Clinical Research and Regulatory Affairs, 12(2), 65–94.
Miller, F. G., Wendler, D., & Swartzman, L. C. (2005). Deception in research on the placebo effect. PLoS Medicine, 2(9), e262.
Murillo, L. N. (2012). Ferret-transmissible influenza A (H5N1) virus: Let us err on the side of caution. mBio, 3(2).
Oreskes, N., & Conway, E. M. (2010). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. London: Bloomsbury Press.
Pimple, K. D. (2002). Six domains of research ethics. Science and Engineering Ethics, 8(2), 191–205.
Radder, H. (2010). The commodification of academic research. Pittsburgh: University of Pittsburgh Press.
Resnik, D. B. (1998). The ethics of science: an introduction. London: Routledge.
Schienke, E. W., Tuana, N., Brown, D. A., Davis, K. J., Keller, K., Shortle, J. S., et al. (2009). The role of the national science foundation broader impacts criterion in enhancing research ethics pedagogy. Social Epistemology, 23(3–4), 317–336.
Shrader-Frechette, K. S. (1994). Ethics of scientific research. Lanham: Rowman & Littlefield Pub Incorporated.
Solomon, M. (2001). Social empiricism. Cambridge, MA: MIT Press.
Solomon, M. (2006). Norms of epistemic diversity. Episteme, 3(1–2), 23–36.
Solomon, M. (2012). Socially responsible science and the unity of values. Perspectives on Science, 20(3), 331–338.
Wendler, D. (1996). Deception in medical and behavioral research: Is it ever acceptable? The Milbank Quarterly, 87–114.
Wendler, D., & Miller, F. G. (2004). Deception in the pursuit of science. Archives of Internal Medicine, 164(6), 597.
Wolinetz, C. D. (2012). Implementing the New US dual-use policy. Science, 336(6088), 1525–1527.
Acknowledgments
This paper evolved through a series of talks I gave at the Center for Interdisciplinary Research (ZiF) at the Universität Bielefeld (at a conference on "The Social Relevance of Philosophy of Science" organized by Martin Carrier and Don Howard), the Department of Philosophy at the University of Cincinnati (at their 49th Annual Philosophy Colloquium on socially engaged philosophy of science organized by Angela Potochnik), the Department of Philosophy at the University of Guelph (organized by Maya Goldenberg), and the Department of Philosophy at the University of Alberta (organized by Ingo Brigandt). My thanks to all those who organized these events and to the audiences who provided such helpful feedback. This paper also benefited from the comments of Doreen Fraser, Marc Lipsitch, Ted Richards, two anonymous referees, and the feedback from my students of PHIL 271 Science in Society, Winter 2013, at the University of Waterloo.
Douglas, H. The Moral Terrain of Science. Erkenn 79 (Suppl 5), 961–979 (2014). https://doi.org/10.1007/s10670-013-9538-0