Inductive risk and the contexts of communication

Abstract

In recent years, the argument from inductive risk against value free science has enjoyed a revival. This paper investigates and clarifies this argument by means of a case study: neonicotinoid research. Sect. 1 argues that the argument from inductive risk is best conceptualised as a claim about scientists’ communicative obligations. Sect. 2 then shows why this argument is inapplicable to “public communication”. Sect. 3 outlines non-epistemic reasons why non-epistemic values should not play a role in public communicative contexts. Sect. 4 analyses the implications of these arguments both for the specific case of neonicotinoid research and for understanding the limits of the argument from inductive risk. Sect. 5 sketches the broader implications of my claims for understanding the “Value Free Ideal” for science.

Notes

  1. See John (2011, p. 502) for a slightly different account of “epistemic standards”, which this account builds on.

  2. Elliott (2011a) raises a similar concern, although the specific formulation below draws on unpublished work by Anthony Woodman. (See Steel and Whyte 2012 and Elliott and McKaughan 2014 for other concerns about Douglas’s work.)

  3. This reworked communicative obligation might be justified by a more general account of moral responsibility (as Douglas justifies her original proposal) or in some other way—such as by appeal to Grice’s “cooperative principle” to “make your contribution such as is required, at the stage at which it occurs, by the accepted purpose or direction of the talk exchange in which you are engaged” (Grice 1975, p. 45). In this paper, I will not discuss the broader issue of how to justify communicative obligations more generally.

  4. See Ziliak and McCloskey (2007) for an extremely thorough discussion of how significance tests are routinised.

  5. See Howard-Snyder (1997) for a useful overview of the history and content of this principle.

  6. Interestingly, Douglas herself suggests that Kevin Elliott’s ethics of expertise, according to which experts are obliged to communicate that information which allows others to make informed choices, is problematic because it is unclear who experts’ audiences are (Douglas 2012).

  7. I am grateful to Rune Nyrup for this point.

  8. The final section of John (forthcoming) develops these points in greater detail.

  9. These remarks relate to Edward Craig’s claim (1999) that the social role of the concept of “knowledge” is to identify “reliable informants”. I suggest that the institutions of scientific research ensure that scientists are super-“reliable informants”: whatever a hearer’s practical interests, she has reason to defer to what they say.

  10. Note here the interesting relationship to the “precautionary principle” in environmental and public health policy-making, which some authors (e.g. Sunstein 2005) read as a reminder to policy-makers that a threat may be sufficiently well-warranted to justify action even if it is not sufficiently well-warranted to count as “scientifically certain”. The proposals above suggest that, as well as reminding policy-makers to beware of scientific reticence, perhaps scientists should sometimes be less reticent. See John (2010) for further comments on how the problem of inductive risk relates to interpreting the precautionary principle.

  11. Furthermore, the proposed distinction between different forms of communication is preferable to Elliott’s similarly pluralistic suggestion that the propriety of scientists’ appeal to values depends on the particular “goals” prioritized in their context (see, for example, Elliott 2013, p. 381; Elliott and McKaughan 2014). Elliott’s approach might seem to justify not appealing to non-epistemic values in, for example, journal articles if the “goals” of that activity are promoting truth, rather than aiding regulation. However, it is unclear why the fact that a scientist has a particular epistemic goal should grant her exemption from other moral considerations. What my argument does, then, is to “fill in” a non-epistemic justification for pursuing what might seem to be epistemic goals.

  12. Note then that there may be an interesting analogy here between scientific and legal contexts. In a recent paper, Enoch et al. (2012) have argued that courts’ refusal to use statistical evidence might be understood in terms of the epistemic good of “sensitivity”. However, as they also note, that we can redescribe courts’ practices in this way leaves open a further justificatory question: why should courts care about this epistemic good, given that the exclusion of statistical evidence often seems to conflict with important aims of the legal system? They suggest, then, that “policy” considerations must be used to justify this practice. I suggest that a similar dual-level structure applies in the case of science.

  13. I am very grateful to the following people for discussion of the ideas raised in this paper: Anna Alexandrova, Shahar Avin, Marion Boulicault, Hasok Chang, Charlotte Goodburn, Tim Lewens, Emily McTernan, Onora O’Neill and Anthony Woodman. A previous version of this paper was presented at the Departmental Seminar, Department of History and Philosophy of Science, University of Cambridge, April 2013, and I benefitted from the discussion there. The comments by two anonymous referees for Synthese were unusually constructive and helpful. I am most grateful, however, to the several cohorts of undergraduate students who patiently sat through my lectures on the topic of inductive risk where I tried to articulate the concerns above.

References

  1. Betz, G. (2013). In defence of the value free ideal. European Journal for Philosophy of Science, 3, 207–220.

  2. Biddle, J., & Winsberg, E. (2010). Value judgements and the estimation of uncertainty in climate modelling. In J. Busch & P. D. Magnus (Eds.), New waves in philosophy of science (pp. 127–197). London: Palgrave Macmillan.

  3. Carrington, D. (2013). Bee-harming pesticides banned in Europe. The Guardian, 29 April 2013. http://www.theguardian.com/environment/2013/apr/29/bee-harming-pesticides-banned-europe (Accessed 15 March 2014).

  4. Craig, E. (1999). Knowledge and the state of nature. Oxford: Clarendon Press.

  5. Desneux, N., Decourtye, A., & Delpuech, J. M. (2007). The sublethal effects of pesticides on beneficial arthropods. Annual Review of Entomology, 52, 81–106.

  6. DEFRA. (2013). An assessment of key evidence about neonicotinoids and bees. London: Department for Environment, Food and Rural Affairs.

  7. Douglas, H. E. (2009). Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.

  8. Douglas, H. E. (2012). Book review of Kevin Elliott, Is a little pollution good for you? Philosophy of Science, 79(3), 425–428.

  9. EFSA Panel on Plant Protection Products and their Residues (PPR). (2012). Scientific opinion of the panel on plant protection products and their residues on a request from the European Commission on the science behind the development of a risk assessment of plant protection products on bees (Apis mellifera, Bombus spp. and solitary bees). EFSA Journal, 10(5), 2668.

  10. EFSA. (2013a). Press release: EFSA identifies risks to bees from neonicotinoids. Available at www.efsa.europa.eu/en/press/news/130116.htm.

  11. EFSA. (2013b). Guidance on the risk assessment of plant protection products on bees (Apis mellifera, Bombus spp. and solitary bees). EFSA Journal, 11(7), 3295.

  12. Elliott, K. (2011). Direct and indirect roles for values in science. Philosophy of Science, 78(2), 303–324.

  13. Elliott, K. (2013). Douglas on values: From indirect roles to multiple goals. Studies in History and Philosophy of Science, 44, 375–383.

  14. Elliott, K., & McKaughan, D. (2014). Nonepistemic values and the multiple goals of science. Philosophy of Science, 81(1), 1–21.

  15. Enoch, D., Fisher, T., & Spectre, L. (2012). Statistical evidence, sensitivity and the legal value of knowledge. Philosophy and Public Affairs, 40(3), 197–224.

  16. Fantl, J., & McGrath, M. (2010). Knowledge in an uncertain world. Oxford: Oxford University Press.

  17. Gaa, J. (1977). Moral autonomy and the rationality of science. Philosophy of Science, 44(4), 513–541.

  18. Gerken, M. (2012). On the cognitive basis of knowledge ascriptions. In M. Gerken & J. Brown (Eds.), Knowledge-ascriptions. Oxford: Oxford University Press.

  19. Grice, P. (1975). Logic and conversation. In P. Cole & J. Morgan (Eds.), Syntax and semantics 3: Speech acts (pp. 41–58). New York: Academic Press.

  20. Henderson, D. (2011). Gate-keeping contextualism. Episteme, 8(1), 83–96.

  21. Henry, M., Béguin, M., Requier, F., Rollin, O., Odoux, J.-F., Aupinel, P., et al. (2012). A common pesticide decreases foraging success and survival in honey bees. Science, 336(6079), 348–350.

  22. Hempel, C. G. (1965). Science and human values. In his Aspects of scientific explanation (pp. 81–96). New York: Free Press.

  23. Howard-Snyder, F. (1997). The rejection of objective consequentialism. Utilitas, 9(2), 241–248.

  24. Jeffrey, R. (1956). Valuation and acceptance of scientific hypotheses. Philosophy of Science, 23(3), 237–246.

  25. John, S. (2010). In defence of bad science and irrational policies: An alternative account of the precautionary principle. Ethical Theory and Moral Practice, 13(1), 3–18.

  26. John, S. (2011). Expert testimony and epistemological free-riding. The Philosophical Quarterly, 61(244), 496–517.

  27. John, S. (forthcoming). The example of the IPCC does not vindicate the value free ideal: A reply to Gregor Betz. European Journal for Philosophy of Science.

  28. Kant, I. (1970). What is enlightenment? In H. Reiss (Ed.), Kant’s political writings. Cambridge: Cambridge University Press.

  29. Kitcher, P. (2011). Science in a democratic society. New York: Prometheus Books.

  30. Kukla, R. (2012). Author TBD: Radical collaboration in contemporary biomedical research. Philosophy of Science, 79(5), 845–858.

  31. Levi, I. (1960). Must the scientist make value judgments? The Journal of Philosophy, 57(11), 345–357.

  32. Lipton, P. (2004). Inference to the best explanation. London: Routledge.

  33. Nickel, P. J. (2011). Testimonial entitlement, norms of assertion and privacy. Episteme, 10(2), 207–217.

  34. O’Neill, O. (1986). The public use of reason. Political Theory, 14(4), 523–551.

  35. Rawls, J. (1993). Political liberalism. New York: Columbia University Press.

  36. Rudner, R. (1953). The scientist qua scientist makes value judgments. Philosophy of Science, 20(1), 1–6.

  37. Saul, J. (2013). Lying, misleading, and the role of what is said. Oxford: Oxford University Press.

  38. Steel, D. (2010). Epistemic values and the argument from inductive risk. Philosophy of Science, 77(1), 14–34.

  39. Steel, D., & Whyte, K. P. (2012). Environmental justice, values, and scientific expertise. Kennedy Institute of Ethics Journal, 22, 163–182.

  40. Steele, K. (2012). The scientist qua policy advisor makes value judgments. Philosophy of Science, 79(5), 893–904.

  41. Stokstad, E. (2012). Field research on bees raises concern about low dose pesticides. Science, 335(6078), 1555.

  42. Sunstein, C. R. (2005). Laws of fear: Beyond the precautionary principle. Cambridge: Cambridge University Press.

  43. Trouwborst, A. (2002). Evolution and status of the precautionary principle in international law. The Hague: Kluwer Law International.

  44. Whitehorn, P., O’Connor, S., Wackers, F., & Goulson, D. (2012). Neonicotinoid pesticide reduces bumble bee colony growth and queen production. Science, 336(6079), 351–352.

  45. Wilholt, T. (2013). Epistemic trust in science. The British Journal for the Philosophy of Science, 64(2), 233–253.

  46. Ziliak, S., & McCloskey, D. (2007). The cult of statistical significance. Ann Arbor: University of Michigan Press.

Author information

Correspondence to Stephen John.


Cite this article

John, S. Inductive risk and the contexts of communication. Synthese 192, 79–96 (2015). https://doi.org/10.1007/s11229-014-0554-7

Keywords

  • Inductive risk
  • Values in science
  • Social epistemology
  • Neonicotinoid research
  • Public/private distinction
  • Communicative obligations