Abstract
Scientific expert testimony is crucial to public deliberation, but it is associated with many pitfalls. This article identifies one—namely, expert trespassing testimony—which may be characterized, crudely, as the phenomenon of experts testifying outside their domain of expertise. My agenda is to provide a more precise characterization of this phenomenon and consider its ramifications for the role of science in society. I argue that expert trespassing testimony is both epistemically problematic and morally problematic. Specifically, I will argue that scientific experts are subject to a particular obligation. Roughly, this is the obligation to qualify their assertions when speaking outside their domain of scientific expertise in certain contexts. Thus, I argue that scientists who possess expert knowledge are confronted with hard questions about when and how to testify and, therefore, that being a scientific expert comes with great responsibility. Consequently, I provide a concrete “expert guideline” according to which scientific experts, in certain contexts, face an obligation to qualify their assertions when speaking outside their domain of expertise. Furthermore, I consider a number of the conditions in which the guideline is waived or overridden. On this basis, I consider the broader aspects of the roles of scientific experts in a society with a high division of cognitive labor that calls for trust in scientific expert testimony.
Notes
My articulation of the expert guideline seeks to avoid some difficulties for Hardwig’s maxim. Moreover, I will provide a more explicit and specific motivation for it. The differences are partly due to the fact that Hardwig’s pioneering work was called to my attention after I had developed the key ideas. Once my proposed guideline is articulated, I will compare it to Hardwig’s maxim in Sect. 4. In addition, I will draw on Hardwig’s division of responsibilities in Sect. 6.
Ballantyne’s paper was (thanks to Nikolaj J.L.L. Pedersen) called to my attention a few weeks before the present paper was accepted. For almost a decade, I used the phrase ‘domain-strain testimony’ (coined by Julie Brummer) to denote the phenomenon under discussion. But since philosophy suffers from far too many terminological overlaps (and associated Frege problems), I opted to adopt Ballantyne’s phrase ‘epistemic trespassing.’ I cannot include a discussion of putative differences between Ballantyne’s characterization and mine that go beyond terminology and focus. But it strikes me as the right choice to align the terminology and subsequently distinguish between species of trespassing if such differences emerge. I am grateful to Ballantyne for 11th hour correspondence and to the editors for permitting the terminological change after the paper was accepted.
A chemist needs a lab. Hence the “in suitable conditions” qualification. The “is likely” qualification, in turn, is required to account for cases in which the experts subscribe to a warranted but false theory and, therefore, form unreliable judgments about D. The characterization builds on Goldman (2001). Thanks to Nikolaj J.L.L. Pedersen.
So, the present characterization concerns a narrower subspecies than, for example, Collins and Evans’ approach (Collins and Evans 2007; Collins et al. 2016). Due to the domain-relativity, it aligns fairly well with what they would characterize as esoteric expertise. But there are also substantive differences, as I regard their approach as overly inclusive in some regards. For further perspectives, see Selinger and Crease (2006), Kusch (2007), and Whyte and Crease (2010).
Thanks to Alvin Goldman and Nikolaj J.L.L. Pedersen for criticisms of a previous attempt to capture expert trespassing testimony in an audience-relative manner. The ensuing case is mine but it does little more than illustrate their general critique.
The qualification in (ii) according to which D2 need only be a purported domain of expertise is due to a case pressed on me by Johan Gersel. The case involves an expert in astrophysics who asserts propositions pertaining to a pre-Big Bang event. Perhaps such propositions are epistemically inaccessible, and hence do not fall within any domain of expertise. But even if this is so, the professor’s speech act should still be characterized as an instance of expert trespassing testimony.
In contrast, one need not acquire any special cognitive competencies to become a parent. (In fact, one may be more likely to become a parent if one is ignorant about certain things).
I say ‘largely’ due to the noted fact that whether S is an expert in the first place depends on context.
I will not characterize epistemic peerhood or consider the epistemology of disagreement here. However, these notions, and thereby C1, should be investigated further.
I conjecture that expert trespassing testimony has comparatively many negative effects. In fact, a motivation for writing the paper comes from observing epistemically problematic instances of the phenomenon in public media.
Nitrox is a mixed gas that consists of atmospheric air enriched with oxygen. One should learn how to dive with Nitrox before diving with Nitrox.
This might have been Hardwig’s intention. If so, the present guideline explicates this intention by way of some more fine-grained terminology that furthers an epistemic argument for it. Moreover, The Singapore statement on research integrity (Resnik and Shamoo 2011) also contains a suggestion (10. Public Communication) that is similar to the Expert Trespassing Guideline despite variations in formulation.
Perhaps it may even be permissible to assert a literal falsehood in order to convey an important truth.
Infeasibility may be understood in terms of practical difficulty. If it were taken to imply practical impossibility, it would near-trivially provide an excuse given an appropriate ought-implies-can principle.
Thanks to Klemens Kappel for discussion.
Thanks are due to Nils Holtug for bringing up this issue. A referee highlights that further complications ensue since it is frequently non-transparent whether the consequences of a given testimony are (sufficiently) good.
Thanks to Sune Lægaard and Jesper Ryberg.
It will be worth investigating whether the present suggestion demands such an approach. Thanks to Sune Lægaard.
I believe that this is often the uncomfortable predicament of philosophers. I hope to elaborate elsewhere.
The paper originates from an oral exam at the University of Copenhagen in January 2009 where the external examiner, Lars Binderup, brought up the issue of experts speaking outside their domain. In January 2010, on a long afternoon stuck in an Omani desert, I sketched out the basic cases, arguments and definitions. In the fall of 2010, I got helpful comments from Finn Collin, Kira Vrist Rønn and Julie Zahle at a workshop. The paper was subsequently presented at conferences and workshops in Copenhagen (August 2011, February 2012 and September 2012) and at Roskilde University (May 2013). Helpful written comments were provided, at various stages, by Kristoffer Ahlström-Vij, Catherine Elgin, Nils Holtug, Julie Boldt Mark, Klemens Kappel, Sune Lægaard and Nikolaj Jang Pedersen. I have benefitted from correspondence and discussion with Jessie Baird, Nathan Ballantyne, Johan Gersel, Alvin Goldman, Matthew Liao, Miranda Fricker, and Katherine Hawley. Thanks also to various journal referees, including two very helpful ones from this journal. Special thanks to Julie Brummer.
References
Anderson, E. (2011). Democracy, public policy, and lay assessments of scientific testimony. Episteme, 8(2), 144–164.
Ballantyne, N. (forthcoming). Epistemic trespassing. Mind. https://doi.org/10.1093/mind/fzx042.
Christensen, D. (2007). Epistemology of disagreement: The good news. Philosophical Review, 116(2), 187–217.
Collins, H., & Evans, R. (2007). Rethinking expertise. Chicago: University of Chicago Press.
Collins, H., Evans, R., & Weinel, M. (2016). Expertise revisited II: Contributory expertise. Studies in History and Philosophy of Science, 56, 103–110.
Conee, E., & Feldman, R. (2004). Evidentialism. Oxford: Oxford University Press.
Douglas, H. (2009). Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.
Elgin, C. Z. (2001). Word giving, word taking. In A. Byrne, R. Stalnaker, & R. Wedgwood (Eds.), Fact and value: Essays for Judith Jarvis Thomson (pp. 97–116). Cambridge: MIT Press.
Epstein, S. (1996). Impure science: AIDS, activism, and the politics of knowledge. Berkeley: University of California Press.
Fishkin, J. (2009). When the people speak: Deliberative democracy and public consultation. Oxford: Oxford University Press.
Franco, P. L. (2017). Assertion, non-epistemic values, and scientific practice. Philosophy of Science, 84(1), 160–180.
Fricker, E. (2002). Trusting others in the sciences: A priori or empirical warrant? Studies in History and Philosophy of Science Part A, 33(2), 373–383.
Gerken, M. (2011). Warrant and action. Synthese, 178(3), 529–547.
Gerken, M. (2012). Discursive justification and skepticism. Synthese, 189(2), 373–394.
Gerken, M. (2013a). Epistemic reasoning and the mental. Basingstoke: Palgrave Macmillan.
Gerken, M. (2013b). Internalism and externalism in the epistemology of testimony. Philosophy and Phenomenological Research, 87(3), 532–557.
Gerken, M. (2015). The epistemic norms of intra-scientific testimony. Philosophy of the Social Sciences, 45(6), 568–595.
Gerken, M. (2017). On folk epistemology. How we think and talk about knowledge. Oxford: Oxford University Press.
Gerken, M. (forthcoming). Epistemic entitlement—Its scope and limits. In P. Graham and N. J. L. L. Pedersen (eds.), Epistemic entitlement. Oxford University Press.
Goldman, A. I. (1999). Knowledge in a social world. Oxford: Clarendon Press.
Goldman, A. I. (2001). Experts: Which ones should you trust? Philosophy and Phenomenological Research, 63(1), 85–110.
Hardwig, J. (1985). Epistemic dependence. Journal of Philosophy, 82(7), 335–349.
Hardwig, J. (1994). Toward an ethics of expertise. In D. E. Wueste (Ed.), Professional ethics and social responsibility (pp. 83–101). London: Rowman and Littlefield.
Hart, H. L. A. (1968). Punishment and responsibility. Oxford: Clarendon Press.
Hawley, K. (2012). Trust: A very short introduction. Oxford: Oxford University Press.
Irwin, A. (1995). Citizen science: A study of people, expertise, and sustainable development. London, New York: Routledge.
John, S. (2015). Inductive risk and the contexts of communication. Synthese, 192(1), 79–96.
Keren, A. (2015). Science and informed, counterfactual, democratic consent. Philosophy of Science, 82(5), 1284–1295.
Kitcher, P. (1990). The division of cognitive labor. Journal of Philosophy, 87(1), 5–22.
Kitcher, P. (1993). The advancement of science: Science without legend, objectivity without illusions. Oxford: Oxford University Press.
Kitcher, P. (2011). Science in a democratic society. Amherst: Prometheus Books.
Kusch, M. (2007). Towards a political philosophy of risk: Experts and publics in deliberative democracy. In T. Lewens (Ed.), Risk: Philosophical perspectives. London: Routledge.
Lackey, J. (2011). Assertion and isolated secondhand knowledge. In J. Brown & H. Cappellen (Eds.), Assertion (pp. 251–276). Oxford: Oxford University Press.
Mann, T. (1947/1999). Doctor Faustus (J. E. Woods, Trans.). New York: Vintage International.
Miller, B. (2015). “Trust me—I’m a public intellectual”: Margaret Atwood’s and David Suzuki’s social epistemologies of climate science. In M. Keren & R. Hawkins (Eds.), Speaking power to truth: Digital discourse and the public intellectual (pp. 113–128). Athabasca: Athabasca University Press.
Nielsen, K. H. (2013). Scientific communication and the nature of science. Science & Education, 22, 2067–2086.
Resnik, D. B., & Shamoo, A. E. (2011). The Singapore statement on research integrity. Accountability in Research, 18(2), 71–75.
Rolin, K. (2008). Science as collective knowledge. Cognitive Systems Research, 9(1–2), 115–124.
Selinger, E., & Crease, R. (2006). The philosophy of expertise. New York: Columbia University Press.
Steele, K. (2012). The scientist qua policy advisor makes value judgments. Philosophy of Science, 79(5), 893–904.
Strevens, M. (2003). The role of the priority rule in science. Journal of Philosophy, 100, 55–79.
Thomson, J. J. (1990). The realm of rights. Cambridge, MA: Harvard University Press.
Turner, S. (2001). What is the problem with experts? Social Studies of Science, 31(1), 123–149.
von Goethe, J. W. (1808/2000). Faust: A tragedy (2nd edn) (Ed. Hamlin; Transl. Arndt). New York: W. W. Norton & Company.
Whyte, K. P., & Crease, R. P. (2010). Trust, expertise, and the philosophy of science. Synthese, 177(3), 411–425.
Wilholt, T. (2013). Epistemic trust in science. British Journal for the Philosophy of Science, 64(2), 233–253.
Cite this article
Gerken, M. Expert Trespassing Testimony and the Ethics of Science Communication. J Gen Philos Sci 49, 299–318 (2018). https://doi.org/10.1007/s10838-018-9416-1