Abstract
Vice essentialism is the view that epistemic vices have robustly negative effects on our epistemic projects. Essentialists believe that the manifestation of epistemic vices can explain many of our epistemic failures, but few, if any, of our epistemic successes. The purpose of this paper is to argue that vice essentialism is false. In §1, I review the case that some epistemic vices, such as closed-mindedness and extreme epistemic deference, have considerable beneficial effects when manifested in collectivist contexts. In §2, I add that there are putative epistemic vices whose repeated manifestation leads to significant epistemic achievements over time. Epistemic recklessness is one such unstable vice. Though Sosa argues that epistemically reckless judgements cannot constitute knowing full well, the repeated manifestation of epistemic recklessness is essential to our coming to know full well in the long run. Without making incompetent judgements in environments that offer unambiguous, actionable feedback, we could not develop the intuitive and reflective competences and meta-competences required for knowing full well.
Notes
Morton uses the term ‘paradoxical virtues’ to refer to putative epistemic vices that serve this transcendental function (2012, 59–60). I use the term ‘unstable vices’ instead, partly because there are many contexts in which their manifestations are epistemically harmful and criticisable. In addition, I am not interested in the purely terminological question of what should count as epistemic virtues and vices. Rather, my aim is to show that cognitive dispositions can be epistemically harmful and criticisable without being “…bad for us in just about any context in which we are likely to find ourselves”. Indeed, there are ordinary circumstances in which they can have enormously beneficial effects. This is what makes them unstable.
For an interactionist treatment of the epistemic vices responsible for cognitive biases, see (Bland 2020) and (Bland forthcoming).
Summativism is the opposing view that collective epistemic behaviour is reducible to the behaviour of individuals. Collectivists attack this position with divergence arguments, i.e., arguments that seek to show that individual and collective behaviour can systematically diverge. Lahroodi (2019) surveys a number of these arguments. Sections 1.1 and 1.2 present two further divergence arguments.
For a similar discussion, including a reference to Bernard Mandeville, see (Morton 2014).
See, for example, Kuhn 1977; Kitcher 1990; Popper 2002 [1996]; Strevens 2003; and Zollman 2010.
Battaly distinguishes dogmatism from closed-mindedness more generally, which she characterizes as “…an unwillingness or inability to engage (seriously) with relevant intellectual options” (Battaly 2018, 262). In this section, I focus on dogmatic closed-mindedness specifically.
Mercier and Sperber use ‘confirmation bias’ and ‘myside bias’ interchangeably, though they recognize that the terms denote distinct phenomena (Mercier & Sperber 2017, 218).
More specifically, for an account of the positive role that myside bias played in the development and defence of the theory of continental drift, see (Solomon 1992).
On the advantages of deference with respect to these problems, see also (Goldstein et al. 2001).
Of course, there may have been practical reasons why this epistemic risk was worth taking, such as: the decision-making time-frame, the acquisition costs of intelligence from other sources, etc.
The well-known estimate of 10,000 hours comes from Ericsson et al. (1993). While this figure is controversial, researchers agree that mastering a skill, and even attaining a basic level of competence, requires sustained deliberate practice.
This is true only in environments that provide unambiguous feedback about the accuracy of our judgements. In other environments, epistemic loss aversion compels us to interpret our judgements as being accurate, even when they’re not, which gives rise to the illusion of skill.
Note that because resolution is a component of the Brier score, participants are encouraged to take epistemic risks.
By contrast, Gigerenzer identifies medicine as a negative error culture where costly errors are denied and covered up as a result of hierarchical power structures and perverse incentives (Gigerenzer 2014, 54). This largely accounts for the fact that preventable medical errors are one of the leading causes of death in the United States (Makary & Daniel 2016).
I am grateful to Mark Alfano, Jon Marsh, and three anonymous referees for their helpful comments on earlier drafts of this paper.
Bibliography
Barnett, A. (2020). Aviation safety: A whole new world? Transportation Science, 54(1), 84–96
Battaly, H. (2020). Closed-mindedness as an intellectual vice. In C. Kelp, & J. Greco (Eds.), Virtue-Theoretic epistemology: New methods and approaches (pp. 15–41). Cambridge: Cambridge University Press
Battaly, H. (2018). Closed-mindedness and dogmatism. Episteme, 15(3), 261–282
Bland, S. (forthcoming). Interactionism, debiasing, and the division of epistemic labor. In M. Alfano, J. de Ridder, & C. Klein (Eds.), Routledge companion to social virtue epistemology
Bland, S. (2020). An interactionist approach to cognitive debiasing. Episteme. DOI: https://doi.org/10.1017/epi.2020.9
Cassam, Q. (2019). Vices of the mind: From the intellectual to the political. Oxford: Oxford University Press
Dacey, A. (2020). Come now, let us reason together: Cognitive bias, individualism, and interactionism in critical thinking education. Informal Logic, 40(1), 47–76
Dawes, R., Faust, D., & Meehl, P. (2002). Clinical versus actuarial judgment. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 716–729). Cambridge: Cambridge University Press
Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363–406
Fantl, J. (2018). The limitations of the open mind. Oxford: Oxford University Press
Fantl, J. (2013). A defense of dogmatism. In T. Gendler, & J. Hawthorne (Eds.), Oxford studies in epistemology, Vol. 4 (pp. 34–56). Oxford: Oxford University Press
Gigerenzer, G. (2014). Risk savvy: How to make good decisions. New York: Penguin
Gigerenzer, G. (2008). Rationality for mortals: How people cope with uncertainty. Oxford: Oxford University Press
Gigerenzer, G. (2000). Adaptive thinking: Rationality in the real world. Oxford: Oxford University Press
Goldstein, D. G., et al. (2001). Group report: Why and when do simple heuristics work?. In G. Gigerenzer, & R. Selten (Eds.), Bounded rationality: The adaptive toolbox (pp. 173–190). Cambridge, MA: MIT Press
Henrich, J. (2016). The secret of our success: How culture is driving human evolution, domesticating our species, and making us smarter. Princeton: Princeton University Press
Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515–526
Kahneman, D., Sibony, O., & Sunstein, C. (2021). Noise: A flaw in human judgment. New York: Little, Brown Spark
Kant, I. (1991). An answer to the question: ‘What is enlightenment?’ (H. B. Nisbet, Trans.). New York: Penguin Books
Kitcher, P. (1990). The division of cognitive labor. Journal of Philosophy, 87(1), 5–21
Kripke, S. (2011). Philosophical troubles: Collected papers, vol. 1. Oxford: Oxford University Press
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134
Kuhn, T. (1977). Collective belief and scientific change. The essential tension (pp. 320–339). Chicago: University of Chicago Press
Lahroodi, R. (2019). Virtue epistemology and collective epistemology. In H. Battaly (Ed.), The Routledge handbook of virtue epistemology (pp. 407–419). New York: Routledge
Laughlin, P. R., & Ellis, A. L. (1986). Demonstrability and social combination processes on mathematical intellective tasks. Journal of Experimental Social Psychology, 22(3), 177–189
Levy, N., & Alfano, M. (2020). Knowledge from vice: Deeply social epistemology. Mind, 129(515), 887–915
Longino, H. (1990). Science as social knowledge: Values and objectivity in scientific inquiry. Princeton: Princeton University Press
Lucas, E. J., & Ball, L. J. (2005). Think-aloud protocols and the selection task: Evidence for relevance effects and rationalisation processes. Thinking & Reasoning, 11(1), 35–66
Makary, M. A., & Daniel, M. (2016). Medical error – The third leading cause of death in the US. BMJ, 353, https://doi.org/10.1136/bmj.i2139
Meehl, P. (1954). Clinical versus statistical predictions: A theoretical analysis and a review of the evidence. Minneapolis: University of Minnesota Press
Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34(2), 57–74
Mercier, H., & Sperber, D. (2017). The enigma of reason. Cambridge, MA: Harvard University Press
Moore, D. A., & Healy, P. J. (2008). The trouble with overconfidence. Psychological Review, 115(2), 502–517
Morton, A. (2012). Bounded thinking: Intellectual virtues for limited agents. Oxford: Oxford University Press
Morton, A. (2014). Shared knowledge from individual vice: The role of unworthy epistemic emotions. Philosophical Inquiries, 2(1), 163–172
Moshman, D., & Geil, M. (1998). Collaborative reasoning: Evidence for collective rationality. Thinking and Reasoning, 4(3), 231–248
Nagell, K., Olguin, R. S., & Tomasello, M. (1993). Processes of social learning in the tool use of chimpanzees (Pan troglodytes) and human children (Homo sapiens). Journal of Comparative Psychology, 107, 174–186
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220
Popper, K. (2002 [1996]). The open society and its enemies. Fifth Edition. New York: Routledge
Smart, P. R. (2018). Mandevillian intelligence. Synthese, 195, 4169–4200
Solomon, M. (1992). Scientific rationality and human reasoning. Philosophy of Science, 59(3), 439–455
Sosa, E. (2014). Knowledge and time: Kripke’s dogmatism paradox and the ethics of belief. In J. Matheson, & R. Vitz (Eds.), The ethics of belief (pp. 77–88). Oxford: Oxford University Press
Sosa, E. (2015). Judgement and agency. Oxford: Oxford University Press
Sosa, E. (2019). Telic virtue epistemology. In H. Battaly (Ed.), The Routledge handbook of virtue epistemology (pp. 15–25). New York: Routledge
Stichter, M. (2018). The skillfulness of virtue: Improving our moral and epistemic lives. Cambridge: Cambridge University Press
Strevens, M. (2003). The role of the priority rule in science. Journal of Philosophy, 100(2), 55–79
Tetlock, P. (2005). Expert political judgment. Princeton: Princeton University Press
Tetlock, P., & Gardner, D. (2015). Superforecasting: The art and science of prediction. Toronto: Signal
Wason, P. C. (1966). Reasoning. In B. M. Foss (Ed.), New horizons in psychology (pp. 106–137). New York: Penguin
Zollman, K. (2010). The epistemic benefits of transient diversity. Erkenntnis, 72, 17–35
Cite this article
Bland, S. In defence of epistemic vices. Synthese 200, 59 (2022). https://doi.org/10.1007/s11229-022-03572-7