
Debunking creedal beliefs

  • Original Research
  • Published in Synthese

Abstract

Following Anthony Downs’s classic economic analysis of democracy, it has been widely noted that most voters lack the incentive to be well-informed. Recent empirical work, however, suggests further that political partisans can display selectively lazy or biased reasoning. Unfortunately, political knowledge seems to exacerbate, rather than mitigate, these tendencies. In this paper, I build on these observations to construct a more general skeptical challenge that affects what I call creedal beliefs. Such beliefs share three features: (i) the costs to the individual of being wrong are negligible, (ii) the beliefs are subject to social scrutiny, and (iii) the evidential landscape relevant to the beliefs is complex enough to preclude easy verification. Some philosophers and social scientists have recently argued that under such conditions, beliefs are likely to play a signaling, as opposed to a navigational, role, and that our ability to hold beliefs in this way is adaptive. If this is right, I argue, there is at least a partial debunker for such beliefs. Moreover, this offers, I suggest, one way to develop the skeptical challenge based on etiological explanation that John Stuart Mill presents in On Liberty when he claims that the same causes which lead someone to be a devout Christian in London would have made them a Confucian in Peking. Finally, I contend that this skeptical challenge is appropriately circumscribed so that it does not over-extend in an implausible way.


Notes

  1. For a description of this case and its epistemic import, see for instance Elga (2013) and Schoenfield (2018). For an extended discussion of how epistemic principles might provide useful guidance and how we might use heuristics such as the one proposed in this paper, see Ballantyne (2019), particularly Chapters 3 and 4.

  2. For more recent discussions of this idea, see for example: Achen and Bartels (2017), Brennan (2016), Caplan (2008), and Somin (2013).

  3. For a recent discussion and defense of the general phenomenon of motivated ignorance, see Williams (2021a). Relatedly, standpoint epistemologists and critical theorists have argued that members of dominant social groups are willfully ignorant of certain features of their position even where such information is easily available. For a classic discussion of this idea, see Mills (2007), and more recently, Kinney and Bright (2021) and Woomer (2017).

  4. For more on this distinction between instrumental and epistemic rationality, see Kelly (2003). See also Feldman (2000).

  5. For further discussion on how such biased evidence gathering and processing mechanisms can operate in the political case, see Huemer (2016).

  6. Of course, this distinction is a rough one, and there are bound to be cases that are vague. Furthermore, it’s plausible that because of the different kinds of media and internet content that partisans consume, they are likely to be ignorant of basic and easily verifiable facts that might serve to challenge their worldviews—for example, facts about the prevalence of abortion, the relative amounts of defense versus social welfare spending, etc., depending on the case. However, it seems to me that even on such points, disagreement is typically unlikely to persist once the relevant facts are interpersonally verified. Yet such claims usually function as rationalizations for more complex political claims that are more central to partisan ideology. A committed partisan, even when he is made aware of and acknowledges a piece of conflicting evidence, might nonetheless move to other bits of (putative) evidence that he sees as supporting his position.

  7. This premise is in part supported by the extensive literature within social psychology and political science, discussed above. The sorts of evidence-processing tendencies documented in Kahan et al. (2017), for instance, are plausibly not robustly truth-tracking. Moreover, insofar as we have the disposition to form socially adaptive beliefs—i.e., to adopt, on creedal issues, the beliefs of the groups important to our success—such beliefs are not plausibly products of truth-tracking processes either. What causally matters in these cases is where the social incentives lie, not where the truth lies. The argument of this paper thus has a structure similar to more familiar debunking arguments within philosophy. Analogously, for example, evolutionary debunking arguments against moral realism rely on the premise that, if robust moral realism is true, the selection processes which causally explain our having the evaluative tendencies that we do are not truth-tracking (Joyce 2016; Kahane 2011; Street 2006). For a helpful recent overview of the extensive literature on debunking arguments in a variety of domains besides morality, see Korman (2019).

  8. Relatedly, some recent work finds that cognitive rigidity in general is a good predictor of ideological extremism (Zmigrod et al., 2019). For a recent discussion of the lengths to which various factions went in the twentieth century to enforce ideological conformity, see Cherniss (2021).

References

  • Achen, C. H., & Bartels, L. M. (2017). Democracy for realists: Why elections do not produce responsive government. Princeton University Press.

  • Anderson, C., Hildreth, J. A., & Howland, L. (2015). Is the desire for status a fundamental human motive? A review of the empirical literature. Psychological Bulletin, 141(3), 574–601.

  • Ballantyne, N. (2019). Knowing our limits. Oxford University Press.

  • Bénabou, R., & Tirole, J. (2016). Mindful economics: The production, consumption, and value of beliefs. Journal of Economic Perspectives, 30(3), 141–164.

  • Bicchieri, C. (2006). The grammar of society. Cambridge University Press.

  • Brennan, J. (2009). Polluting the polls: When citizens should not vote. Australasian Journal of Philosophy, 87(4), 535–549.

  • Brennan, J. (2016). Against democracy. Princeton University Press.

  • Caplan, B. (2008). The myth of the rational voter. Princeton University Press.

  • Cherniss, J. (2021). Liberalism in dark times: The liberal ethos in the twentieth century. Princeton University Press.

  • Cohen, G. A. (2000). If you’re an egalitarian, how come you’re so rich? Harvard University Press.

  • Downs, A. (1957). An economic theory of political action in a democracy. Journal of Political Economy, 65(2), 135–150.

  • Elga, A. (2007). Reflection and disagreement. Noûs, 41(3), 478–502.

  • Elga, A. (2013). The puzzle of the unmarked clock and the new rational reflection principle. Philosophical Studies, 164, 127–139.

  • Feldman, R. (2000). The ethics of belief. Philosophy and Phenomenological Research, 60(3), 667–695.

  • Festinger, L. (1962). Cognitive dissonance. Scientific American, 207(4), 93–106.

  • Freiman, C. (2017). Unequivocal justice. Routledge.

  • Freiman, C. (2020). Why it’s OK to ignore politics. Routledge.

  • Funkhouser, E. (2017). Beliefs as signals: A new function for belief. Philosophical Psychology, 30(6), 809–831.

  • Funkhouser, E. (2021). Evolutionary psychology, learning, and belief signaling: Design for natural and artificial systems. Synthese, 199, 14097–14119. https://doi.org/10.1007/s11229-021-03412-0

  • Funkhouser, E. (2022). A tribal mind: Beliefs that signal group identity or commitment. Mind & Language, 37(3), 444–464.

  • Gallagher, S. (2008). Direct perception in the intersubjective context. Consciousness and Cognition, 17(2), 535–543.

  • Gampa, A., Wojcik, S., Motyl, M., Nosek, B., & Ditto, P. (2019). (Ideo)Logical reasoning: Ideology impairs sound reasoning. Social Psychological and Personality Science, 10(8), 1075–1083.

  • Gibbons, A. F. (2021). Political disagreement and minimal epistocracy. Journal of Ethics and Social Philosophy. https://doi.org/10.26556/jesp.v19i2.1088

  • Golman, R., Hagmann, D., & Loewenstein, G. (2017). Information avoidance. Journal of Economic Literature, 55(1), 96–135.

  • Greco, D. (2021). Climate change and cultural cognition. In M. Budolfson, T. McPherson, & D. Plunkett (Eds.), Philosophy and climate change. Oxford University Press.

  • Hannon, M. (2022). Are knowledgeable voters better voters? Politics, Philosophy & Economics, 21(1), 29–54.

  • Hardwig, J. (1991). The role of trust in knowledge. The Journal of Philosophy, 88(12), 693–708.

  • Huemer, M. (2016). Why people are irrational about politics. In J. Anomaly, G. Brennan, M. Munger, & G. Sayre-McCord (Eds.), Philosophy, politics, and economics: An anthology. Oxford University Press.

  • Iyengar, S., & Westwood, S. J. (2015). Fear and loathing across party lines: New evidence on group polarization. American Journal of Political Science, 59(3), 690–707.

  • Joyce, R. (2016). Evolution, truth-tracking, and moral skepticism. In Essays in moral skepticism (pp. 142–158). Oxford University Press.

  • Kahan, D. (2012). Why we are poles apart on climate change. Nature, 488, 255. https://doi.org/10.1038/488255a

  • Kahan, D., Peters, E., Dawson, E. C., & Slovic, P. (2017). Motivated numeracy and enlightened self-government. Behavioural Public Policy, 1(1), 54–86.

  • Kahane, G. (2011). Evolutionary debunking arguments. Noûs, 45(1), 103–125.

  • Kelly, T. (2003). Epistemic rationality as instrumental rationality: A critique. Philosophy and Phenomenological Research, 66(3), 612–640.

  • Kelly, T. (2005). The epistemic significance of disagreement. Oxford Studies in Epistemology, 1, 167–196.

  • Kinney, D., & Bright, L. (2021). Risk aversion and elite-group ignorance. Philosophy and Phenomenological Research. https://doi.org/10.1111/phpr.12837

  • Korman, D. Z. (2019). Debunking arguments. Philosophy Compass. https://doi.org/10.1111/phc3.12638

  • Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.

  • Lomasky, L., & Brennan, G. (2000). Is there a duty to vote? Social Philosophy and Policy, 17(1), 62–86.

  • Marks, J., Copland, E., Loh, E., Sunstein, C., & Sharot, T. (2019). Epistemic spillovers: Learning others’ political views reduces the ability to assess and use their expertise in nonpolitical domains. Cognition, 188, 74–84.

  • Mercier, H., & Sperber, D. (2017). The enigma of reason. Harvard University Press.

  • Mill, J. S. (2008). On liberty and other essays. Oxford University Press.

  • Mills, C. (2007). White ignorance. In S. Sullivan & N. Tuana (Eds.), Race and epistemologies of ignorance (pp. 11–38). State University of New York Press.

  • Nefsky, J. (2017). How you can help, without making a difference. Philosophical Studies, 174, 2743–2767.

  • Nietzsche, F. (1966). Beyond good and evil (W. Kaufmann, Trans.). Random House.

  • Pascal, B. (1910). Pensées (W. F. Trotter, Trans.). Dent.

  • Plato. (1997). Plato: Complete works (J. Cooper, Ed.). Hackett Publishing.

  • Pronin, E., & Kugler, M. B. (2007). Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot. Journal of Experimental Social Psychology, 43(4), 565–578.

  • Schoenfield, M. (2018). An accuracy based approach to higher order evidence. Philosophy and Phenomenological Research, 96(3), 690–715.

  • Somin, I. (2013). Democracy and political ignorance: Why smaller government is smarter. Stanford University Press.

  • Street, S. (2006). A Darwinian dilemma for realist theories of value. Philosophical Studies, 127(1), 109–166.

  • Talisse, R. B. (2019). Overdoing democracy: Why we must put politics in its place. Oxford University Press.

  • Tooby, J. (2017). Coalitional instincts. Edge (blog). Retrieved from https://www.edge.org/response-detail/27168

  • Trivers, R. (2011). The folly of fools: The logic of deceit and self-deception in human life. Basic Books.

  • Vavova, K. (2018). Irrelevant influences. Philosophy and Phenomenological Research, 96. https://doi.org/10.1111/phpr.12297

  • West, R. F., Meserve, R. J., & Stanovich, K. E. (2012). Cognitive sophistication does not attenuate the bias blind spot. Journal of Personality and Social Psychology, 103(3), 506–519.

  • White, R. (2010). You just believe that because…. Philosophical Perspectives, 24(1), 573–615.

  • Williams, D. (2021a). Motivated ignorance, rationality, and democratic politics. Synthese, 198, 7807–7827. https://doi.org/10.1007/s11229-020-02549-8

  • Williams, D. (2021b). Socially adaptive belief. Mind & Language, 36, 333–354.

  • Williams, D. (2022). The marketplace of rationalizations. Economics & Philosophy. https://doi.org/10.1017/S0266267121000389

  • Woomer, L. (2017). Agential insensitivity and socially supported ignorance. Episteme, 16(1), 73–91.

  • Zmigrod, L., Rentfrow, P. J., & Robbins, T. W. (2019). Cognitive inflexibility predicts extremist attitudes. Frontiers in Psychology, 10, 989. https://doi.org/10.3389/fpsyg.2019.00989


Acknowledgements

I would like to thank the two anonymous reviewers at Synthese for extremely helpful comments on this paper. Thanks also to Michael Hannon, Max Hayward, Yoaav Isaacs, Brandon Warmke, and participants at the Truth and Politics workshop in Bamberg, Germany, for feedback on earlier versions.

Author information

Corresponding author

Correspondence to Hrishikesh Joshi.

Ethics declarations

Conflict of interest

The author declares no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Joshi, H. Debunking creedal beliefs. Synthese 200, 514 (2022). https://doi.org/10.1007/s11229-022-03991-6


  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s11229-022-03991-6
