
Self-reflexive cognitive bias

  • Paper in the Philosophy of the Sciences of Mind and Brain
  • Published:
European Journal for Philosophy of Science


Cognitive scientists claim to have discovered a large number of cognitive biases, which have a tendency to mislead reasoners. Might cognitive scientists themselves be subject to the very biases they purport to discover? And how should this alter the way they evaluate their research as evidence for the existence of these biases? In this paper, we posit a new paradox (the ‘Self-Reflexive Bias Paradox’), which bears a striking resemblance to some classical logical paradoxes. Suppose that research R appears to be good evidence for the existence of bias B, but if B exists, then R would have been subject to B. Thus, it seems sensible for the researcher to reject R as good evidence for the existence of B. However, rejecting R for this reason admits the existence of B. We examine four putative cognitive biases and criticisms of them, each of which seem to be subject to self-reflexivity. In two cases, we argue, paradox is avoidable. In the remaining two, we cannot find a way to avoid the paradox, which poses a practical obstacle to scientific inquiry and results in an intriguing theoretical quandary.
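The structure of the paradox can be rendered schematically. The notation below is our own informal gloss, not the authors': read $E(R,B)$ as 'research $R$ is good evidence that bias $B$ exists' and $S(R,B)$ as '$R$ was itself subject to $B$'.

```latex
% Informal schematic of the Self-Reflexive Bias Paradox (notation assumed, not from the paper)
\begin{align*}
&\text{(1)}\quad E(R,B)
  && \text{$R$ appears to be good evidence for $B$}\\
&\text{(2)}\quad B \rightarrow S(R,B)
  && \text{if $B$ exists, $R$ was subject to $B$}\\
&\text{(3)}\quad S(R,B) \rightarrow \neg E(R,B)
  && \text{research subject to $B$ is not good evidence for $B$}\\
&\text{(4)}\quad \neg E(R,B)
  && \text{so it seems sensible to reject $R$, via (2)--(3)}\\
&\text{(5)}\quad \text{but the rejection in (4) is motivated only if $B$ exists}
  && \text{rejecting $R$ admits the existence of $B$}
\end{align*}
```

The tension between step (4) and its own motivation in step (5) is what gives the paradox its self-undermining character, akin to the classical logical paradoxes the abstract mentions.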



  1. There has been such an explosion in the literature on biases that the number of recognized biases has multiplied in popular culture (e.g., Wikipedia lists well over 100 distinct biases).

  2. This characterization of cognitive bias is consistent with a number of definitions offered by researchers working in the field. ‘Cognitive bias refers to a systematic (that is, nonrandom and, thus, predictable) deviation from rationality in judgment or decision-making’ (Blanco 2017). ‘A bias is a systematic discrepancy between the (average) judgment of a person or a group and a true value or norm’ (Gigerenzer 2018). ‘The term biases refers to the systematic errors that people make in choosing actions and in estimating probabilities…’ (Stanovich, Toplak & West 2020).

  3. This terminology agrees broadly with standard usage in cognitive science; for example, in an introduction to a seminal collection of papers, Gilovich & Griffin (2002, 3) define biases as ‘departures from the normative rational theory that served as markers or signatures of the underlying heuristics.’

  4. Later, we will explore how the paradox manifests itself depending on whether the existence or prevalence of the bias is at issue.

  5. Thanks to [redacted] for pointing this out to us.

  6. Studies on the D-K Effect usually involve novice subjects. Vnuk et al. (2006) is a rare instance of a D-K Effect experiment on a type of scientific expert.

  7. The six numeracy errors are: 1) ‘Random noise can generate X-shaped patterns in Kruger-Dunning-type graphs, and researchers can easily misinterpret these patterns as meaningful measures of self-assessment.’ 2) Data sets are too small to offer reliability. 3) There are strong floor and ceiling effects in the type of graph Kruger and Dunning employ. 4) ‘Sorting data pairs by one member of the pair invariably produces the “X-shaped” pattern of Kruger-Dunning graphs and, sorting data by percentile rank renders all expressions of performance as norm-referenced rather than criterion-based.’ 5) ‘Kruger-Dunning graphs cannot show the distributions of varied self-assessment skills in a populace.’ 6) ‘Kruger-Dunning graphs fail to reveal the degree of correlation that exists between self-assessed competence and demonstrated competence on a participant-by-participant basis.’ (Nuhfer et al., 2017, 9).

  8. The Fundamental Attribution Error, the common human tendency to explain aberrant behavior of others in terms of their flawed character while tending to explain one’s own aberrant behavior in terms of rationalizations, can be thought of as an analogue of the inherence heuristic in reasoning about the social world (Jones & Harris 1967; Ross 1977).

  9. Though in our earlier work we wrote ‘innate heuristic’, it would have been more correct to say ‘inherent heuristic.’ Our main criticism was not that Cimpian and Salomon were offering an innate inherence heuristic (as opposed to a non-innate one); rather, our criticism relied on the distinction between inherent (or intrinsic) and extrinsic (or relational) properties.

  10. Sometimes Gigerenzer focuses on the sub-discipline of behavioral economics in particular, but it is clear from the wide range of research that he criticizes that he is targeting most of the work on cognitive bias in the cognitive sciences.

  11. This differs from Brighton and Gigerenzer’s ‘Bias Bias,’ according to which researchers are prone to attribute more importance to the existence of bias than to variance and noise (see Brighton and Gigerenzer 2015).

  12. These figures are based on the Wikipedia entry, ‘List of cognitive biases’.

  13. Several decades ago, Christensen-Szalinski & Beach (1984) analyzed citation patterns and concluded that research papers that claimed to find evidence for cognitive biases were cited significantly more often than ones that did not. They referred to this as a “citation bias” rather than a bias bias. They also surveyed 80 psychologists working in the field and found that they tended to remember the negative results better than the positive ones.

  14. A variation on this objection would distinguish a reasoning error from the cognitive mechanism that is responsible for the error.

  15. Mugg (2020, 256) makes a similar point within the context of implicit racial bias: ‘given that we cannot rely on introspection to assess whether implicit biases are manifesting and that we continue to find new areas in which they manifest, we have good reason to think that implicit biases manifest in ways yet unknown. We have no strategy for blocking access in such cases.’ In her influential discussion of the epistemic implications of implicit social bias, Saul (2013) argues that the scope of cognitive bias is more restricted than that of social bias, taking probabilistic judgments as her primary illustration. However, we have argued here that other cognitive biases are not so circumscribed. Alfano (2014) and Carter & Pritchard (2017) both discuss the broader epistemic implications of cognitive bias, but neither discusses the possibility of a self-reflexive bias paradox.

  16. We are grateful to an anonymous referee for raising this possibility.


  • Alfano, M. (2014). Expanding the situationist challenge to reliabilism about inference. In A. Fairweather (Ed.), Virtue epistemology naturalized (pp. 103–122). Springer.


  • Antony, L. (2016). Bias: Friend or foe? Reflections on Saulish skepticism. In Brownstein & Saul (Eds.), Implicit bias and philosophy. Oxford University Press.


  • Bacon, F. (1620/1902). Novum Organum. P. F. Collier & Son.

  • Bergus, G. R., Chapman, G. B., Gjerde, C., & Elstein, A. S. (1995). Clinical reasoning about new symptoms despite preexisting disease: Sources of error and order effects. Family Medicine, 27, 314–320.


  • Blanco, F. (2017). Cognitive bias. In J. Vonk & T. Shackelford (Eds.), Encyclopedia of animal cognition and behavior. Springer.


  • Bogen, J., & Woodward, J. (1992). Observations, theories and the evolution of the human spirit. Philosophy of Science, 59(4), 590–611.


  • Brewer, W. F. (2012). The theory ladenness of the mental processes used in the scientific enterprise: Evidence from cognitive psychology and the history of science. In R. W. Proctor & E. J. Capaldi (Eds.), Psychology of science: Implicit and explicit processes (pp. 289–334). Oxford University Press.


  • Brighton, H., & Gigerenzer, G. (2015). The bias bias. Journal of Business Research, 68, 1772–1784.


  • Burson, K. A., Larrick, R. P., & Klayman, J. (2006). Skilled or unskilled, but still unaware of it: How perceptions of difficulty drive miscalibration in relative comparisons. Journal of Personality and Social Psychology, 90, 60–77.


  • Carter, J. A., & Pritchard, D. (2017). Cognitive bias, scepticism and understanding. In S. R. Grimm, C. Baumberger, & S. Ammon (Eds.), Explaining understanding: New perspectives from epistemology and philosophy of science (pp. 272–292). Routledge.


  • Christensen-Szalinski, J. J., & Beach, L. R. (1984). The citation bias: Fad and fashion in the judgment and decision literature. American Psychologist, 39, 75–78.


  • Cimpian, A., & Salomon, E. (2014). The inherence heuristic: An intuitive means of making sense of the world, and a potential precursor to psychological essentialism. Behavioral and Brain Sciences, 37(5), 461–480.


  • Cohen, L. J. (1981). Can human irrationality be experimentally demonstrated? Behavioral and Brain Sciences, 4(3), 317–331.


  • Dunning, D. (2011). The Dunning-Kruger effect: On being ignorant of one’s own ignorance. Advances in Experimental Social Psychology, 44, 247–290.


  • Evans, J. S. B. T. (2010). Thinking twice: Two minds in one brain. Oxford University Press.


  • Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), 1502–1505.


  • Gigerenzer, G. (2018). The bias bias in behavioral economics. Review of Behavioral Economics, 5(3–4), 303–336.


  • Gigerenzer, G., & Brighton, H. (2009). Homo heuristicus: Why biased minds make better inferences. Topics in Cognitive Science, 1(1), 107–143.

  • Gilovich, T., & Griffin, D. (2002). Heuristics and biases: Then and now. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment. Cambridge University Press.


  • Gladwell, M. (2005). Blink. Little, Brown & Company.


  • Jones, E. E., & Harris, V. A. (1967). The attribution of attitudes. Journal of Experimental Social Psychology, 3(1), 1–24.


  • Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.


  • Khalidi, M. A., & Mugg, J. (2014). The inherent bias in positing an inherence heuristic: Commentary on Cimpian & Salomon. Behavioral and Brain Sciences, 37, 493–494.


  • Knobe, J., & Samuels, R. (2013). Thinking like a scientist: Innateness as a case study. Cognition, 126(1), 72–86.


  • Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121.

  • Kruger, J., & Dunning, D. (2002). Unskilled and unaware—but why? A reply to Krueger & Muller (2002). Journal of Personality and Social Psychology, 82(2), 189–192.


  • Larrick, R. P. (2004). Debiasing. In D. J. Koehler & N. Harvey (Eds.), Blackwell handbook of judgment and decision making (pp. 316–337). Blackwell Publishing.


  • Lopes, L. L. (1987). Procedural debiasing. Acta Psychologica, 64(2), 167–185.


  • Mercier, H. (2017). Confirmation Bias—Myside Bias. In R. F. Pohl (Ed.), Cognitive illusions: Intriguing phenomena in thinking, judgment and memory (pp. 99–114). Routledge/Taylor & Francis Group.


  • Mugg, J. (2020). How not to deal with the tragic dilemma. Social Epistemology, 34(3), 253–264.


  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.


  • Nuhfer, E., Cogan, C., Fleisher, S., Gaze, E., & Wirth, K. (2016). Random number simulations reveal how random noise affects the measurements and graphical portrayals of self-assessed competency. Numeracy: Advancing Education in Quantitative Literacy.

  • Nuhfer, E., Fleisher, S., Cogan, C., Wirth, K., & Gaze, E. (2017). How random noise and a graphical convention subverted behavioral scientists’ explanations of self-assessment data: Numeracy underlies better alternatives. Numeracy: Advancing Education in Quantitative Literacy.

  • Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 10, pp. 173–220). Academic Press.


  • Saul, J. (2013). Scepticism and implicit bias. Disputatio, 5, 243–263.


  • Shanks, D. (1991). A connectionist account of base-rate biases in categorization. Connection Sciences, 3(2), 143–162.


  • Simon, H. (1969). Sciences of the artificial. MIT Press.


  • Stanovich, K. (2004). The robot’s rebellion: Finding meaning in the age of Darwin. University of Chicago Press.


  • Stanovich, K. E., Toplak, M. E., & West, R. F. (2020). Intelligence and rationality. In R. J. Sternberg (Ed.), Cambridge handbook of intelligence (2nd ed., pp. 1106–1139). Cambridge University Press.


  • Stein, E. (1997). Can we be justified in believing that humans are irrational? Philosophy & Phenomenological Research, 57, 545–565.


  • Todd, P. M., & Gigerenzer, G. (2003). Bounding rationality to the world. Journal of Economic Psychology, 24(2), 143–165.


  • Vnuk, A., Owen, H., & Plummer, J. (2006). Assessing proficiency in adult basic life support: Student and expert assessment and the impact of video recording. Medical Teacher, 28, 429–434.




We are grateful to audiences at the Society for Philosophy and Psychology annual conference (Ann Arbor, Michigan, 2018) and the Biases in Science conference (Munich, 2019) for helpful feedback, especially Momme van Sydow and Bennett Holman. We would also like to thank Brian Huss, Kevin Clark, and anonymous referees for very useful comments on earlier drafts.

Author information



Corresponding author

Correspondence to Muhammad Ali Khalidi.

Ethics declarations

Ethical approval

Not applicable.

Informed consent

Not applicable.

Conflict of interest


Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Mugg, J., Khalidi, M.A. Self-reflexive cognitive bias. Euro Jnl Phil Sci 11, 88 (2021).
