Modern democratic societies tend to appeal to the authority of science when dealing with important challenges and solving their problems. Nevertheless, distrust in science remains widespread among the public, and, as a result, scientific voices are often ignored or discarded in favour of other perspectives. Though superficially “democratic”, such a demotion of science in fact hinders democratic societies in effectively tackling their problems. Worryingly, some philosophers have provided ammunition for this distrust and scepticism of science. They either portray science as an institution that has illegitimately seized political power, or they claim that science constitutes only one voice among many and that scientists should know their proper place in our societies. As philosophers of science, we believe that it is potentially dangerous to undermine trust in science in this way. Instead, we believe that philosophers should help people to understand why science, even though it is far from perfect, deserves our trust and its special standing in modern societies. In this paper, we outline what such an explanation may look like from a naturalistic and pragmatic perspective, and we discuss the implications for the role of philosophy of science in science education.
When faced with severe problems and challenges such as climate change and the COVID pandemic, modern societies often rely on the authority of science, both to diagnose the problem and to find solutions, on the assumption that science provides us with the most reliable picture of the world. And indeed, this expectation has not been disappointed, since science has been quite successful in helping us overcome many societal and global challenges. Think, for instance, of the incredibly rapid development of vaccines against COVID or the diagnosis of, and subsequent solution to, the growing hole in the ozone layer. However, despite the impressive track record of science, some philosophers have been suspicious and even sceptical of society’s trust in science. If we only listen to what science has to say, argued the French philosopher Michel Foucault (1976), we end up with what he labelled “biopower”, a society dictated and ruled on the basis of knowledge delivered by the life sciences. In a similar vein, Feyerabend (1975) argued that science deserves no special privileges as it constitutes just one voice among many. We should therefore not only have a separation of church and state but also of science and state.Footnote 1
What both philosophers and their modern adherents suggest is that society’s unique trust in science is largely if not entirely misplaced and unwarranted. Science is just a means for a group of people to dominate and regulate society, and scientific knowledge deserves no special privilege and authority. What is worrisome about both accounts, we believe, is that they encourage, foster, and justify distrust in science among the public. Such distrust is already widespread and tends to hinder society in dealing with its challenges and effectively solving its problems, as we will demonstrate below with the examples of the COVID pandemic and climate change. As philosophers of science, we think that undermining trust in science is both unjustified and potentially dangerous. Instead, we believe that philosophers should help lay people understand how science works and why it deserves our (calibrated) trust.
To explain why science deserves this trust, we will start with a brief exposition of the cognitive capacities and limitations of the human mind. This naturalistic approach in epistemology and philosophy of science has a long tradition, including thinkers such as Francis Bacon and David Hume, but today we can rely on developments in evolutionary and cognitive psychology. First, we discuss how our cognitive make-up poses serious obstacles to an accurate understanding of the world. Next, we explain how science provides scaffolds to our intuitive understanding of the world, allowing us to develop and handle counterintuitive concepts. Perhaps the most important scaffold is the construction of a social ecology that makes the most of our capacities for interactive reasoning, resulting in what Rauch (2021) has recently labelled the “constitution of knowledge”, a dynamic collection of rules, values, and institutions that are geared towards the production of reliable beliefs about the world. It is because of these scaffolding processes that we can mitigate our intuitive biases and constraints and effectively deploy our reasoning capacities. This naturalistic understanding explains why, though science is far from perfect, it is far superior to alternative perspectives. Despite what philosophers like Foucault and Feyerabend have argued, if anything, politicians in democratic societies tend to place too little trust in science, rather than too much. We conclude with a discussion of the implications of our account for science education and of the role that philosophy of science can play in this context.
2 The Unscientific Mind?
Since science is a construction by human minds, pioneers from the earliest days of the scientific revolution have thought about the powers and limitations of the human mind. In his discussion of the scientific method, Bacon (1620) already included an analysis of what he described as “idols”, patterns of thought that interfere with the acquisition of knowledge. David Hume, in the introduction to A Treatise of Human Nature (1739–1740), wrote that “the science of man is the only solid foundation for the other sciences”. Bacon, Hume, and other philosophers of those days could only rely on common sense and astute observations, but today we have a much more powerful tool at our disposal, namely cognitive science. This discipline studies the ways in which the human mind handles information, which makes it the ideal source of insights about our epistemic capacities.
What picture of the mind emerges from the cognitive sciences? Kahneman and Tversky, for instance, demonstrated by means of numerous empirical studies that humans are far from the ideal of rational actors who, when making a judgement or decision, calculate probabilities and objectively weigh the pros and cons of each option (Kahneman, 2011). Instead, we rely upon a whole suite of mental heuristics to come up with quick and spontaneous solutions to our problems. This does not mean that humans are irredeemably irrational. As Gigerenzer has argued, these “fast and frugal” heuristics typically result in adequate reflexive responses to particular adaptive problems, which renders them “ecologically rational” (Gigerenzer et al., 1999). However, when confronted with more abstract and complex problems, as in the case of Kahneman and Tversky’s experiments, these heuristics often break down, producing a whole range of biases, such as the availability and representativeness biases. People end up drawing the wrong conclusions, which makes them (appear) irrational.Footnote 2
The fact that we are evolved primates explains why scientific thinking does not come naturally and why science needs all sorts of checks and safeguards to protect us from the foibles of irrationality. It also accounts for the fact that the view of the world emerging from modern science conflicts with our intuitive world view (McCauley, 2000; Shtulman, 2017; Wolpert, 1992). To survive and reproduce, we do not need a representation of the world that is scientifically correct but only one that is sufficiently accurate for us to efficiently navigate our surroundings. This is not to say that our cognitive capacities have been selected at the expense of their truth-tracking capacities. Evolution is not indifferent to truth and accuracy, insofar as these are conducive to fitness. Our minds evolved to enable us to quickly and adequately respond to opportunities, challenges, and risks in our immediate environment. In order to do so, our minds attend only to information that is relevant for our survival. However, many truths are completely irrelevant to fitness, and evolution only cares about local and ecologically relevant truths, not truths about the cosmos at large. As a result, we develop mental models of the world that are accurate enough for managing everyday problems and situations but that break down outside their limited range of application (Boudry & Vlerick, 2014).
Often, a fully accurate understanding of the world would be too costly to obtain, would paralyse rather than aid us, or would be entirely irrelevant to our survival. As such, evolution has endowed us with bundles of expectations about relevant aspects of our surroundings, which we can label “intuitive ontologies” (Boyer & Barrett, 2005). For instance, we have the intuitive expectation that objects will not move without being moved by an external force, that they will not pass through one another, and that they will not suddenly disappear (Spelke, 1990). These expectations about objects constitute our intuitive physics. Similarly, we have expectations about the living world (intuitive biology), other people’s minds (intuitive psychology), social groups (intuitive sociology), and economic transactions (intuitive economics). These ontologies are not elaborate theories. They are hunches that automatically help us to make sense of the world around us and as such play an important role in how we navigate the world.
Nevertheless, because they implicitly impose structure and causal relations upon the world, these hunches strongly affect the development and understanding of scientific knowledge. Recently, Shtulman (2017) has extensively documented how, even in modern scientific societies, they render people effectively “science blind”. Because these hunches make intuitive sense, they render us sceptical of scientific concepts and theories, which are often highly counterintuitive (McCauley, 2011). In our intuitive conception of the world, heat is some sort of fluid, not another way of describing the movement of molecules; objects stop moving when they run out of force, not because they are impeded by friction; and organisms possess an unobservable and immutable core that determines their identity. As Shtulman notes, these intuitions are coherent, widespread, and robust, which means that they are very difficult to overcome. This also explains why, historically, the development of science is a rare phenomenon and why many people still fail to develop a scientific view of the world.
The obstacles posed by our intuitions also become clear when we draw a comparison with pseudoscience and science denialism (for a discussion of these two phenomena, see, e.g., Hansson, 2017). In contrast to real scientists, pseudoscientists and science denialists often tap into the very same intuitions that tend to hinder scientific understanding. Pseudoscientific ideas manage to appeal to a range of evolved cognitive mechanisms and thus manage to become widely distributed (Boudry et al., 2015a, b). Just as smileys and emoticons exploit our face recognition system and candy piggybacks on our evolved taste for sweetness, so does creationism tap into our essentialist intuitions (Blancke & De Smedt, 2013), conspiracy theories into coalitional threat detection (van Prooijen & Van Vugt, 2018), and GMO opposition into intuitive feelings of disgust (Blancke et al., 2015). Pseudoscience and science denialism often make intuitive sense; science hardly ever does.
Furthermore, recent developments in philosophy and psychology suggest that people and the groups they associate with often hold misbeliefs when these serve certain social purposes, especially when errors are associated with low costs (e.g. Bergamaschi Ganapini, 2021; Funkhouser, 2017; Mercier, 2020; Williams, 2020). Williams (2020) has called them “socially adaptive beliefs”. People might adopt such beliefs to signal loyalty to a group, thus deriving social rewards and avoiding social punishments. For instance, those who claim that the COVID-19 measures are ineffective might do so not for evidence-based reasons but because accepting the measures would conflict with a political ideology they identify with (e.g. a right-leaning ideology that is suspicious of government interventions and strong public health policy). In many cases, people may cite evidential reasons for their beliefs that turn out to be spurious. For instance, anti-vaccination activists invoke unproven correlations between vaccination and autism to rationalize their intuitive resistance to the injection of alien or toxic substances into their bodies (Miton & Mercier, 2015). People might also hold certain beliefs to coordinate with others. Sharing the rumour that COVID is no worse than the flu helps people to ally against what they consider to be repressive measures. Such beliefs also function to signal group membership. By expressing them, one demonstrates one’s willingness to break away from the mainstream view that the pandemic requires strong action and consequently one’s commitment to the dissenting group (Mercier, 2020). In these cases, people are not motivated to know the truth, for instance, about the fatality rate of Sars-CoV-2. They primarily want to fit in with the group they associate with (Storr, 2021).
3 Minds Make Science
This brief survey of some of the literature on our evolved cognition suffices to show that when confronted with problems that are not part of the ancestral environment, our unaided minds will no longer suffice to make sense of the world. They require support and correction. This is exactly what science provides. It draws on ordinary processes of inquiry that we rely on in our everyday lives, but these processes become supported by tools and crutches of all sorts (Haack, 2003). Scientists use telescopes, scans, and other devices to extend their observational capacities; they have invented mathematics, logic, and statistics to refine their reasoning; they have developed symbols and formulas to restrict the range of possible interpretations and thus to make communication more efficient; and they create micro-worlds in the form of experiments to isolate causes and to make observations at will and under controlled conditions.
Many of these aids or scaffolds are in place because they correct for our mistakes and mitigate the effect of our biases. This does not mean, however, that scientists are entirely free from error and bias. After all, scientists are humans just like the rest of us, and so we cannot expect them to be cognitively perfect (McIntyre, 2019). They might still make mistakes in their observations, be careless in applying their methodologies, or only pay attention to evidence that confirms pre-existing beliefs. Indeed, scientists, no less than regular folk, tend to suffer from my-side bias when they want to convince their peers that their hypotheses are correct (Mercier & Heintz, 2014).
The most important cognitive scaffold in science is the reliance on the judgement of scientific peers. In recent decades, sociologists and philosophers have pointed out that science is an inherently social enterprise (Goldman, 1999; Kitcher, 1993a; Longino, 1990; Oreskes, 2019; Ziman, 1968). It is a collaborative effort to solve the puzzles and problems with which the world confronts us. In these collaborations, scientists rely on their peers in all sorts of ways (Haack, 2003). First, they build on knowledge produced by their predecessors and their colleagues. Even Isaac Newton, one of the greatest scientists in history, realized that he was “standing on the shoulders of giants”, in the sense that he could not have developed his theory of gravity without the cumulative achievements of his many predecessors, such as Kepler and Galileo. The accomplishments which scientists borrow from each other to build on involve not only theories and concepts but also tools, methods, and practices. Indeed, each of these scaffolds itself constitutes the outcome of scientific developments. Despite occasional revolutions and reversals, scientific knowledge is cumulative and progressive in the long run, which means that the scientific knowledge of each generation is superior to that of all previous generations, even though scientists almost always build on the accomplishments of their predecessors. Arguably, this feature is unique to science and is not characteristic of any of the other “voices” which policymakers may consider when, for example, deciding how to come to terms with a raging pandemic. Second, and relatedly, science depends upon a division of cognitive labour (Kitcher, 1993b). The world has become so complex that it is impossible for one person to be a Renaissance “uomo universale”.
Not only are disciplines divided into numerous sub-disciplines and ever narrower specializations, but the division of labour also operates within disciplines, for instance in devising and conducting experiments. In high-energy physics, empirical studies require the combined expertise of hundreds or even thousands of researchers (as in the case of the CERN experiments), who do not necessarily know exactly what the contributions of their collaborators consist in or how the experiment as a whole works.
Perhaps the most important way in which science is social is in its reliance on our capacities for interactive reasoning. Recently, cognitive scientists Mercier and Sperber (2017) have proposed that the function of reasoning is not for individual reasoners to correct their own thinking mistakes and arrive at true beliefs. Instead, they argue, reasoning is a social process by which people provide reasons as arguments and justifications. As a result, the production of reasons is “biased and lazy”, resulting in the well-known confirmation bias and my-side bias. However, at the receiving end, the evaluation of reasons is much more critical, which results in the identification and correction of reasoning errors and misbeliefs. The process of interactive reasoning by itself does not necessarily result in reliable beliefs about the world (Blancke et al., 2019). As we discussed above, people who end up adopting beliefs for socially strategic reasons will also be able to provide reasons to justify their beliefs. Interactive reasoning requires the right social conditions to produce knowledge, namely the conditions that have been carefully developed and fine-tuned over time in the institutions and practices of science (Blancke & Boudry, 2022).
Scientists work in an environment that allows them to share their ideas through appropriate venues, facilitates the uptake of criticism, and creates room for every member of the scientific community to voice their opinion, whatever their standing. By interactively scrutinizing one another’s beliefs and the reasons for them, scientists can eventually arrive at a consensus that gives us the best approximation of what is true and real. Interactive reasoning thus transforms individual belief into knowledge, a process Longino labels “transformative criticism” (Longino, 2002). The process results in reliable practices and beliefs even in domains where our intuitions break down: these are the ones that have survived (so far) the onslaught of scientists’ continuous questioning and scrutinizing. Furthermore, if scientists want their proposals to be endorsed by their peers, they must take care to justify them with reasons they expect their colleagues to accept. As such, they adjust their practices and beliefs to the common standards of their community. This means that the critical exchange of reasons affects the fate of science not only through the evaluation of reasons but also through their production. Scientists realize that only the beliefs and practices that meet the standards of their community will make it through.
Evidence plays a critical role in the process of transformative criticism. Although virtually all human forms of inquiry rely on evidence of some sort—think of inquiries in court to establish whether the accused has committed a crime—in science its role is exceptional. Think, for instance, of the enormous amount of evidence that Darwin provides in On the origin of species in support of his theory of evolution by natural selection, including biogeographical, embryological, and paleontological data. Not only do scientists collectively gather enormous amounts of empirical data, they also create mini-worlds in the form of experiments, where they can control different variables, make precise measurements, and test rival hypotheses (Rouse, 2015). The Large Hadron Collider, which creates the conditions under which physicists can study the smallest particles, stands as an impressive example. Subsequently, scientists invoke their evidence as reasons in support of their proposed hypothesis or practice to convince their peers. However, what counts as evidence is not straightforward but depends in its turn on what the relevant scientific community finds acceptable. As Longino (2002, p. 103) points out, “the data are established socially through the interactive discursive processing of sensory interactions. There is no way but the interaction of multiple perspectives to ascertain the observational status of individual perceptions”. In other words, the standards for what constitutes proper evidence are themselves the result of transformative criticism. In fact, it is precisely because of the latter process that science has come to depend crucially on empirical evidence. Most other alleged sources of information, such as divination or intuition, were deemed unreliable and hence unjustifiable methods in the production of knowledge.
The thesis that social processes of transformative criticism result in objective knowledge does not entail that science is value-free. Science is the result of cooperative efforts, and, as human beings, scientists will inevitably bring values to their work. To the extent that their values distort the production of scientific knowledge, the idea is that transformative criticism cancels out or at least mitigates their impact (Boudry & Pigliucci, 2018). However, values also have a positive and even indispensable role to play in science (Brown, 2020; Douglas, 2009; Longino, 1990). Scientists value knowledge because they want to know how things work and to try to make things better. Epistemic values such as consistency and parsimony regulate what scientists find acceptable. Furthermore, scientists have the responsibility towards society not to inflict any harm on their fellow citizens. And in fact, science can be described as a culture that abides by certain norms such as universalism and organized scepticism (Merton, 1973) or a moral system that depends upon and promotes certain virtues such as curiosity and honesty (Pennock, 2019). According to Rauch (2021), scientists commit to a dynamic collection of values, norms, and institutions that allows for “the constitution of knowledge”. The constitution enables scientists to make the most of interactive reasoning, as it builds on the rules that no one has final authority and that one must always adduce empirical evidence or rational arguments to convince others. The “reality-based community” governed by this constitution, and of which science forms an important part, places a high premium on values such as civility, accountability, and pluralism. It is therefore precisely because science incorporates these values that it delivers us exceptionally trustworthy beliefs about the world.
In sum, the reason why we can trust science is because of its peculiar social and cultural arrangements, in which human minds are set to work so that they are most likely to produce knowledge. However, trust does not mean blind trust. As Haack repeatedly emphasizes, the supports and corrections that scientists rely on remain fallible. Experiments can go wrong, instruments can malfunction, and peer review does not always effectively stop sloppy or fraudulent science, or even outright pseudoscience, from being published. Scientists are human beings, which means that they will inevitably make mistakes and that they import all sorts of biases and prejudices that might affect their work, sometimes even an entire research program or scientific discipline; and they might be tempted to lower their standards or even cheat for all sorts of reasons, such as boosting their career or reputation, financial gain, or caving in to social pressures. Furthermore, science does not always speak with one voice and might even provide contradictory perspectives on societal problems, so that policymakers have to balance and negotiate between them. And then there are those who complicate matters even further by pretending to do science while really engaging in pseudoscience (Blancke et al., 2017). Promoting and restoring trust in science will therefore also necessarily include guidance on how to calibrate that trust, taking into account that scientific output is not always reliable and straightforward and that what looks like science is not always the real thing. In the next section, we discuss how philosophers of science can help to promote and restore a healthy form of trust in science.
4 Turning Minds
4.1 Trust in Science
Since science makes the most of our constraints and capacities to generate reliable beliefs about the world, disregarding or rejecting the insights of science in favour of majority opinion or common sense can come at a serious cost. When people argue that the perspective of science should be balanced against other societal perspectives, they are often adopting a rather narrow conception of science, mostly equating it to the natural sciences, and they fail to realize that this “balancing” of perspectives is itself often the subject of scientific research. In tackling the COVID-19 pandemic, for instance, many have argued that scientists are narrowly focused on health and that science-based recommendations such as lockdowns and other restrictions have caused more damage than they prevented. However, the trade-off between health and economy is itself the subject of scientific investigations, and these findings mostly go against popular opinion. In particular, economists have shown that the trade-off between health and economy is largely non-existent, at least for a virus with the profile of Sars-CoV-2: it is the virus itself that wrecks the economy, not so much the restrictions (Arnon et al., 2020). This means that, if you are concerned about the collateral damage to the economy, the best thing to do may be (counterintuitively) to crush the virus first, even with very strict measures (Dunning et al., 1989; Meyerowitz-Katz et al., 2021). International comparisons show that countries which had the best health outcomes also protected their economy, and countries which failed to control the epidemic suffered far worse consequences, both in terms of health and economy (Fernández-Villaverde & Jones, 2020). Still, the trade-off view, though lacking support and conflicting with economic research, was intuitive and therefore compelling.
Another popular way in which people disregard the perspective of science, to their own detriment, is to accept the scientific diagnosis of a problem but maintain that science has hardly anything to say about what solutions are effective. It is true, to be sure, that science does not directly settle normative questions, but in many cases, scientific knowledge is crucial to understanding both the diagnosis of a problem and its most effective solutions. For example, in tackling the crisis of climate change, most activists and concerned citizens accept the diagnosis offered by climatologists: the climate is warming, and human emission of greenhouse gases is responsible. When it comes to solutions to climate change, however, many environmentalists resent science-based “technofixes” such as nuclear energy or genetically modified crops and prefer solutions that are intuitively more palatable, such as the “soft energy paths” (Lovins, 1979) of renewable energies and organic agriculture. In order to evaluate the potential of nuclear energy and renewable energy, however, we have to look at scientific evidence. Nuclear energy has a proven track record in slashing CO2 emissions and reducing pollution (Kharecha & Hansen, 2013), while renewable energies such as wind and solar still suffer from the problems of intermittency and lower power density, and countries like (nuclear) France are still outperforming (renewables-oriented) Germany in terms of emission intensity (Partanen & Korhonen, 2020). Indeed, the only countries that have thus far managed to decarbonize their electricity grid have relied heavily on nuclear power (sometimes in combination with hydropower), not intermittent renewables (Friederich & Boudry, 2022).
As for GMO technology, scientific research not only shows that it is safe for public health and the environment but that it has significant benefits in terms of both climate mitigation (higher yields and less deforestation) and climate adaptation (drought-resistant crops). By contrast, organic farming produces lower yields and thus leads to more deforestation and environmental degradation. Nevertheless, because both nuclear energy and GMOs elicit fears and intuitive aversions, which are often fueled by environmentalist campaigns, they encounter strong public opposition (Blancke et al., 2015; Hacquin et al., 2022). Because societies have yielded to unscientific intuitions rather than sound scientific judgements, they have perpetuated and even worsened environmental problems. One way to mitigate people’s aversion to science’s dominant role in modern societies is to help them understand that accepting scientific views and following scientific recommendations is in their own best interest, even when it does not feel like it.
Intuitions can be extremely compelling, and when our social environment further endorses them, it becomes very difficult to resist their pull. The antidote consists in developing a population that is scientifically literate enough to understand why they should not follow their hunches but attend to what scientists have to say.
4.2 The Goal of Science Education
How can science education create and nurture such trust in science?Footnote 3 An answer to this question depends very much upon what one thinks should be the goal of science education. A common view is that students should learn about the content of scientific theories, which would allow them to act appropriately in response to pressing societal challenges such as climate change or poverty. Indeed, lay people often show a lack of understanding of the processes underlying such problems (Krauss & Colombo, 2020), and a better understanding of the relevant scientific knowledge would arguably help in solving societal problems. In at least some cases, such as evolutionary theory and GMOs, there is a link between knowledge and acceptance of scientific theories and technologies (Fernbach et al., 2019; Weisberg, Landrum, Metz, & Weisberg, 2018). Recent studies suggest that providing accurate information significantly reduces science denialism (e.g. Altay & Lakhlifi, 2020; Altay et al., 2022; Schmid & Betsch, 2019).Footnote 4 However, other studies suggest that science denialism does not result from a knowledge deficit but mostly from ideological factors. For example, climate denialists are no less informed about climate science than those who accept the scientific consensus on global warming, and are sometimes even better informed (Kahan, 2012, 2015).
In a recent article with the intriguing title “The public understanding of what?”, philosopher Arnon Keren (2018) argues that science education should not aim at bringing lay people’s understanding of science closer to the understanding of experts. Instead, the public understanding of science should be based on a division of cognitive labour, whereby lay people should not adopt the role of “expert insiders” but of “competent outsiders”. According to Keren, rather than attempting to acquire the beliefs of professional scientists, such competent outsiders need to learn to trust the right sources, based on a proper understanding of the role and importance of consensus in science. If this is the goal of science education, he adds, philosophers of science can play a significant role in developing these competences. Indeed, as we hope to have demonstrated above, philosophy of science sheds light on the nature and authority of science and helps to understand why science is trustworthy.
Although we agree that people should not strive for inside knowledge of science and should instead remain “competent outsiders”, we do believe that they need some understanding of the inner workings of science. If not, people may fail to appreciate why science deserves our trust and why it deserves primacy over other “voices” in the public arena. In short, they might be susceptible to science-sceptical arguments like those of Foucault and Feyerabend, according to which science is just one perspective among many and should not claim predominance over other perspectives. Lay people do not need to be aware of all the evidence that supports scientific theories, nor even to understand the theories themselves, but they do need to understand how science overcomes our cognitive limitations (namely, by relying on all sorts of scaffolds) and why, therefore, we can be confident that the results of scientific investigations form our best approximation of the world, even if these results are strikingly counterintuitive and bizarre.
4.3 The Role of Philosophy of Science
How can we achieve this? In this paper, we do not have the ambition to discuss the particulars and practicalities of developing educational materials for young students, but we do want to sketch how philosophy of science can help enhance students’ understanding of science. One strategy we would recommend is to teach students about the limitations of our cognitive capacities and the biases we are all prone to, and to provide them with concrete examples from the history of science and pseudoscience. By giving them a flavour of how biases and intuitions have distorted our reasoning in the past, students will learn to appreciate that intuitions and appeals to “common sense” are extremely unreliable when it comes to understanding anything about the world outside of the ecological environment our minds are adapted to. If people realize, for instance, that we tend to interpret the world in “essentialist” terms, and that such intuitive essentialism can lead us seriously astray (e.g. race pseudoscience, creationism), this might make them a bit more sceptical about their own “common sense” and about the way they usually obtain information about the world (Blancke et al., 2018).
Rather than just informing students about our cognitive make-up and its limitations, we suggest personally exposing them to problems and puzzles that defy intuition. This can be done by eliciting their intuitive but biased theories about the world and demonstrating how these fail to properly account for our observations. For instance, if students have the intuition that a ball leaving a curved tube will continue to follow a curved path, let them experience how the ball moves in a straight line. If their essentialist intuitions lead them to think of species as fixed categories with crisp borders, confront them with borderline cases like hybrid species, partial interbreeding, and so-called ring species. In doing so, they might come to realize that they need to make a conceptual change to cope with the new experience and that, despite their everyday intuitions and classifications, biological species are not immutable types but consist of populations of varying individuals. By using these and other hands-on examples, we hope that students will appreciate that an accurate understanding of the world does not come for free (Carey, 2000; Carey & Spelke, 1994).
Philosophy of science can also help students to appreciate all the scaffolds and practices that scientists rely on to correct their biases and build on their cognitive capacities (Blancke et al., 2018). However, we believe that such insights about the nature and epistemic authority of science will have a much stronger impact if students themselves experience how cognitive and social scaffolds such as those used in science enable them to expand and improve their knowledge. This means that they do not simply learn about scientific practices, but also engage in them personally (García-Carmona & Acevedo-Díaz, 2018; Osborne, 2014) and thus come to appreciate how these practices are meaningful in the goal-directed activity of inquiry (Berland et al., 2016). These practices include asking questions, planning and carrying out investigations, and analysing and interpreting data (Osborne, 2014).
Perhaps the most important form of cognitive scaffolding that students need to appreciate is how reliance on their peers through critical discussion leads to better outcomes (Kuhn, 2019; Osborne, 2014). Again, we recommend that students experience such insights about the constitution of knowledge first-hand. For instance, by solving logical or mathematical problems first by themselves and then in small groups, they can learn to appreciate how a social setting enables better solutions (Mercier et al., 2017). Furthermore, they learn to sort out not only what counts as the right sort of evidence for their question, but also how to make a convincing case to their peers (Berland & Reiser, 2009; Kuhn & Modrek, 2021). By engaging in such exercises, students might come to realize that developing knowledge is not an individual affair but requires a particular type of social and critical interaction in which errors and biases are consistently weeded out.
We agree with Keren that the goal of science education is not for everyone to attain the same level of knowledge as scientific experts. However, if we only teach students about our cognitive limitations and the nature and authority of science, this will probably not suffice to make people into competent outsiders. Philosophical reflections could be more effective, we suggest, when students personally engage in the sort of practices scientists rely on when producing knowledge. To create a society with informed outsiders, we need a combination of several approaches (Sinatra & Hofer, 2016). In such a society, people will realize that, when dealing with pressing societal problems, it is in their own best interests to set aside their own opinions and intuitions and that trusting the voice of science may be, even or perhaps especially in democratic societies, justified after all.
We do not mean to imply that we should only listen to virologists and epidemiologists at the expense of other experts such as economists or social scientists. Attending to a variety of experts results in a form of epistemic pluralism that might enable policymakers to handle societal problems more effectively, a position that can also be found in Feyerabend (see, e.g., Lohse & Bschir, 2020).
On the challenges of educating scientific literacy in the face of unwarranted beliefs, see, e.g., Fasce and Picó (2019).
Some earlier studies reported a backfire effect where people became more negative towards GMOs after receiving information relating to the technology (e.g. Scholderer & Frewer, 2003). However, more recent research indicates that this effect is “elusive” (Wood & Porter, 2016) and “not a robust empirical phenomenon” (Swire-Thompson, DeGutis, & Lazer, 2020). This can be expected given the important role of recognizing and accepting accurate information in human communication (Mercier, 2020).
Altay, S., & Lakhlifi, C. (2020). Are science festivals a good place to discuss heated topics? Journal of Science Communication, 19(01), A07.
Altay, S., Schwartz, M., Hacquin, A.-S., Allard, A., Blancke, S., & Mercier, H. (2022). Scaling up interactive argumentation by providing counterarguments with a chatbot. Nature Human Behaviour. https://doi.org/10.1038/s41562-021-01271-w
Arnon, A., Ricco, J., & Smetters, K. (2020). Epidemiological and economic effects of lockdown. Brookings Papers on Economic Activity.
Bacon, F. (1620). Novum Organum, sive Indicia Vera de Interpretatione Naturae.
Bergamaschi Ganapini, M. (2021). The signaling function of sharing fake stories. Mind & Language. https://doi.org/10.1111/mila.12373
Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26–55. https://doi.org/10.1002/sce.20286
Berland, L. K., Schwarz, C. V., Krist, C., Kenyon, L., Lo, A. S., & Reiser, B. J. (2016). Epistemologies in practice: Making scientific practices meaningful for students. Journal of Research in Science Teaching, 53(7), 1082–1112. https://doi.org/10.1002/tea.21257
Blancke, S., & De Smedt, J. (2013). Evolved to be irrational? Evolutionary and cognitive foundations of pseudosciences. In M. Pigliucci & M. Boudry (Eds.), The philosophy of pseudoscience (pp. 361–379). The University of Chicago Press.
Blancke, S., & Boudry, M. (2022). Pseudoscience as a negative outcome of scientific dialogue: A pragmatic-naturalistic approach to the demarcation problem. International Studies in the Philosophy of Science, 1–16. https://doi.org/10.1080/02698595.2022.2057777
Blancke, S., Van Breusegem, F., De Jaeger, G., Braeckman, J., & Van Montagu, M. (2015). Fatal attraction: The intuitive appeal of GMO opposition. Trends in Plant Science, 20(7), 414–418. https://doi.org/10.1016/j.tplants.2015.03.011
Blancke, S., Boudry, M., & Pigliucci, M. (2017). Why do irrational beliefs mimic science? The Cultural Evolution of Pseudoscience. Theoria, 83(1), 78–97. https://doi.org/10.1111/theo.12109
Blancke, S., Tanghe, K. B., & Braeckman, J. (2018). Intuitions in science education and the public understanding of science. In K. Rutten, S. Blancke, & R. Soetaert (Eds.), Perspectives on science and culture (pp. 223–242). Purdue University Press.
Blancke, S., Boudry, M., & Braeckman, J. (2019). Reasonable irrationality: The role of reasons in the diffusion of pseudoscience. Journal of Cognition and Culture, 19(5), 432–449. https://doi.org/10.1163/15685373-12340068
Boudry, M., & Vlerick, M. (2014). Natural selection does care about truth. International Studies in the Philosophy of Science, 28(1), 65–77. https://doi.org/10.1080/02698595.2014.915651
Boudry, M., & Pigliucci, M. (2018). Vindicating science - By bringing it down. In K. Rutten, S. Blancke, & R. Soetaert (Eds.), Perspectives on science and culture (pp. 243–258). Purdue University Press.
Boudry, M., Blancke, S., & Pigliucci, M. (2015). What makes weird beliefs thrive? The Epidemiology of Pseudoscience. Philosophical Psychology, 28(8), 1177–1198. https://doi.org/10.1080/09515089.2014.971946
Boudry, M., Vlerick, M., & McKay, R. (2015). Can evolution get us off the hook? Evaluating the ecological defence of human rationality. Consciousness and Cognition, 33, 524–535. https://doi.org/10.1016/j.concog.2014.08.025
Boyer, P., & Barrett, H. C. (2005). Domain specificity and intuitive ontology. In D. M. Buss (Ed.), The handbook of evolutionary psychology (pp. 96–118). Wiley.
Brown, M. J. (2020). Science and moral imagination. A new ideal for values in science. University of Pittsburgh Press.
Carey, S. (2000). Science education as conceptual change. Journal of Applied Developmental Psychology, 21(1), 13–19.
Carey, S., & Spelke, E. (1994). Domain-specific knowledge and conceptual change. In L. Hirschfeld & S. A. Gelman (Eds.), Mapping the mind: Domain specificity in cognition and culture. Cambridge University Press.
Douglas, H. E. (2009). Science, policy and the value-free ideal. University of Pittsburgh Press.
Dunning, D., Meyerowitz, J. A., & Holzberg, A. D. (1989). Ambiguity and self-evaluation: The role of idiosyncratic trait definitions in self-serving assessments of ability. Journal of Personality and Social Psychology, 57(6), 1082.
Fasce, A., & Picó, A. (2019). Science as a vaccine. Science & Education, 28(1), 109–125. https://doi.org/10.1007/s11191-018-00022-0
Fernández-Villaverde, J., & Jones, C. I. (2020). Macroeconomic outcomes and COVID-19: A progress report.
Fernbach, P. M., Light, N., Scott, S. E., Inbar, Y., & Rozin, P. (2019). Extreme opponents of genetically modified foods know the least but think they know the most. Nature Human Behaviour, 3(3), 251–256. https://doi.org/10.1038/s41562-018-0520-3
Feyerabend, P. (1975). Against method. Outline of an anarchistic theory of knowledge. NLB.
Foucault, M. (1976). Histoire de la sexualité. Éditions Gallimard.
Friederich, S., & Boudry, M. (2022). Ethics of nuclear energy in times of climate change: Escaping the collective action problem. Philosophy & Technology, 35(2), 30. https://doi.org/10.1007/s13347-022-00527-1
Funkhouser, E. (2017). Beliefs as signals: A new function for belief. Philosophical Psychology, 30(6), 809–831. https://doi.org/10.1080/09515089.2017.1291929
García-Carmona, A., & Acevedo-Díaz, J. A. (2018). The nature of scientific practice and science education. Science & Education, 27(5), 435–455. https://doi.org/10.1007/s11191-018-9984-9
Gigerenzer, G., Todd, P. M., & the ABC research group. (1999). Simple heuristics that make us smart. Oxford University Press.
Goldman, A. I. (1999). Knowledge in a social world. Clarendon Press.
Haack, S. (2003). Defending science - Within reason. Between scientism and cynicism. Prometheus Books.
Hacquin, A.-S., Altay, S., Aarøe, L., & Mercier, H. (2022). Disgust sensitivity and public opinion on nuclear energy. Journal of Environmental Psychology, 80, 101749. https://doi.org/10.1016/j.jenvp.2021.101749
Hansson, S. O. (2017). Science denial as a form of pseudoscience. Studies in History and Philosophy of Science Part A. https://doi.org/10.1016/j.shpsa.2017.05.002
Hume, D. (1739–1740). A treatise of human nature: Being an attempt to introduce the experimental method of reasoning into moral subjects and dialogues concerning natural religion.
Kahan, D. (2012). Why we are poles apart on climate change. Nature, 488(7411), 255.
Kahan, D. (2015). Climate-science communication and the measurement problem. Political Psychology, 36, 1–43.
Kahneman, D. (2011). Thinking fast and slow. Farrar, Straus and Giroux.
Keren, A. (2018). The public understanding of what? Laypersons’ epistemic needs, the division of cognitive labor, and the demarcation of science. Philosophy of Science, 85(5), 781–792. https://doi.org/10.1086/699690
Kharecha, P. A., & Hansen, J. E. (2013). Prevented mortality and greenhouse gas emissions from historical and projected nuclear power. Environmental Science & Technology, 47(9), 4889–4895.
Kitcher, P. (1993). The advancement of science: Science without legend, objectivity without illusions. Oxford University Press.
Krauss, A., & Colombo, M. (2020). Explaining public understanding of the concepts of climate change, nutrition, poverty and effective medical drugs: An international experimental survey. PLoS One, 15(6), e0234036. https://doi.org/10.1371/journal.pone.0234036
Kuhn, D. (2019). Critical thinking as discourse. Human Development, 62(3), 146–164. https://doi.org/10.1159/000500171
Kuhn, D., & Modrek, A. S. (2021). Choose your evidence. Science & Education. https://doi.org/10.1007/s11191-021-00209-y
Lohse, S., & Bschir, K. (2020). The COVID-19 pandemic: A case for epistemic pluralism in public health policy. History and Philosophy of the Life Sciences, 42(4), 58. https://doi.org/10.1007/s40656-020-00353-8
Longino, H. (1990). Science as social knowledge: Values and objectivity in scientific inquiry. Princeton University Press.
Longino, H. (2002). The fate of knowledge. Princeton University Press.
Lovins, A. B. (1979). Soft energy paths: Toward a durable peace. Harper & Row.
McCauley, R. N. (2000). The naturalness of religion and the unnaturalness of science. In F. C. Keil & R. A. Wilson (Eds.), Explanation and cognition (pp. 61–86). MIT Press.
McCauley, R. N. (2011). Why religion is natural and science is not. Oxford University Press.
McIntyre, L. (2019). The scientific attitude. Defending science from denial, fraud, and pseudoscience. MIT Press.
Mercier, H. (2020). Not born yesterday: The science of who we trust and what we believe. Princeton University Press.
Mercier, H., Boudry, M., Paglieri, F., & Trouche, E. (2017). Natural-born arguers: Teaching how to make the best of our reasoning abilities. Educational Psychologist, 52(1), 1–16. https://doi.org/10.1080/00461520.2016.1207537
Mercier, H., & Heintz, C. (2014). Scientists’ argumentative reasoning. Topoi, 33(2), 513–524. https://doi.org/10.1007/s11245-013-9217-4
Mercier, H., & Sperber, D. (2017). The enigma of reason. Harvard University Press.
Merton, R. K. (1973). The sociology of science. Theoretical and empirical investigations. University of Chicago Press.
Meyerowitz-Katz, G., Bhatt, S., Ratmann, O., Brauner, J. M., Flaxman, S., Mishra, S., & Vollmer, M. (2021). Is the cure really worse than the disease? The health impacts of lockdowns during COVID-19. BMJ Global Health, 6(8), e006653.
Miton, H., & Mercier, H. (2015). Cognitive obstacles to pro-vaccination beliefs. Trends in Cognitive Sciences, 19(11), 633–636. https://doi.org/10.1016/j.tics.2015.08.007
Oreskes, N. (2019). Why trust science? Princeton University Press.
Osborne, J. (2014). Teaching scientific practices: Meeting the challenge of change. Journal of Science Teacher Education, 25(2), 177–196. https://doi.org/10.1007/s10972-014-9384-1
Partanen, R., & Korhonen, J. M. (2020). The dark horse: Nuclear power and climate change. National Library of Finland.
Pennock, R. T. (2019). An instinct for truth. Curiosity and the moral character of science. MIT Press.
Rauch, J. (2021). The constitution of knowledge. A defense of truth. Brookings Institution Press.
Rouse, J. (2015). Articulating the world: Conceptual understanding and the scientific image. University of Chicago Press.
Schmid, P., & Betsch, C. (2019). Effective strategies for rebutting science denialism in public discussions. Nature Human Behaviour, 3(9), 931–939. https://doi.org/10.1038/s41562-019-0632-4
Scholderer, J., & Frewer, L. J. (2003). The biotechnology communication paradox: Experimental evidence and the need for a new strategy. Journal of Consumer Policy, 26(2), 125–157. https://doi.org/10.1023/A:1023695519981
Shtulman, A. (2017). Scienceblind: Why our intuitive theories about the world are so often wrong. Basic Books.
Sinatra, G. M., & Hofer, B. K. (2016). Public understanding of science: Policy and educational implications. Policy Insights from the Behavioral and Brain Sciences, 3(2), 245–253. https://doi.org/10.1177/2372732216656870
Spelke, E. S. (1990). Principles of object perception. Cognitive Science, 14(1), 29–56. https://doi.org/10.1207/s15516709cog1401_3
Storr, W. (2021). The status game. William Collins.
Swire-Thompson, B., DeGutis, J., & Lazer, D. (2020). Searching for the backfire effect: Measurement and design considerations. Journal of Applied Research in Memory and Cognition, 9(3), 286–299. https://doi.org/10.1016/j.jarmac.2020.06.006
van Prooijen, J.-W., & Van Vugt, M. (2018). Conspiracy theories: Evolved functions and psychological mechanisms. Perspectives on Psychological Science, 13(6), 770–788.
Weisberg, D. S., Landrum, A. R., Metz, S. E., & Weisberg, M. (2018). No missing link: Knowledge predicts acceptance of evolution in the United States. BioScience, 68(3), 212–222. https://doi.org/10.1093/biosci/bix161
Williams, D. (2020). Socially adaptive belief. Mind & Language. https://doi.org/10.1111/mila.12294
Wolpert, L. (1992). The unnatural nature of science. Faber and Faber.
Wood, T., & Porter, E. (2016). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41, 135–163. https://doi.org/10.1007/s11109-018-9443-y
Ziman, J. (1968). Public knowledge. The social dimension of science. Cambridge University Press.
We want to thank Patrick Loobuyck and four anonymous reviewers for their helpful comments.
Conflict of Interest
The authors declare that they have no conflict of interest.
Blancke, S., Boudry, M. “Trust Me, I’m a Scientist”. Sci & Educ 31, 1141–1154 (2022). https://doi.org/10.1007/s11191-022-00373-9