Medical conspiracy theories: cognitive science and implications for ethics

Abstract

Although recent trends in politics and media make it appear that conspiracy theories are on the rise, they have in fact always been present, probably because they are sustained by natural dispositions of the human brain. This is also the case with medical conspiracy theories. This article reviews some of the most notorious health-related conspiracy theories. It then examines, using concepts from cognitive science, why people believe these theories. On the basis of that knowledge, the article makes normative proposals for how public health officials and health workers as a whole can deal with conspiracy theories, in order to preserve some of the fundamental principles of medical ethics.

Introduction

Conspiracy theories are narratives about events or situations that allege there are secret plans to carry out sinister deeds. Although narratives of this kind have existed for a long time, scientists are only beginning to understand why people come to believe them in the first place. The medical world is not spared these dynamics. Conspiracy theories can do a great deal of harm, and that is why there is an urgent need to study them. By better understanding how they arise and spread, we can begin to propose concrete measures to better educate the public and prevent it from being captivated by these narratives.

In recent times, celebrities in Western media have shown interest in conspiracy theories. It is not entirely clear whether celebrities' beliefs in conspiracy theories are genuine or simply publicity stunts. When it comes to medical conspiracy theories, some celebrities do seem to honestly believe them. Jenny McCarthy (Gottlieb 2016), Jim Carrey (Bearman 2010), Robert De Niro (Sharfstein 2017) and Bill Maher (Parker-Pope 2009) have been very vocal in their opposition to vaccines and in alleging links between vaccines and autism.

Although celebrity culture has been nourishing conspiracy theories for some decades, it appears that more recent political events have increased the popularity of conspiracy theories in the Western world. As early as 1964, Richard Hofstadter (2012) studied the so-called "paranoid style" in American politics. But many observers converge on the idea that Donald Trump's political ascendancy has marked a new era of conspiratorial thinking in the United States and other countries under its sphere of influence (Hellinger 2019). Before becoming a politician, Trump was a celebrity, and in both roles he has influenced his followers and made conspiracy theories more popular. He is on record giving credibility to numerous conspiracy theories: global warming is a hoax, Barack Obama was not born in the United States, Rafael Cruz participated in J.F. Kennedy's assassination, Bill Clinton ordered the assassination of Vince Foster, Antonin Scalia was murdered, vaccines cause autism.

But even if celebrity influence is a more modern phenomenon, it is nevertheless true that conspiracy theories have been present throughout history. In fact, as documented by Uscinski and Parent (2014), conspiracy themes have been persistent in American opinion for more than a century. Going even further back, Joseph Roisman (2006) has documented how the rhetoric of conspiracy was already prominent in Ancient Greece, and Suetonius' telling of the lives and times of Rome's first twelve Caesars is also filled with all sorts of conspiracy theories and rumors. Anthropologists have documented conspiracy theories in peoples as diverse as the Yanomami (Chagnon 1983) and the Azande (Evans-Pritchard 1963). The fact that even hunter-gatherers (Von Rueden and van Vugt 2015) have conspiracy theories seems to indicate that this is indeed a universal phenomenon (West and Sanders 2003).

Therefore, even though particular social contexts may magnify the prevalence of conspiracy theories, it is well established that conspiracy theories have deep psychological bases that are present in all human beings. In this article, I shall rely on principles of cognitive science to attempt to understand why people believe conspiracy theories. This is an important endeavor in relation to medical ethics, because some of the most prominent conspiracy theories pertain to medicine. As documented by Oliver and Woods, the percentage of Americans accepting medical conspiracy theories is alarmingly high: for example, only 44% disagree with the claim that doctors still want to vaccinate children even though they know vaccines are harmful; 37% agree that the FDA refuses to release the cure for cancer; and only 46% disagree that fluoridation is a secret plot to poison people (Oliver and Woods 2014). This has important implications, as exposure to medical conspiracy theories influences health behaviors (Jolley and Douglas 2014).

Consequently, medical ethicists and public health officials must find a way to overturn medical conspiracy theories. In order to do this, we must come to an understanding of why people believe these theories in the first place. This will allow us to take proper steps to design public policies so that acceptance of conspiracy theories remains limited and the public is better informed, making decisions on the basis of informed consent. This would satisfy the principle of autonomy in medical ethics, as well as other ethical principles specific to public health, such as public engagement and communication (Heldman et al. 2013).

Therefore, the aim of this article is to review the existing literature on the psychological and sociological reasons why people believe in conspiracy theories, on the basis of the findings of cognitive science. By "belief", we shall understand the affirmation that something is true, regardless of its rationale; the concept may also relate to personal attitudes concerning particular claims about the world. A further aim of this article is to draw out some ethical implications and provide an exploratory framework for the ethical design of public policies that attempt to eradicate some of the most prominent conspiracy theories in healthcare. This is a particularly innovative approach, since there is much theoretical material on the workings of conspiracy theories, but very little on how these theoretical approaches help us understand conspiracy theories specific to the medical realm.

Medical conspiracy theories: a brief review

Health-related conspiracy theories are not necessarily a new phenomenon. To understand how they have come to be, we first need a working definition. We can provisionally define them as attempts to explain particular events or situations as the result of the actions of a small, powerful group with perverse intentions. That does not imply that conspiracy theories are necessarily false, because in some cases small, evil groups have indeed conspired to bring about unfortunate situations. But for the most part, conspiracy theories rely on sloppy thinking, and they present scenarios that are not accurate. In the medical world, this has been a constant.

Edward Jenner's discovery of the vaccine against smallpox is a major milestone in the history of medicine, but this event also marked the beginning of a new wave of conspiracy mongering (Dube et al. 2015). Public opinion did not properly understand how vaccines work, and soon enough there were rumors that taking vaccines would make people grow horns (it must be remembered that the vaccine originated with cows), or that vaccines would actually kill people.

Ever since, conspiracy theories regarding vaccines have remained popular in public opinion. In the 1980s, Dr. John Wilson made a great fuss about the DPT vaccine allegedly causing convulsions and cerebral damage (Dyer 1987). In 1998, Andrew Wakefield published an article claiming that the MMR vaccine is linked to autism. Although this paper was thoroughly refuted, retracted from the journal where it was published, and disavowed by most of Wakefield's co-authors, it ultimately unleashed a new wave of moral panic against vaccination (Goldacre 2008). In recent years there have been occasional outbreaks of measles in affluent areas, and the main factor seems to be that parents choose not to vaccinate their children, out of fear that they may turn out autistic. In this conspiracy theory, pharmaceutical companies know that vaccines are not safe, but because they are still making big profits on them, they deliberately keep this information hidden.

Conspiracy theories about vaccines have become even more popular in non-Western countries. For example, Pakistan is one of only three remaining countries where polio has not been eradicated. In the early 1990s, the annual incidence of polio in Pakistan was about 20,000 cases per year. This was due to a failure in vaccination campaigns. In Pakistan, there is a persistent conspiracy theory that the polio vaccine is a ploy designed by the CIA to make Muslim men sterile (Andrade and Hussain 2018).

Narratives about the origins of viruses are very popular in conspiracy theories. AIDS, in particular, has been interpreted as an invention of the US government to reduce black populations. This theory is notoriously believed by African American men, and as a result, they tend to use condoms less frequently (Bogart and Bird 2003). In fact, in this population, the belief that birth control methods are a plan for genocide is also prevalent, further reducing the use of condoms for safer sex practices (Thorburn and Bogart 2005).

Apart from the narrative about the origins of AIDS, there is also the conspiracy theory according to which HIV does not cause AIDS, and antiretroviral medication is actually the culprit behind most deaths of AIDS patients. This conspiracy theory is particularly popular in sub-Saharan Africa, with various prominent politicians giving it credit and promoting it (Fourie and Meyer 2010). This form of AIDS denialism has caused considerable damage in Africa, and it is of urgent epidemiological concern.

Another virus that frequently draws the attention of conspiracy theorists is Ebola. Conspiracy theorist Leonard Horowitz has been very active in promoting the idea that Ebola was manufactured by the US government, and as a result, some of his followers have recommended not vaccinating children against any disease whatsoever (Knight 2013). SARS and COVID-19 have also been discussed in conspiracy theory circles, either as biological weapons against the Chinese, or as inventions of the Chinese government.

The trope that big pharmaceutical companies have the cure for cancer or other deadly diseases, yet do not release it (either to make profits or simply as population control), is also persistent in conspiracy theories. Likewise, some alternative therapies for cancer have been proposed, and despite the lack of evidence in their support, many conspiracy theorists claim that they are effective and that the scientific establishment conspires against them. This has especially been the case with Laetrile, a synthetic form of amygdalin that many conspiracy theorists have defended as a cure for cancer, advocating it as a replacement for more effective treatments (Ernst 2019).

Issues of substance abuse have also been the subject of various conspiracy theories. It is frequently alleged that marijuana is a safe drug that was only outlawed under pressure from the paper industry, as the hemp plant was a competitor. By contrast, conspiracy theorists typically accept that cocaine is a dangerous drug, but many believe that the crack cocaine epidemic across the United States in the 1980s was actually due to a US government plan to specifically target African Americans and keep them addicted, while at the same time profiting from the illegal trade to finance paramilitary groups in Nicaragua (Webb 2019).

Cognitive science of medical conspiracy theories

It should be noted that, to date, there is no single explanation for conspiracy theories. There are multiple correlations and explanations of particular aspects of conspiracy beliefs, but not necessarily a coherent whole that theoretically encapsulates all conspiracy theories. In this section, I shall approach some important findings of cognitive science pertaining to conspiracy theories, but it is important to keep in mind that these do not necessarily constitute a unified theoretical approach, because this is still a developing field.

Nevertheless, there is a unifying thread in the approaches that will be addressed. That thread is an explanatory framework for why people believe in conspiracy theories, on a neuroscientific, psychological, and sociological level. Even though these approaches may come from different theoretical perspectives, they complement each other, to the extent that they establish correlations among variables and offer some measure of predictive factors regarding the proclivity to believe conspiracy theories. Inasmuch as the disposition to believe conspiracy theories has some firm biological grounding, I shall approach the way evolutionary theory accounts for specific psychological mechanisms that, in human evolution, primed us to believe things like conspiracy theories. But, given that conspiracy theories are further developed on account of environmental factors, I shall also examine studies that relate to environmental variables (both psychological and sociological) that facilitate the rise of conspiracy theories.

Conspiracy theories spread very easily. Although the technological advance of social media plays a significant role in their dissemination, it is still true that, even in a preindustrial world without media technologies, conspiracy theories spread easily by word of mouth. Conspiracy theories rely on rumor, and cognitive science has produced significant research documenting how gossiping is hardwired in human brains (Rosnow 1991).

When scientists say that something is "hardwired" into the brain, they mean that particular beliefs or behaviors are constant in the human species because they arise from predetermined arrangements of physical connections between nerve cells (Ottersen and Helm 2002). Now, it is important to keep in mind that rumoring by itself is not the same as a conspiracy theory. Rumoring is a daily affair in human behavior. Humans are hardwired to spread rumors, but not necessarily to form conspiracy theories. For conspiracy theories to arise, there probably need to be additional psychological and sociological environmental conditions that facilitate their development. But these developments are ultimately built on the biological basis of the brain, which primes humans for activities such as gossiping.

Gossiping was an important adaptation in human evolution (McAndrew and Milenkovic 2002). Robin Dunbar defends the view that gossip itself was the main factor in the origin of language (Dunbar 1996). In fact, it is estimated that 80% of conversations are about other human beings. Hominids likely needed to form bands as a way to ensure survival, and their greatest threats came not only from predators, but also from other bands. Thus, in order to ensure group alliance as a defense against others, constant gossip served as a way to cement bonds and exclude individuals perceived as dangerous.

It is thus to be expected that, when considering the causes of particular health problems, human beings will always have the inclination to talk about other human beings in relation to these problems. Consequently, the conversation turns more interesting if the culprits of diseases are not just microorganisms, cancer cells or unhealthy foods, but rather other human beings. And, since these are initially rumors about other people, they will ultimately spread rather quickly.

The rise of electronic technology, and most especially the internet, is also a considerable factor in more recent conspiracy theories (Clarke 2007). In natural conditions, rumor is prevalent but still limited, because communication relies on proximity. However, with the rise of the internet, the effects of rumor have been further potentiated, because conspiracy theorists are now able to connect via forums with people in more distant locations, thus reinforcing their worldviews (Wood 2013).

Although the discipline of memetics has come under sustained criticism, research into how ideas spread has made important advances (Blackmore 1999). One particularly useful tool in the study of how ideas stick comes from the cognitive science of religion: the notion of minimally counterintuitive concepts. It has been established that belief in conspiracy theories is associated with intuitive rather than analytic thinking (Swami et al. 2014). But conspiracy theories are more popular if they retain an element of minimal counterintuitiveness. As Boyer explains this notion, concepts that violate a few ontological expectations of a category are more memorable than either fully intuitive or maximally counterintuitive concepts (Boyer 1994). Religious concepts such as fairies, demigods, healing powers, miracles, and so on, are more easily remembered because they step out of the ordinary and defy the way things happen conventionally. In the same manner, conspiracy theories stick relatively easily because they involve concepts that are not so common: there is no expectation that, every day, evil scientists in labs manufacture deadly viruses, or that dentists advance water fluoridation in order to make people more stupid. This appeal to counterintuitive concepts ultimately leads conspiracy theorists to frequently make contradictory claims. For example, research shows that many people who believe Princess Diana staged her own death also believe she was killed by the royal family (Wood et al. 2011). Likewise, medical conspiracy theories frequently claim that pharmaceutical companies profit from making people buy ineffective cures for cancer (thus letting people die), yet at the same time keep them alive so that they continue to be clients.

Yet, concepts that are too strange (maximally counterintuitive) do not stick either. That is how Slone (2004) explains why people frequently defend “theologically incorrect” views that, although not approved by official doctrinal teachings of religions, make more sense on an intuitive level. In one particularly useful study, Norenzayan et al. (2006) document how minimally counterintuitive narratives are more easily remembered by subjects.

This also applies to conspiracy theories. Medical conspiracy theories are most likely false, but they are not outrageously bizarre. On the surface, they do have some level of plausibility, especially taking into account that some medical conspiracy theories have turned out to be true. A greater proportion of African Americans falsely believe that AIDS was designed by the US government to reduce their population, but it is not false that the US Public Health Service did engage in human experimentation on African American males in the infamous Tuskegee Study of Untreated Syphilis of the 1930s–1970s. It is false that vaccines cause autism, but it is true that vaccinations in St. Louis caused the death of 13 children in 1901. It is likely false that the US government conspired to get African Americans addicted to crack cocaine, but it is true that the CIA carried out experiments with LSD to test mind control in the MK-Ultra project. And the list of real medical conspiracies does not end there. Moreno (2013) provides an extensive list of secret State experiments on humans throughout history, and there have been plenty of documented cases of unethical human experimentation and reckless medical procedures (McNeill 1993). These serve as foundations for conspiracy theorists to elaborate on the basis of factual information, which ultimately makes their claims more intuitive.

Mark Fenster even believes that conspiracy theories may serve an ethical purpose: in democratic societies where public opinion is a force to be reckoned with, they hold potential conspirators in check (Fenster 1999). However, the evidence more strongly suggests that conspiracy theories have numerous detrimental effects, both social and psychological. On a social level, conspiracy theories are empirically associated with populism (Silva et al. 2017), political extremism (Van Prooijen et al. 2015), and radicalization of fringe groups (Bartlett and Miller 2010). On a psychological level, research shows that belief in conspiracy theories is associated with paranoia (Darwin et al. 2011), schizotypy, narcissism (Cichocka et al. 2016a, b) and insecure attachment (Green and Douglas 2018a, b). In fact, ever since Hofstadter's seminal The Paranoid Style in American Politics, the assumption has been that conspiracy theorists suffer from a form of psychopathology associated with paranoia.

It is true that some influential conspiracy theorists have been markedly paranoid. For example, Nesta Webster notoriously never answered her door without carrying a gun. However, the consensus now is that most conspiracy theorists are not pathological, precisely because their beliefs ultimately rely on cognitive tendencies that are neurologically hardwired and probably have deep evolutionary origins. Paranoia works on a personal level (the individual feels personally attacked), whereas conspiracy theories are about threat perception as a group (Van Prooijen and Van Lange 2014). And even if conspiracy theorists do feel paranoid on a personal level, this is not necessarily pathological, as paranoid traits exist on a continuum in the general population (Bebbington et al. 2013).

Conspiracy theories are not so much explained by paranoia as by natural inclinations towards agency detection. Stewart Guthrie's (1995) cognitive science of religion is relevant in this regard: according to his theory, religious beliefs come mostly as a result of the human brain's tendency to attribute agency and detect patterns, usually in the form of anthropomorphism. The same principle applies to conspiracy theories. It has been empirically established that the tendency to detect agency in inanimate stimuli predicts belief in conspiracy theories (Imhoff and Bruder 2014). Evolutionarily, agency detection was an important advantage, as error management theory would predict, under the principle of "better safe than sorry" (Haselton 2000). As Gray and Wegner (2010) explain this principle, under the threat of predators, "the high cost of failing to detect agents and the low cost of wrongly detecting them… [suggests] that people possess a Hyperactive Agent Detection Device, a cognitive module that readily ascribes events in the environment to the behavior of agents". In medical conspiracy theories, unfortunate things (such as, say, the outbreak of some virus) cannot just happen without a purpose. Some agent must be behind it. And thus, instead of accepting that AIDS spread because of contact with chimpanzees in Africa, conspiracy theorists are better satisfied attributing agency to the whole phenomenon, preferring to believe that some cabal actually designed the deadly virus.

Heider and Simmel's (1944) famous experiment with purposeless movements of geometric shapes demonstrated that most subjects tend to attribute intentions and agency to those shapes. Developmental psychologists have long asserted that teleological thinking is deeply entrenched in preschool children (Kelemen 1999), and understanding randomness requires more mature cognitive functions that not all human beings develop to the same extent. It has been documented that believers in conspiracy theories are even more likely to detect nonexistent patterns in random data (Van Prooijen et al. 2018). In fact, many conspiracy theorists acknowledge that their work is mostly about "connecting the dots", as in David Icke's Dot Connector video series. Indeed, conspiracy theories frequently fall under the category of "monological belief systems" (Hagen 2018), i.e., sets of interconnected ideas that are mutually reinforcing. Thus, medical conspiracy theories are frequently not just about health issues. They are embedded in a grander scheme of things, and usually involve the typical suspects (Masons, Illuminati, etc.), as well as greater conspiratorial themes, such as the New World Order and population control. This has been very typical of Nancy Turner Banks (2010), a prominent writer on medical conspiracy theories, who frequently brings Jews into her explanations. Conspiracy theorists try to make sense of the world by providing overly simplistic explanations of phenomena. This is usually done by focusing exclusively on one single idea, and explaining everything else on the basis of that idea.

The monological aspect of conspiracy theories has two important implications. First, inasmuch as everything is connected in a grand conspiracy, the single best predictor of belief in one conspiracy theory is belief in a different conspiracy theory (Goertzel 1994). And second, inasmuch as conspiracy theories reinforce each other, they ultimately become incorrigible: evidence against them is interpreted as evidence of a conspiratorial effort to suppress them (Grimes 2016), thus confirming the original conspiracy theory. This ultimately becomes a form of cognitive dissonance reduction. As documented by Festinger (1957) in a famous study, whenever individuals strongly adhere to beliefs that turn out not to be true, this causes discomfort. But only rarely will individuals acknowledge that they are in error. More frequently, individuals will accommodate that discomfort by adjusting the original belief, so that the evidence against it can be reinterpreted as confirming it.

It is also true that, inasmuch as conspiracies are embedded in a grand scheme of things, there is a tendency to explain big events with big causes. This is the so-called proportionality bias. Conspiracy theorists cannot accept that something as big and deadly as the AIDS epidemic came out of something as trivial as casual contact between chimpanzees and humans. In their minds, such a big phenomenon needs bigger causes, such as evil scientists designing HIV to wipe out specific populations. Rob Brotherton (2013) neatly observes that there are countless conspiracy theories about JFK's assassination, but very few about the assassination attempt on Ronald Reagan. The difference between the two cases reflects this proportionality bias: in the first case the president was assassinated, so big explanations are sought; in the second case the president survived, so conspiracy theories about that event were soon left in oblivion.

This is also the case with explanations for medical phenomena. In one particular study, Ebel-Lam et al. (2010) found that when subjects read about a disease outbreak that does not lead to deaths, they are less likely to believe that the outbreak was intended; by contrast, when another group of subjects read a story in which the outbreak does result in deaths, they are more likely to attribute it to a conspiracy.

Piaget and Inhelder (2008) documented how children in preoperational stages rarely believe that accidents just happen. This suggests that, intuitively, we are intention seekers. In fact, Evelyn Rosset's (2008) empirical studies demonstrate that subjects pressed for time are more likely to explain things with greater intentionality bias, because under time pressure intuition overtakes analytical thinking. Likewise, for conspiracy theorists, there are no accidents. In their mindset, bad things always come from bad agents. Many medical procedures may have minor unfortunate side effects, but conspiracy theorists have trouble understanding that these side effects are not necessarily intended.

Therefore, conspiracy theorists have more difficulty accepting coincidences, and may struggle with the idea that events that superficially appear connected in fact are not. For example, most symptoms of autism are first observed when the child turns three years old. This is slightly after children usually receive the MMR vaccine. Consequently, by "connecting the dots", conspiracy theorists fall prey to the post hoc ergo propter hoc fallacy ("after this, therefore because of this"), and erroneously come to believe that, simply because the MMR vaccine antecedes the first symptoms of autism, the former causes the latter.

Humans are natural intention seekers, but this tendency is especially enhanced under conditions of anxiety. In one particular study, subjects under anxiety were more prone to perceive patterns in random sequences of dots (Brotherton 2013, p. 7). As Malinowski (1992) theorized, magical thinking becomes especially preponderant in the face of uncertainty. Superstitious behavior, which also "connects the dots" by establishing causal relationships among unrelated phenomena, becomes more prominent in times of difficulty. Thus, it is to be expected that conspiracy theories abound more in times of crisis, and in marginalized populations that face greater challenges.

This has been empirically confirmed. Conspiracy theories appear more frequently in the contexts of fires, floods, epidemics and wars (McCauley and Jacques 1979). Feelings of powerlessness also predict conspiracy beliefs (Abalakina-Paap et al. 1999). People who make a connection between vaccines and autism are frequently parents of autistic children themselves. There is no known cause or cure for autism, so in those cases, feelings of powerlessness are considerable, and this feeds further into the theory that there is a conspiracy at play. By contrast, diabetes has well-established causes and better prospects of treatment; consequently, little conspiracy mongering surrounds this disease.

Likewise, empirical studies indicate that conspiracy beliefs are particularly high among members of stigmatized minority groups (Davis et al. 2018). White Christian Americans are not likely to argue that the US government is out to make them sterile, presumably because they are not stigmatized; this sort of claim is more likely made by African Americans or Muslims, who feel the heat of discrimination more closely.

Anxiety-provoking situations more easily elicit so-called "illusions of control", and that partly explains how magical thinking arises in difficult situations, as a way of attempting to control the world. In this regard, conspiracy theories also operate similarly to religions, as explanations for incomprehensible phenomena. It is relatively hard to understand how fluoride helps prevent cavities; in the face of the anxiety evoked by this lack of knowledge, an easier explanation is simply to say that fluoridation is actually an evil Communist plot to destroy America. Conspiracy theories therefore provide an "illusion of explanatory depth" (Rozenblit and Keil 2002), and the best way for conspiracy theorists to assure themselves that they are on the right explanatory track is by constantly engaging in confirmation bias (Klayman 1987).

Additionally, Dan Sperber argues that even when religious believers (or conspiracy theorists) are aware that their theories do not sufficiently explain the phenomena they address, they are guided by "meta-representations": they delegate the filling in of details to experts (Sperber 2000), and continue to hold their beliefs.

Evolutionarily, anxiety was an important adaptation, and consequently it is no surprise that human beings are hardwired for constant anxious feelings and behaviors. The sympathetic nervous system activates the fight-or-flight reaction, and this was surely an adaptive mechanism in the face of predators and other threats. Neuberg et al. theorize that human brains are equipped with "threat management systems" that, much as error management theory would predict, condition humans to constantly focus on, and react to, things that may pose dangers (Neuberg et al. 2010). Unsurprisingly, we react more quickly to snakes than to flowers, as has been empirically documented (Ohman et al. 2001). This particular mind module has facilitated the avoidance of diseases: we recognize danger in germs (only intuitively, of course, as a formal theory of microorganisms only came to be in the 19th century), so we avoid excrement, even in cases when we know it is just fudge, as has been empirically tested (Rozin and Fallon 1987). However, this threat management system frequently backfires by interpreting as dangerous situations that, in fact, are not. That is how we come to believe that fluoridation, antiretrovirals, vaccines, etc., are dangerous.

Perhaps even more so than predators and germs, other human beings also represented significant dangers in human evolution. Human beings naturally form coalitions against other human beings, and tribal feelings easily arise (McDonald et al. 2012). Similar patterns have been observed in chimpanzees (Wrangham 1999). Thus, the capacity to detect alliances and figure out how outsiders band together against one’s own group was a very important adaptation. As Tooby and Cosmides (2015) put it, human brains are equipped with an “alliance detection” system. Conspiracy theories put this system into play: in their mental patterns, conspiracy theorists bring together unrelated people and conclude that they are forming an alliance behind closed doors, planning to harm a particular collective. For the most part, physicians are unrelated to politicians, but given that physicians are frequently perceived as a group in its own right (and indeed they are, insofar as they are professionally organized as such), conspiracy theorists align them with other outsiders and imagine that they form coalitions to plot against patients.

Alliance detection systems enhance an “us-against-them” mentality (Cikara et al. 2011). This is of course very typical of nationalism, and unsurprisingly, it has been empirically established that conspiracy theories are related to “collective narcissism” (Cichocka et al. 2016a, b). In European and Middle Eastern history, Jews have frequently been suspected of being disloyal to the countries in which they live (“rootless cosmopolitans”), and that is presumably one additional reason why they are frequently included in medical conspiracy theories. Scapegoating also plays a significant role in conspiracy mongering (Girard 1986). Inner divisions and difficulties can be channeled towards an outsider, who takes the blame for the community’s problems. African leaders have failed to control the AIDS epidemic, but in order to divert blame and cement group unity, some have opted to engage in conspiracy mongering by attributing the origin of the epidemic to outside conspirators, whoever they may be.

Ethics and implications for policy

Although some philosophers have attempted an ethical defense of conspiracy theories (Dentith 2014), mostly on the basis that they keep a healthy democratic check on powerful elites and that some conspiracy theories have turned out to be true, it is safe to argue that conspiracy theories do more harm than good. As previously mentioned, conspiracy theories have deleterious social and psychological effects, and especially in the medical realm, they lead to poor health behaviors. So, it can be assumed that there is an ethical duty for physicians and public health officials to attempt to mitigate medical conspiracy theories. But how? The answer is not so clear, although the preceding information and arguments may provide some guidance.

First, it is important to acknowledge that conspiracy theories are not necessarily pathological, and that they rely on evolved mental mechanisms that are hardwired in human brains. Consequently, public health officials can never hope to entirely eradicate medical conspiracy theories, and when they encounter them, they must patiently attempt to refute them, without disrespecting those who defend them, because conspiratorial thinking is, after all, quite natural.

As argued above, given their adherence to monological belief systems, conspiracy theories are frequently incorrigible, and attempts at refutation with convincing evidence would presumably be interpreted as confirmation of the original conspiracy theory. This is known as the “backfire effect” (Nyhan and Reifler 2010). For example, one particular study found that showing vaccine skeptics a story about a baby who is hospitalized because of measles nearly doubled the proportion of skeptics who thought it very likely that vaccines have serious side effects (Nyhan et al. 2014).

It would then appear that greater levels of education are useless in countering conspiracy theories. On one level, this appears to be true. Bogart and Thorburn document that higher levels of education do not necessarily protect against acceptance of conspiracy theories. In fact, especially in a medical context, greater education may increase adherence to conspiracy theories, because individuals can reaffirm their suspicions by learning about real conspiracies, as appears to be the case with African Americans who learn about the Tuskegee syphilis experiment (Nelson et al. 2010).

However, on the whole, education does predict decreased belief in conspiracy theories, as has been empirically examined with larger sets of data (Van Prooijen 2016). Recall that conspiracy theories rely more on intuitive (and also minimally counterintuitive) approaches. So, as thinking becomes more analytical and less intuitive, conspiracy theories make less sense. In fact, more powerful than the “backfire effect” is the “elusive backfire effect”: people do abandon conspiracy thinking once they encounter its inconsistencies and lack of evidence (Wood and Porter 2019). This has been especially true in health-related contexts. Health information campaigns do turn out to be successful, and they are effective in correcting the distortions of conspiracy theories (Bode and Vraga 2018).

So, one important implication of this analysis is that health literacy, critical thinking, and general education as a whole can reduce belief in conspiracy theories. Public health officials need to keep this in mind when designing public policy, and physicians need to be prepared to act as educators as a complement to their clinical role.

Cognitive science has established some concrete parameters for making communication campaigns more effective, especially when they pertain to medical conspiracy theories. One important feature of this approach is an emphasis on rhetorical tools that rely less on the emotional centers of the brain. Scare tactics have long been discouraged in public health campaigns, although occasionally they have been tried, with mixed results. For example, a 1997 campaign in Australia used graphic images to frighten audiences in anti-smoking advertisements, with seemingly positive results (Hill et al. 1998). But more extensive research suggests otherwise. Backer et al. (1992) have conducted extensive studies showing that techniques of this kind, such as displaying the effects of tobacco on dentition, are generally not effective.

Cognitive science shows that information processed in the amygdala (as in the fear response) is received differently, without due rational consideration (Dolan and Vuilleumier 2003). Consequently, when the dangers of, say, not vaccinating children are presented with stark images of children suffering from measles, subjects typically fail to process the message that the public health campaign may be trying to convey. Paradoxically, subjects may continue to engage in the very behavior that public health officials aspire to eradicate.

Given the “elusive backfire effect”, it is more useful for policy makers to design campaigns that engage the rational aspect of information processing when attempting to address medical conspiracy theories. The excessive use of catastrophic scenarios (say, a measles epidemic as a result of not vaccinating children) may push individuals towards anxious attachment. Magnetic resonance imaging studies suggest that individuals with higher levels of anxious attachment show significantly increased amygdala activity (Riem et al. 2012), and one important study found that anxious attachment predicts a general tendency to believe conspiracy theories (Green and Douglas 2018a). Therefore, the use of disturbing material to address conspiracy theories (even in an attempt to refute them) may further contribute to people accepting such theories.

One relevant contribution of cognitive science to the design of public health campaigns is the analysis of frame, appeal type, and outcome extremity in relation to the way public information is processed by the brain. In the case of information campaigns addressing medical conspiracy theories, these three elements must be considered, so as to get a clearer picture of what is to be achieved, and to what effect. Various studies have shown that loss-framed messages with more extreme outcomes are more likely to be remembered (Leshner and Cheng 2009); this implies that information campaigns addressing medical conspiracy theories should include more extreme outcomes. In concrete terms, if a campaign is to address, say, a conspiracy theory regarding HIV denialism, the message should sufficiently emphasize the details that the theory fails to explain.

In their educational efforts, public health officials also need to clarify things and make themselves understood. Recall that conspiracy theories frequently fill explanatory gaps and serve as heuristics to reduce anxiety in the face of the unknown. In this regard, educational campaigns addressing medical conspiracy theories must make sure to include the rationale for addressing the conspiracy theories in the first place, as well as the use of communication-persuasion matrices (McGuire 1984). In particular, the use of visual aids has proven crucial in health campaigns, as confirmed in various studies (Garcia-Retamero and Cokley 2013), and they should prove especially apt for refuting conspiracy theories. Visual aids rely on intuition (Weitlaner et al. 2013), and recall that conspiracy theories also arise out of intuitive thinking. So the message that runs counter to conspiracy theories must be presented in a similarly intuitive manner; otherwise, it will not be able to compete for the public’s attention.

In order to ensure that the public is getting the right message, public health officials must consider lobbying for more advertising campaigns in the media. Some might fear that discussing a conspiracy theory would introduce it to people who had never considered it in the first place. But, as the principles of cognitive science discussed above suggest, if the theory is properly addressed with sufficient persuasive power, bringing up the topic may even put people on guard, leaving them better cognitively prepared when they encounter conspiracy theories for the first time.

Furthermore, apart from advertising campaigns, mandatory screenings of short films whenever citizens have to comply with State requirements (school registration, acquisition of a driver’s license) can also prove effective in raising awareness of the need to disavow medical conspiracy theories. This approach has proven to work in vaccination campaigns, as well as in signing up for organ donation (Evers et al. 1988).

Another important aspect of any health literacy campaign addressing conspiracy theories is a reliance on a more thorough understanding of what people believe, and the reasons they offer for doing so. The use of focus groups is very important in this regard. For example, prior to targeting African Americans in a public health campaign explaining why it is important for them to seek preventive medical care, it is important to form focus groups so as to hear from them what they know and think about the Tuskegee syphilis experiment. In fact, research of this kind has been done with focus groups (Freimuth et al. 2001), and it has been found that, although subjects are aware of the incident, they do not understand the full details. Hearing from subjects themselves places public health officials in a better position to address the particular concerns that members of disadvantaged communities may have, and to specifically target aspects that lend themselves to misinterpretation and, consequently, to distortion in conspiracy theories.

Cognitive science shows that focus groups are particularly important, given the powerful effect of information transmission within communities (Acocella 2012). Recall that conspiracy theories are related to gossiping, for the same evolutionary reasons. Group dynamics lower inhibitions about voicing opinions regarding conspiracy theories (Kitzinger 1995), so researchers can get a better grasp of which ideas are more likely to spread. On the basis of this information, public health officials can target particular ideas in their health literacy campaigns, placing educational efforts on those aspects that most frequently arise in focus group discussions.

Likewise, conspiracy theories are typically defended by the dispossessed and by individuals who feel powerless. It has long been established that large social and economic inequalities lead to suspiciousness and collective paranoia (Swami and Coles 2010). Groups that find themselves at the lower end of the socio-economic scale begin to wonder how they got there in the first place, and they inevitably conclude that they have been cheated in a conspiracy.

One particularly influential study is informative in this regard. Foster (1974) studied how rural communities in Mexico become resentful whenever someone acquires a greater share of land. In these communities’ worldview, land is a “limited good”, and therefore, whoever increases their share must have done so on the basis of some conspiracy. Eventually, the more prosperous landowners are accused of using witchcraft. This case clearly expresses how dispossession and powerlessness may lead to conspiracy mongering.

Given that inequality and powerlessness are significant causes of conspiracy mongering, policy designers must address this problem. One particularly effective approach is wealth redistribution through universal service policies (Mueller 1999). Yet, paradoxically, political attempts to increase universal access to health care may in turn further feed conspiracy theories. For example, the Affordable Care Act (Obamacare) in the United States played into the hands of conspiracy theorists who were already suspicious of Obama’s background and intentions (Quadagno 2014). In fact, it has been empirically shown that there are significant misperceptions of Obamacare among the American public (Pasek et al. 2015), which makes it all the more necessary to address these misconceptions before they turn into conspiracy theories.

Increased access to health care does predict lower adherence to conspiracy theories, and for that reason, any attempt to eradicate conspiratorial thinking regarding medical issues must rely on making universal healthcare more expansive. Expanding a safety net is an ambitious goal, and may be more of a political talking point than an actual concrete proposal by public health officials. But one important aspect is communicating to the public that, ultimately, public health is in the interest of the common good. Recall that, as individuals preserve a sense of community, they rely more on communal links, and therefore become less suspicious of each other. If proper political steps are taken, so that citizens strengthen links to each other by universally receiving health care, the levels of paranoia that typically arouse conspiracy theories may be significantly reduced.

The building of a sustainable safety net is also of great importance in this regard. The presence of a safety net would discourage the dispossessed from engaging in conspiracy mongering, because even if they come to feel that they do not have a great say in the direction of society, they at least preserve the satisfaction of being secure in case of extreme hardship.

Nevertheless, as Calomiris (1999) argues, this safety net must be incentive-compatible, so that it remains sustainable. One important feature of this safety net, which specifically pertains to medical conspiracy theories, is universal health care. Most industrialized countries have robust systems of universal healthcare, but the United States lacks one (Lasser et al. 2006). Not coincidentally, it has been empirically established that the perception that Big Pharma is just a business whose sole motivation is profit actually induces conspiracy mongering (Blaskiewicz 2013). A system of universal healthcare would weaken that perception and, in turn, reduce the proliferation of medical conspiracy theories.

John Rawls’ arguments in favor of a welfare state, on the basis of a “veil of ignorance”, are very relevant here (Korobkin 1998). If individuals design a society in which they envision themselves to be in the lowest position, they may be better able to understand what society as a whole needs in order to keep its citizens healthy. Cognitive science has provided a thorough understanding of how imagination is crucial for forming moral opinions (Johnson 1994). One particularly important recommendation in this regard is to appeal, not necessarily to excessive emotions or scare tactics, but at least to plausible imagined scenarios in awareness campaigns, so that people may come to understand why particular policies, such as more expansive healthcare, are needed.

The internet has a big role to play in public health campaigns targeting medical conspiracy theories. Officials have realized that, of all social media, Twitter in particular plays a huge role in the shaping of opinions and the transmission of information related to health issues (Denecke et al. 2013). Twitter has the particular advantage of conveying messages in a limited number of characters. From a cognitive science perspective, this proves very useful, because studies show that shorter messages can have more powerful effects on brain processing (Saharia 2015), especially if they pertain to emotional issues, as medical conspiracy theories tend to do. Consequently, one important ethical implication of the cognitive science of medical conspiracy theories is that, inasmuch as these theories rely on simplistic sound bites, one efficient way of combating misinformation might be to rely on similarly short information that debunks the false narratives in circulation. For this endeavor, Twitter is ideal. One study has shown that misinformation and conspiracy mongering regarding the Zika virus was effectively countered on Twitter (Wood 2018).

Given the power of Twitter, public health officials must also encourage physicians, nurses, and other healthcare workers to use Twitter more proficiently, so as to push back whenever medical conspiracy theories arise. Hospitals may very well organize training sessions in which healthcare workers are taught to synthesize relevant information in the short space that Twitter provides. So far, it is unclear to what extent doctors use Twitter and other social media for medical purposes (Hawn 2009), but as a bulwark against medical conspiracy theories, their use should be more expansive among health professionals.

Yet Twitter, social media, and the internet as a whole also play a big role in the spread of medical conspiracy theories. In that sense, one important aspect of public health campaigns addressing medical conspiracies is the regulation of the internet. It is important to remember that, on top of the natural disposition towards gossiping, the internet has potentiated the effects of rumor. This facilitates the spread of lies, but as Lidsky (2008) explains, the deliberate spread of false information cannot be protected as free speech. Although the internet has been an immensely valuable resource, communication ethicists now understand that more regulation is needed (Weiser 2009), and this is an important aspect of addressing conspiracy mongering, particularly in the healthcare sector. Experts still debate whether the internet has made conspiracy theories more prevalent; for now, there is no definite consensus, and it may still be too early to tell (Wood 2013). Yet the internet is here to stay, and given that reality, public health officials must give more consideration to lobbying lawmakers and politicians to call for greater control of the information that circulates in cyberspace.

Furthermore, patient empowerment is also a useful resource in the eradication of conspiracy theories, for the reasons already discussed. Medical ethics in the past did not place much emphasis on the principle of autonomy, and paternalism was the rule. Things have changed over the last few decades, but physicians need to further ensure that patients retain the power of decision through informed consent. On a concrete level, this implies that public health officials emphasize to health workers the utmost importance of not imposing decisions on patients, and the need to convey all relevant information to them. Yet, we should keep in mind that in the current discussion, ethical imperatives can go beyond informed consent, given the different forms of public engagement. For example, O’Neill (2003) argues that “since the point of consent procedures is to limit deception and coercion, they should be designed to give patients and others control over the amount of information they receive and opportunity to rescind consent already given.”

In this manner, patients will feel that they do have the power to decide over their own bodies, and thus will not easily come to believe the conspiracy theories that are more common among persons who lack the privilege of deciding for themselves.

Physicians and public health officials also need to take a more active political role. This may seem counterintuitive, since doctors who engage in public discourse may easily be perceived to be in alliance with politicians, thus giving rise to all sorts of conspiracy theories. But in fact, by participating more actively in public discourse, physicians and public health officials can take steps to ensure that marginalized populations receive proper healthcare and become better integrated into society. By doing this, powerlessness can again be reduced, and thus one factor fueling medical conspiracy theories can be mitigated.

One concrete way of expanding the political participation of physicians is to encourage the formation of guilds and local chapters of medical associations, in which health workers may gather to discuss not just technical issues, but also how health relates to society. Hospitals also need to encourage community life among their staff (sports tournaments, cultural events), so that staff can form a greater sense of commitment to social issues (and alienation is thus prevented), and consequently come up with more effective ways of approaching policymakers about how best to empower dispossessed communities in their access to healthcare. Furthermore, hospitals can arrange weekly seminar series open to the wider public, in which particular social and political problems related to conspiracy theory claims are discussed (e.g., the price of medications, government healthcare plans, racial representation in particular diseases), and seize the opportunity to hear from attendees who may be sympathetic to medical conspiracy theories and engage them in dialogue.

Finally, recall also that conspiracy theories feed into the evolved “us-versus-them” mentality, along with scapegoating. If doctors are perceived as outsiders, then they are more likely to be the object of conspiracy speculations. Health workers need to ensure that they find common links with their patients. This implies respect for (although not necessarily agreement with) patients’ local cultures and even their ways of understanding disease and medicine (Flores 2000).

Some empirical data suggest that when patients and doctors are of different ethnicities, compliance rates are lower (McQuaid and Landier 2018), although this is not an insurmountable obstacle. Patients may not fully trust doctors of different ethnicities, and that may contribute to conspiracy theories about their procedures. One potential way of dealing with this problem is to ensure that the medical profession is representative of all ethnicities, through a program of affirmative action (Magnus and Mick 2000). Once again, public health officials can take lobbying action to motivate lawmakers to take decisive steps in that direction.

However, affirmative action in medicine can also become very divisive (Sowell 2005), thus deepening the “us-versus-them” mentality that sustains conspiracy theories in the first place. Explicit racial and ethnic preferences can contribute to stereotypes in healthcare services, and ultimately, these stereotypes nourish conspiracy mongering. One possible countermeasure is the development of strategies for the advancement of cosmopolitanism and supraethnic, supraracial, and supranational identities, in order to bridge groups that are suspicious of each other. On a concrete level, this can be done by medical associations endorsing civic messages that call for national unity, broadcast on TV, radio, and other media.

To achieve this purpose, health workers need to strike a balance: engaging with local cultures so as not to appear as outsiders, without becoming so parochial as to undermine the cosmopolitanism that protects against conspiracy mongering. Concretely, this balance can be struck by including more cultural diversity and sensitivity training in hospitals and medical schools, as part of professional development plans.

Conclusion

Recent developments in both the United States and Europe have given rise to post-truth politics, i.e., massive misinformation for pure electoral gain. This in turn has produced a flourishing of conspiracy theories that feed a paranoid style, not only in political activities, but in society as a whole.

Although suspicions regarding medical procedures have always existed, this recent rise in conspiracy mongering has also had important implications for medical information. Some unethical medical procedures have been carried out in the past, and on that basis, new conspiracy theories have arisen.

Although it offers no single, unified account of why people believe in conspiracy theories, the emerging field of cognitive science has offered some guidance in the attempt to understand how these ideas are transmitted, and why they stick. Pattern recognition, powerlessness, and anxiety-induced illusions of control are some of the most important mechanisms underlying the prevalence of conspiracy theories.

This information can better sustain some of the policies that may be designed to counter the spread of medical conspiracy theories. Concrete measures such as the avoidance of scare tactics, improved communication skills, increased Twitter use among doctors, the use of focus groups, greater respect for patients’ autonomy, lobbying for affirmative action, and cultural and diversity training could all, in theory, be useful means of pushing back against the prevalence of medical conspiracy theories. All these measures ultimately connect with the understanding that cognitive science offers of conspiracy theories in general. Unfortunately, given the current political climate of Europe and the United States, medical conspiracy theories are likely either to stay or to morph into new ones. Precisely for that reason, a deeper understanding of why people believe them is necessary (and for this, cognitive science offers a relevant approach), and further consideration of effective policies to counter them is also needed.

References

  1. Abalakina-Paap, M., W. Stephan, T. Craig, and W.L. Gregory. 1999. Beliefs in conspiracies. Political Psychology 20: 637–647.

  2. Acocella, Ivana. 2012. The focus groups in social research: advantages and disadvantages. Quality & Quantity 46 (4): 1125–1136.

  3. Andrade, Gabriel, and Azhar Hussain. 2018. Polio in Pakistan: Political, Sociological, and Epidemiological Factors. Cureus 10 (10): e3502.

  4. Backer, Thomas E., Everett Rogers, and Pradeep Sopory. 1992. Designing health communication campaigns: What works?. Newbury Park, CA: Sage.

  5. Bandura, Albert. 1977. Social Learning Theory. New York: Prentice Hall.

  6. Banks, Nancy. 2010. AIDS, Opium, Diamonds, and Empire: The Deadly Virus of International Greed. New York: I Universe.

  7. Bartlett, J., and C. Miller. 2010. The power of unreason: Conspiracy theories, extremism and counter-terrorism. London, UK: Demos.

  8. Bearman, P. 2010. Just-so stories: Vaccines, autism, and the single-bullet disorder. Social Psychology Quarterly 73 (2): 112–115.

  9. Bebbington, P., O. McBride, C. Steel, E. Kuipers, M. Radovanic, T. Brugha, R. Jenkins, H. Meltzer, and D. Freeman. 2013. The structure of paranoia in the general population. The British Journal of Psychiatry 202: 419–427.

  10. Blackmore, Susan. 1999. The Meme Machine. Oxford: Oxford University Press.

  11. Blaskiewicz, R. 2013. The Big Pharma conspiracy theory. Medical Writing 22 (4): 259–261.

  12. Bode, L., and E. Vraga. 2018. See Something, Say Something: Correction of Global Health Misinformation on Social Media. Health Communication. 33 (9): 1131–1140.

  13. Bogart, L.M., and S.T. Bird. 2003. Exploring the relationship of conspiracy beliefs about HIV/AIDS to sexual behaviors and attitudes among African-American adults. Journal of the National Medical Association 95 (11): 1057.

  14. Boyer, Pascal. 1994. The Naturalness of Religious Ideas. Los Angeles: University of California Press.

  15. Brotherton, Rob. 2013. Suspicious Minds: Why We Believe Conspiracy Theories. New York: Bloomsbury.

  16. Calomiris, C.W. 1999. Building an incentive-compatible safety net. Journal of Banking & Finance 23 (10): 1499–1519.

  17. Carstairs, C., and R. Elder. 2008. Expertise, health, and popular opinion: Debating water fluoridation, 1945–80. Canadian Historical Review 89 (3): 345–371.

  18. Chagnon, Napoleon. 1983. Yanomamo: The Fierce People. New York: Holt, Rinehart and Winston.

  19. Cichocka, A., M. Marchlewska, and A. Golec de Zavala. 2016a. Does self-love or self-hate predict conspiracy beliefs? Narcissism, self-esteem, and the endorsement of conspiracy theories. Social Psychological and Personality Science 7: 157–166.

  20. Cichocka, A., M. Marchlewska, A. Golec de Zavala, and M. Olechowski. 2016b. “They will not control us”: In-group positivity and belief in intergroup conspiracies. British Journal of Psychology 107: 556–576.

  21. Cikara, Mina, Matthew Botvinick, and Susan Fiske. 2011. Us versus Them: Social Identity Shapes Neural Responses to Intergroup Competition and Harm. Psychological Science 22: 3.

  22. Clarke, S. 2007. Conspiracy theories and the Internet: Controlled demolition and arrested development. Episteme 4 (2): 167–180.

  23. Darwin, H., N. Neave, and J. Holmes. 2011. Belief in conspiracy theories: The role of paranormal belief, paranoid ideation and schizotypy. Personality and Individual Differences 50: 1289–1293.

  24. Davis, J., G. Wetherell, and P.J. Henry. 2018. Social devaluation of African Americans and race-related conspiracy theories. European Journal of Social Psychology. https://doi.org/10.1002/ejsp.2531.

  25. Denecke, K., M. Krieck, L. Otrusina, P. Smrz, P. Dolog, W. Nejdl, and E. Velasco. 2013. How to exploit twitter for public health monitoring? Methods of information in medicine 52 (04): 326–339.

    Google Scholar 

  26. Dentith, Matthew. 2014. The Philosophy of Conspiracy Theories. New York: Palgrave.

    Google Scholar 

  27. Dolan, Raymond J., and Patrick Vuilleumier. 2003. Amygdala automaticity in emotional processing. Annals of the New York Academy of Sciences 985 (1): 348–355.

    Google Scholar 

  28. Dube, E., M. Vivion, and N.E. MacDonald. 2015. Vaccine hesitancy, vaccine refusal and the anti-vaccine movement: Influence, impact and implications. Expert Review of Vaccines 14 (1): 99–117.

    Google Scholar 

  29. Dunbar, Robin. 1996. Grooming, Gossip and the Evolution of Language. Salem: Harvard University Press.

    Google Scholar 

  30. Dyer, Clare. 1987. Whooping cough vaccine on trial again. British Medical Journal 295: 1053–1054.

  31. Ebel-Lam, A.P., L.R. Fabrigar, T.K. MacDonald, and S. Jones. 2010. Balancing causes and consequences: The magnitude-matching principle in explanations for complex social events. Basic & Applied Social Psychology 32: 348–359.

  32. Ernst, Edzard. 2019. Alternative Medicine: A Critical Assessment of 150 Modalities. New York: Springer.

  33. Evans-Pritchard, E.E. 1963. Witchcraft, Oracles and Magic Among the Azande. London: Clarendon Press.

  34. Evers, S., V.T. Farewell, and P.F. Halloran. 1988. Public awareness of organ donation. CMAJ Canadian Medical Association Journal 138 (3): 237.

  35. Fenster, Mark. 1999. Conspiracy Theories: Secrecy and Power in American Culture. Minneapolis: University of Minnesota Press.

  36. Festinger, Leon. 1957. A Theory of Cognitive Dissonance. Stanford: Stanford University Press.

  37. Flores, Glenn. 2000. Culture and the patient-physician relationship: Achieving cultural competency in health care. The Journal of Pediatrics 136: 1.

  38. Foster, G.M. 1974. Limited good or limited goods: Observations on Acheson. American Anthropologist 76 (1): 53–57.

  39. Fourie, Pieter, and Melissa Meyer. 2010. The Politics of AIDS Denialism. New York: Routledge.

  40. Freimuth, Vicki S., et al. 2001. African Americans’ views on research and the Tuskegee Syphilis Study. Social Science & Medicine 52 (5): 797–808.

  41. Garcia-Retamero, R., and E.T. Cokely. 2013. Communicating health risks with visual aids. Current Directions in Psychological Science 22 (5): 392–399.

  42. Girard, Rene. 1986. The Scapegoat. Baltimore: Johns Hopkins University Press.

  43. Goertzel, T. 1994. Belief in conspiracy theories. Political Psychology 15: 733–744.

  44. Goldacre, Ben. 2008. Bad Science. London: Harper Collins.

  45. Gottlieb, S.D. 2016. Vaccine resistances reconsidered: Vaccine skeptics and the Jenny McCarthy effect. BioSocieties 11 (2): 152–174.

  46. Gray, K., and D. Wegner. 2010. Blaming God for our pain: Human suffering and the divine mind. Personality and Social Psychology Review 14 (1): 7–16.

  47. Green, R., and K.M. Douglas. 2018a. Anxious attachment and belief in conspiracy theories. Personality and Individual Differences 125: 30–37.

  48. Grimes, David. 2016. On the Viability of Conspiratorial Beliefs. PLoS ONE. https://doi.org/10.1371/journal.pone.0147905.

  49. Guthrie, Stewart. 1995. Faces in the Clouds: A New Theory of Religion. Oxford: Oxford University Press.

  50. Hagen, Kurtis. 2018. Conspiracy theorists and monological belief systems. Argumenta 3: 2.

  51. Hagen, Kurtis. 2018. Conspiracy theories and the paranoid style: Do conspiracy theories posit implausibly vast and evil conspiracies? Social Epistemology 32 (1): 24–40.

  52. Haselton, Martie. 2000. Error management theory: A new perspective on biases in cross-sex mind reading. Journal of Personality and Social Psychology 78 (1): 81–91.

  53. Hawn, C. 2009. Take two aspirin and tweet me in the morning: How Twitter, Facebook, and other social media are reshaping health care. Health Affairs 28 (2): 361–368.

  54. Heider, F., and M. Simmel. 1944. An experimental study of apparent behavior. American Journal of Psychology 57: 243–259.

  55. Heldman, A.B., J. Schindelar, and J.B. Weaver. 2013. Social media engagement and public health communication: Implications for public health organizations being truly “social”. Public Health Reviews 35 (1): 13.

  56. Hellinger, Daniel. 2019. Conspiracy and Conspiracy Theories in the Age of Trump. New York: Palgrave.

  57. Hill, David, Simon Chapman, and Robert Donovan. 1998. The return of scare tactics. Tobacco Control 7 (1): 5–8.

  58. Hoffman, Steven, Yasmeen Mansoor, Navneet Natt, Lathika Sritharan, Julia Belluz, Timothy Caulfield, Yoni Freedhoff, John Lavis, and Arya Sharma. 2017. Celebrities’ impact on health-related knowledge, attitudes, behaviors, and status outcomes: Protocol for a systematic review, meta-analysis, and meta-regression analysis. Systematic Reviews 6: 13.

  59. Hofstadter, R. 2012. The Paranoid Style in American Politics. New York: Vintage.

  60. Icke, David. 2007. The David Icke Guide to the Global Conspiracy (and How to End It). London: David Icke Books.

  61. Imhoff, R., and M. Bruder. 2014. Speaking (un-)truth to power: Conspiracy mentality as a generalized political attitude. European Journal of Personality 28: 25–43.

  62. Johnson, M. 1994. Moral Imagination: Implications of Cognitive Science for Ethics. Chicago: University of Chicago Press.

  63. Jolley, D., and K. Douglas. 2014. The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS ONE 9: e89177.

  64. Kelemen, Deborah. 1999. The scope of teleological thinking in schoolchildren. Cognition 70: 241–272.

  65. Kitzinger, Jenny. 1995. Qualitative research: Introducing focus groups. BMJ 311 (7000): 299–302.

  66. Klayman, J., and Y.-W. Ha. 1987. Confirmation, disconfirmation and information in hypothesis testing. Psychological Review 94: 211–228.

  67. Knight, Peter. 2013. Conspiracy Culture: From the Kennedy Assassination to the X-Files. New York: Routledge.

  68. Korobkin, R. 1998. Determining health care rights from behind a veil of ignorance. U. Ill. L. Rev. 1998 (3): 801–836.

  69. Lasser, K.E., D.U. Himmelstein, and S. Woolhandler. 2006. Access to care, health status, and health disparities in the United States and Canada: Results of a cross-national population-based survey. American Journal of Public Health 96 (7): 1300–1307.

  70. Leshner, G., and I.H. Cheng. 2009. The effects of frame, appeal, and outcome extremity of antismoking messages on cognitive processing. Health Communication 24 (3): 219–227.

  71. Lidsky, L.B. 2008. Where's the Harm: Free Speech and the Regulation of Lies. Wash. & Lee L. Rev. 65: 1091.

  72. Magnus, S., and S. Mick. 2000. Medical schools, affirmative action, and the neglected role of social class. American Journal of Public Health 90 (8): 1197–1201.

  73. Malinowski, Bronislaw. 1992. Magic, Science and Religion. New York: Waveland.

  74. McAndrew, F.T., and M.A. Milenkovic. 2002. Of tabloids and family secrets: The evolutionary psychology of gossip. Journal of Applied Social Psychology 32 (5): 1064–1082.

  75. McCauley, C., and S. Jacques. 1979. The popularity of conspiracy theories of presidential assassination: A Bayesian analysis. Journal of Personality and Social Psychology 37: 637–644.

  76. McDonald, Melissa, Carlos Navarrete, and Mark Van Vugt. 2012. Evolution and the psychology of intergroup conflict: The male warrior hypothesis. Philosophical Transactions of the Royal Society B 367 (1589): 670–679.

  77. McGuire, W. 1984. Public communication as a strategy for inducing health-promoting behavioral change. Preventive Medicine 14: 3.

  78. McNeill, Paul. 1993. The Ethics and Politics of Human Experimentation. Cambridge: Cambridge University Press.

  79. McQuaid, Elizabeth, and Wendy Landier. 2018. Cultural Issues in Medication Adherence: Disparities and Directions. Journal of General Internal Medicine 33 (2): 200–206.

  80. Moreno, J.D. 2013. Undue Risk: Secret State Experiments on Humans. New York: Routledge.

  81. Mueller, M. 1999. Universal service policies as wealth redistribution. Government Information Quarterly 16 (4): 353–358.

  82. Nelson, Jessica C., Glenn Adams, Nyla R. Branscombe, and Michael Schmitt. 2010. The role of historical knowledge in perception of race-based conspiracies. Race and Social Problems 2 (2): 69–80.

  83. Neuberg, Steven, Douglas Kenrick, and Mark Schaller. 2010. Human Threat Management Systems: Self-Protection and Disease Avoidance. Neuroscience & Biobehavioral Reviews 35 (4): 1042–1051.

  84. Norenzayan, Ara, Scott Atran, Jason Faulkner, and Mark Schaller. 2006. Memory and Mystery: The Cultural Selection of Minimally Counterintuitive Narratives. Cognitive Science 30: 531–553.

  85. Nyhan, Brendan, and Jason Reifler. 2010. When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior 32: 2.

  86. Nyhan, Brendan, et al. 2014. Effective messages in vaccine promotion: A randomized trial. Pediatrics 133 (4): e835–e842.

  87. Öhman, A., A. Flykt, and F. Esteves. 2001. Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General 130: 466–478.

  88. Oliver, Eric, and Thomas Wood. 2014. Medical Conspiracy Theories and Health Behaviors in the United States. JAMA Internal Medicine 174 (5): 817–818.

  89. O'Neill, O. 2003. Some limits of informed consent. Journal of Medical Ethics 29 (1): 4–7.

  90. Ottersen, O.P., and P.J. Helm. 2002. How hardwired is the brain? Nature 420 (6917): 751–752.

  91. Parker-Pope, T. 2009. Bill Maher vs. the Flu Vaccine. Well Blog, New York Times.

  92. Pasek, J., G. Sood, and J.A. Krosnick. 2015. Misinformed about the Affordable Care Act? Leveraging certainty to assess the prevalence of misperceptions. Journal of Communication 65 (4): 660–673.

  93. Piaget, Jean, and Barbara Inhelder. 2008. The Psychology of the Child. New York: Basic.

  94. Quadagno, J. 2014. Right-wing conspiracy? Socialist plot? The origins of the Patient Protection and Affordable Care Act. Journal of Health Politics, Policy and Law 39 (1): 35–56.

  95. Riem, Madelon M.E., et al. 2012. Attachment in the brain: Adult attachment representations predict amygdala and behavioral responses to infant crying. Attachment & Human Development 14 (6): 533–551.

  96. Roisman, Joseph. 2006. The Rhetoric of Conspiracy in Ancient Athens. Los Angeles: University of California Press.

  97. Rosnow, R.L. 1991. Inside rumor: A personal journey. American Psychologist 46: 484–496.

  98. Rosset, Evelyn. 2008. It's no accident: Our bias for intentional explanations. Cognition 108 (3): 771–780.

  99. Rozenblit, Leonid, and Frank Keil. 2002. The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science 26 (5): 521–562.

  100. Rozin, Paul, and April Fallon. 1987. A perspective on disgust. Psychological Review 94 (1): 23–41.

  101. Saharia, N. 2015. Detecting emotion from short messages on Nepal earthquake. In 2015 International Conference on Speech Technology and Human-Computer Dialogue (SpeD), 1–5. IEEE.

  102. Sharfstein, J.M. 2017. Vaccines and the Trump administration. JAMA 317 (13): 1305–1306.

  103. Silva, B.C., F. Vegetti, and L. Littvay. 2017. The elite is up to something: Exploring the relationship between populism and belief in conspiracy theories. Swiss Political Science Review 23: 423–443.

  104. Slone, Jason. 2004. Theological Incorrectness: Why Religious People Believe What They Shouldn't. Oxford: Oxford University Press.

  105. Sowell, Thomas. 2005. Affirmative Action Around the World. New Haven: Yale University Press.

  106. Sperber, Dan. 2000. Introduction. In Metarepresentations: A Multidisciplinary Perspective, ed. Dan Sperber. Oxford: Oxford University Press.

  107. Swami, V., M. Voracek, S. Stieger, U.S. Tran, and A. Furnham. 2014. Analytic thinking reduces belief in conspiracy theories. Cognition 133: 572–585.

  108. Swami, V., and R. Coles. 2010. The truth is out there: Belief in conspiracy theories. The Psychologist 23 (7): 560–563.

  109. Thorburn, S., and L. Bogart. 2005. Conspiracy beliefs about birth control: Barriers to pregnancy prevention among African Americans of reproductive age. Health Education & Behavior 32 (4): 474–487.

  110. Tooby, J., and L. Cosmides. 2015. Conceptual foundations of evolutionary psychology. In Handbook of Evolutionary Psychology, ed. D. Buss. London: Wiley.

  111. Uscinski, Joseph, and Joseph Parent. 2014. American Conspiracy Theories. Oxford: Oxford University Press.

  112. Van Prooijen, J.-W., and P.A.M. Van Lange. 2014. The social dimension of belief in conspiracy theories. In Power, Politics, and Paranoia: Why People Are Suspicious of Their Leaders, ed. J.-W. van Prooijen and P.A.M. van Lange. Cambridge: Cambridge University Press.

  113. Van Prooijen, J.-W., K. Douglas, and C. De Inocencio. 2018. Connecting the dots: Illusory pattern perception predicts beliefs in conspiracies and the supernatural. European Journal of Social Psychology 48: 320–335.

  114. Van Prooijen, J.-W., A.P.M. Krouwel, and T. Pollet. 2015. Political extremism predicts belief in conspiracy theories. Social Psychological and Personality Science 6: 570–578. https://doi.org/10.1177/1948550614567356.

  115. Van Prooijen, J.-W. 2016. Why Education Predicts Decreased Belief in Conspiracy Theories. Applied Cognitive Psychology. https://doi.org/10.1002/acp.3301.

  116. Von Rueden, C., and M. van Vugt. 2015. Leadership in small-scale societies: Some implications for theory, research, and practice. The Leadership Quarterly 26: 978–990.

  117. Webb, Gary. 2019. Dark Alliance: The CIA, the Contras, and the Cocaine Explosion. New York: Seven Stories Press.

  118. Weiser, P.J. 2009. The future of Internet regulation. UC Davis L. Rev. 43: 529.

  119. Weitlaner, D., A. Guettinger, and M. Kohlbacher. 2013. Intuitive comprehensibility of process models. In International Conference on Subject-Oriented Business Process Management, 52–71. Berlin, Heidelberg: Springer.

  120. West, H.G., and T. Sanders. 2003. Transparency and Conspiracy: Ethnographies of Suspicion in the New World Order. Durham, NC: Duke University Press.

  121. Wood, M. 2013. Has the internet been good for conspiracy theorising? PsyPAG Quarterly 88: 31–34.

  122. Wood, M.J. 2018. Propagating and debunking conspiracy theories on Twitter during the 2015–2016 Zika virus outbreak. Cyberpsychology, Behavior, and Social Networking 21 (8): 485–490.

  123. Wood, Michael, Karen Douglas, and Robbie Sutton. 2011. Dead and Alive: Contradictory Conspiracy Theories. Social Psychological and Personality Science 00: 1–7.

  124. Wood, Thomas, and Ethan Porter. 2019. The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence. Political Behavior 41: 1.

  125. Wrangham, R.W. 1999. Evolution of coalitionary killing. Yearbook of Physical Anthropology 42: 1–30.

Author information

Correspondence to Gabriel Andrade.

About this article

Cite this article

Andrade, G. Medical conspiracy theories: cognitive science and implications for ethics. Med Health Care and Philos 23, 505–518 (2020). https://doi.org/10.1007/s11019-020-09951-6

Keywords

  • Medical conspiracy theories
  • Cognitive science
  • Ethics
  • Human brain
  • Public policy