1 Engaging with Conspiracy Believers

Conspiracy theories are beliefs that two or more actors have coordinated in secret to achieve an outcome and that their conspiracy is of public interest but not public knowledge (Douglas and Sutton 2023). Such theories were rife during the COVID-19 pandemic, ranging from the belief that the virus was a government hoax designed to control the population, to the idea that it was a Chinese bioweapon designed to wage war on the West. Conspiracy theories tend to be prominent in times of crisis when people are trying to cope with difficult circumstances (Marchlewska et al. 2022; van Prooijen and Douglas 2017). However, they are not only a modern phenomenon. Previous decades have seen conspiracy theories surrounding the death of Princess Diana, the September 11th attacks on New York in 2001, the assassination of President John F. Kennedy, and the Apollo moon landings, and there is evidence of conspiracy theories dating back to antiquity (Pagán 2020).

Scholars across several academic disciplines have made significant progress in understanding why conspiracy theories appeal to millions of people across the world. In addition to political, historical and cultural factors that have shaped conspiracy beliefs (see Douglas et al. 2019 for a review), examinations of the psychological factors associated with conspiracy beliefs have grown rapidly in recent years. Specifically, researchers have studied how conspiracy beliefs are linked to paranoid ideation (e.g., Greenburgh and Raihani 2022; Pierre 2023), and how they appeal to people with unmet psychological needs (Biddlestone et al. 2021; Douglas et al. 2017). However, only limited research so far has focused on the issue of engaging with conspiracy believers, and specifically when and how to do so. In this article, we offer some perspectives on these less well-understood questions about conspiracy beliefs. We outline when it is important to engage with conspiracy believers after briefly considering the consequences of conspiracy theories for individuals, groups and communities. We then review research on how to engage with conspiracy believers in interpersonal contexts, and in large-scale efforts focusing on addressing communities’ susceptibility to conspiracy theories and misinformation more generally.

2 When Do We Need to Engage with Conspiracy Believers?

A crucial question to ask before engaging with conspiracy believers is whether their beliefs pose specific dangers to themselves, other people, or to broader communities. Some conspiracy theories seem to pose little or no harm to anyone at face value. For example, if a person agrees with the idea that Elvis Presley faked his death, this is unlikely to cause any negative consequences for other people. However, growing evidence suggests that many conspiracy theories, such as those related to vaccines and climate change, can be harmful for individuals, groups, and for broader communities (Douglas and Sutton 2023). In such cases, engaging with believers may be crucial to prevent harms and to maintain the epistemic health of societies.

Although it is not straightforward to differentiate between potentially harmful and harmless conspiracy beliefs, it is important to note that conspiracy beliefs about different topics do tend to correlate with each other, with relationships ranging from weak to moderate (e.g., Goertzel 1994; Wood et al. 2012). Furthermore, psychometric scales used to measure belief in conspiracy theories—which typically ask people to rate their agreement with a set of statements on a scale from “strongly disagree” to “strongly agree”—are internally reliable, suggesting that they capture a single underlying construct (Douglas and Sutton 2023). Therefore, approaches designed to reduce conspiracy beliefs in general are likely to be appropriate and effective for dealing with harmful conspiracy theories. In the following sections, we briefly outline the dangers of conspiracy theories that have been studied in the literature thus far.
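For readers unfamiliar with internal reliability, the sketch below illustrates how it is typically quantified with Cronbach’s alpha. The data are entirely hypothetical Likert-scale ratings invented for this example (not drawn from any study cited here): when respondents answer a scale’s statements in a consistent way, alpha approaches 1, consistent with the items tapping a single underlying construct.

```python
# Illustrative only: hypothetical 1-5 Likert ratings ("strongly disagree" to
# "strongly agree") from six respondents to four conspiracy-belief statements.

def cronbach_alpha(items):
    """items: one inner list of respondent scores per scale statement."""
    k = len(items)                      # number of statements in the scale
    n = len(items[0])                   # number of respondents

    def variance(xs):                   # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(variance(item) for item in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Rows = statements, columns = respondents (hypothetical data).
ratings = [
    [1, 2, 4, 5, 3, 2],
    [2, 2, 5, 4, 3, 1],
    [1, 3, 4, 5, 2, 2],
    [2, 1, 5, 5, 3, 2],
]

alpha = cronbach_alpha(ratings)
print(round(alpha, 2))  # → 0.95; each respondent rates all items similarly
```

Because each (hypothetical) respondent endorses all four statements to a similar degree, alpha is high; shuffling each row independently would break that consistency and drive alpha toward zero.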

3 Dangers to Individuals

Conspiracy beliefs have long been linked with paranoia (e.g., Imhoff and Lamberty 2018), and indeed there are several similarities between the two constructs, such as the idea that malicious groups are responsible for harmful outcomes (e.g., Freeman and Bentall 2017; van der Tempel and Alcock 2015) and the perception that people are coordinated in their malicious intent (Douglas et al. 2019; Raihani and Bell 2019). Paranoia and conspiracy beliefs also share various psychological correlates, such as biases in reasoning (Brotherton and French 2014; Swami et al. 2014) and pattern perception (van Prooijen et al. 2018). As with paranoia, there are concomitants of conspiracy beliefs that indicate harm to those who hold them. For example, conspiracy believers are more likely to be subjected to victimisation, social isolation, and stigmatisation, and tend to experience other poor life conditions such as poverty and low social status (e.g., Alsuhibani et al. 2022; Freeman and Bentall 2017; Lantian et al. 2018). Conspiracy believers also tend to be lonelier, more anxious, and more depressed, and report more anomalous experiences (Baum et al. 2023; Freeman and Bentall 2017).

It is important to stress that conspiracy belief is not by itself a psychopathology, and that there is no reason to think that most people who believe in conspiracy theories present with clinical conditions. Nonetheless, recent meta-analytic evidence suggests that scores on measures of psychopathologies including schizotypy, paranoia, and psychoticism are among the strongest correlates of conspiracy belief (Bowes et al. 2023). Conspiracy beliefs are not only associated statistically with psychopathology but may combine with it to produce some of their most consequential outcomes. For example, Baum and colleagues (2023) found that among people scoring high in conspiracy beliefs, depression was associated with support for the violence that occurred during the January 6th attack on the US Capitol building. These findings may be explained by the link between depression and reduced behavioural inhibition (e.g., Swann et al. 2008). Those who believe in certain conspiracy theories may experience anger and outrage and be less able to inhibit any violent behavioural tendencies that arise from these feelings.

There is also evidence that people who are already drawn to conspiracy theories tend to be vulnerable to other conspiracy theories (Uscinski et al. 2016) and are more likely to expose themselves to conspiracy content online, leading to polarised attitudes (Bessi et al. 2015; Del Vicario et al. 2016). Conspiracy theories also appear to influence people without their knowledge (Douglas and Sutton 2008). Furthermore, there is growing concern that conspiracy beliefs can damage people’s interpersonal relationships with close others who have fallen down the “rabbit hole” (Sutton and Douglas 2022; Toribio-Flórez et al. 2023). Many people reach out for advice about relationships that are being torn apart by conspiracy theories (Reddit n.d.). Finally, longitudinal research suggests that people who turn to conspiracist explanations, arguably to alleviate their own feelings of threat, anxiety, and uncertainty, may only feel worse and turn to conspiracy theories even more, creating a vicious cycle in which conspiracy beliefs further aggravate and intensify negative feelings (Liekefett et al. 2021). Although much of this research is cross-sectional, making it difficult to infer cause and effect, it does suggest that conspiracy believers are more likely than non-believers to have negative life experiences.

4 Dangers to Groups

Growing evidence also suggests that conspiracy beliefs have negative consequences for intergroup relations. For example, conspiracy theories about Jewish people are associated with prejudice towards this group (Bilewicz et al. 2013; Golec de Zavala and Cichocka 2012; Kofta et al. 2020) and other groups who are not part of the alleged conspiracy (Jolley et al. 2020). Groups with little or no power such as feminists and immigrants are often targeted by conspiracy theories (Nera et al. 2021), suggesting that conspiracy theories are associated with distorted perceptions of groups and what they are capable of achieving. Conspiracy theories may therefore generate and reinforce negative perceptions of groups and tensions between them. In fact, despite the long-documented effects of intergroup contact on reducing prejudice (e.g., Paluck et al. 2019), evidence indicates that intergroup conspiracy beliefs cannot be easily reduced with simple contact alone (Bilewicz 2007). While conspiracy beliefs are not the only cause of intergroup conflict (e.g., dehumanisation also plays an important role; Kteily and Bruneau 2017), they do appear to pose unique dangers and challenges to reconciliation efforts. For example, conspiracy theories tend to focus their accusations on groups rather than on systemic forces that may offer opportunities for meaningful long-term change, thus diverting efforts to resolve intergroup conflict away from these systemic issues (e.g., Jolley et al. 2018).

Conspiracy beliefs also appear to have the potential to damage relations within groups. Specifically, they are associated with an inflated sense of the importance of one’s own group, known as collective narcissism (Cichocka et al. 2016; Sternisko et al. 2023). This inflated sense of importance—accompanied by feelings that the group is not appreciated enough by others—predicts a readiness to conspire against one’s own group such as supporting secret surveillance activities against citizens of one’s own country, and conspiring against one’s co-workers (Biddlestone et al. 2022a). It therefore appears that conspiracy beliefs can lead to suspicion and potentially damaging behaviour toward people with whom individuals share a sense of identity and who would normally be a source of trust and security.

5 Dangers to Communities

Researchers studying the consequences of conspiracy theories have largely focused on their dangers to broader communities. Some research suggests that people who are exposed to—or believe in—conspiracy theories are less likely to engage in mainstream politics such as voting and supporting political parties (e.g., Jolley and Douglas 2014a; Uscinski and Parent 2014), more likely to lose faith and trust in politicians (Green et al. 2023), and more likely to lose trust in mainstream political systems generally (Einstein and Glick 2015). Conspiracy believers’ distrust in the institutional system can further erode their perception of social norms and ultimately translate into non-normative behaviour (Pummerer 2022). Thus, research has demonstrated that conspiracy beliefs are associated with a greater willingness to engage in small-scale criminal behaviours (Jolley et al. 2019), vandalism (Jolley and Paterson 2020), and non-normative political action (Mari et al. 2017). Conspiracy beliefs are also associated with extremism (i.e., holding political or religious attitudes that are far outside the societal mainstream) and radicalisation (i.e., the process of adopting increasingly extremist attitudes and behaviours) (Imhoff et al. 2022; Rottweiler and Gill 2020).

Health-related consequences of conspiracy beliefs include more pronounced anti-vaccine beliefs and higher levels of vaccine hesitancy (e.g., Romer and Jamieson 2020), and avoidance of birth control (Thorburn and Bogart 2005). Other conspiracy theories that can change the attitudes and behaviours of communities include claims that climate change is a hoax and that climate scientists are fraudulent, which are associated with lower commitment to climate change initiatives (e.g., Uscinski et al. 2017; see Biddlestone et al. 2022b, for a meta-analysis of belief in climate change conspiracy theories), and conspiracy theories about the supposed dangers of genetically modified foods (Rutjens et al. 2018). Alongside other science-related conspiracy theories (see also Bierwiaczonek et al. 2022), these examples have the potential to contribute to a climate of suspicion and skepticism about science and scientific experts.

Again, it is important to note that conspiracy beliefs may not necessarily cause these negative outcomes. Many of the above findings about the consequences for individuals, groups, and communities are from cross-sectional studies examining relationships between variables, and it is therefore often not possible to infer cause and effect. Some experimental evidence suggests that negative consequences of conspiracy theories (e.g., political disengagement and vaccine hesitancy) arise because conspiracy theories create feelings of powerlessness and disillusionment (Jolley and Douglas 2014a, b). There is therefore some evidence indicating why conspiracy theories can be dangerous and consequential. Overall, there is evidence to suggest that conspiracy believers experience a wide variety of negative outcomes to a greater extent than non-believers, and that there are negative outcomes for groups and communities. There are therefore good reasons to engage with conspiracy believers to try to understand their beliefs, understand the consequences of those beliefs, and address those beliefs when they are harmful.

6 How Do We Engage with Conspiracy Believers?

In the following section, we consider how it is possible to engage with conspiracy believers. Given the potential dangers of conspiracy beliefs, most of the points we will cover are oriented toward addressing those beliefs. We discuss strategies that are likely to be effective at an interpersonal level, as well as strategies that have proven effective in larger-scale interventions amongst groups and communities.

7 Interpersonal Communication

Researchers studying the psychology of conspiracy theories are often asked how it is possible to interact with close others who believe in conspiracy theories. One notable resource for those dealing with loved ones who believe in conspiracy theories is the online Reddit community known as QAnonCasualties (Reddit n.d.), in which people share their experiences and seek advice on how to engage with their close others, and in particular how to address their belief in the American far-right QAnon conspiracy theory. The first thing people often do when engaging with conspiracy believers at an interpersonal level is attempt to debunk their claims with factual, authoritative information. We will talk more about debunking on a larger scale in the next section. However, empirical research on how to engage with conspiracy believers at an interpersonal level, and on which strategies do and do not work, is very limited. Nevertheless, it is possible to draw upon psychological research to consider which strategies are likely to be effective in these situations. Here, we build on the steps set out by Jolley et al. (2023a) outlining how to talk to people about their beliefs in conspiracy theories.

7.1 Being Open-minded and Non-confrontational

Research suggests that belief in conspiracy theories may be linked to extremism and radicalisation processes through shared psychological factors (Lee 2020). While there is limited research on interventions targeting belief in conspiracy theories at the interpersonal level, tentative work exists on strategies to address radicalisation. Therefore, it could be useful to turn to research on deradicalisation to understand how to engage with people who believe in conspiracy theories. Drawing on the experiences of experts and first-line practitioners in this area, being empathic, open-minded, and non-confrontational appears to be essential when engaging with radicalised individuals (Ponsot et al. 2018). A key observation from this research is the importance of building a trusting relationship. Therefore, when engaging with someone who believes in conspiracy theories, it is likely to be helpful to start by asking questions and forging a connection. For instance, asking “when did you first start believing in [conspiracy theory]?” and “how does it make you feel to believe this?” can foster an atmosphere of trust and pave the way for a meaningful conversation. This is also likely to enhance perceived credibility and increase the acceptance of misinformation corrections (see Walter and Tukachinsky 2020). On the other hand, being confrontational or hostile from the outset can discourage people from considering new ideas. Whilst this approach may be more helpful for engaging with conspiracy believers who are not close relationship partners, it is still a good starting point for engaging with close others.

7.2 Being Receptive

In a similar vein, conversational receptiveness fosters empathy, which can bridge gaps between opposing views (Yeomans et al. 2020). For example, in a conversation with a person who is talking about their conspiracy beliefs, one could say: “I’m listening; tell me more”. Recent research on conversational receptiveness with vaccine-hesitant people suggests that this strategy could be effective in addressing conspiracy beliefs. In five pre-registered studies, Minson et al. (2023) found that vaccine-supportive people trained in conversational receptiveness were perceived as more reasonable and trustworthy when responding to vaccine-hesitant people’s messages about vaccines, compared to those who were asked to respond to these messages without such training (they were simply asked to be as persuasive as they could). Furthermore, vaccine-hesitant people were more willing to talk about vaccines with someone trained in conversational receptiveness (Minson et al. 2023). Receptiveness fosters a sense of personal integrity and adequacy, which are important components of self-affirmation that are known to neutralise the interpersonal effects of ostracism on conspiracy beliefs (see Poon et al. 2020). Receptiveness also seems to help build mutual understanding between people with opposing beliefs and promotes an environment for productive dialogue.

7.3 Affirming the Value of Critical Thinking

Many conspiracy believers feel that they are critical free thinkers and defenders of the truth (Lantian et al. 2021). In a conversation with a person who is talking about conspiracy theories, it is possible to turn this idea back to them by asking them to think critically about the source of the information and to elaborate on the details of the conspiracy theory itself (e.g., “It’s important that we think about the evidence. We need to work out where it’s coming from and weigh it all up”). One could also talk to them about the various cognitive biases that are associated with conspiracy beliefs and how to avoid them. Meta-analyses have shown that belief corrections are most effective when they are less direct and appeal more to a person’s natural inclination to remain epistemically coherent (Walter and Murphy 2018). This approach mirrors larger-scale interventions (e.g., Biddlestone et al. 2023) that address the thinking styles that make people vulnerable to misinformation and conspiracy narratives in the first place (we will discuss these strategies later in the article). Attention to, and comprehension of, these messages are likely to be improved through the use of humour (Vraga et al. 2019) and of narrative storytelling as demonstrative examples (Biddlestone et al. 2023) in interpersonal contexts.

7.4 Boosting a Sense of Control, Security, and Self-esteem

One of the key reasons why people turn to conspiracy theories is because important psychological needs are threatened. People feel threatened by events and circumstances that make them feel powerless, uncertain, and unsafe, and turn to conspiracy theories in an attempt to feel better (Douglas et al. 2017). In a conversation with someone who is sharing conspiracy theories, it would be useful to highlight these fundamental reasons why people turn to conspiracy theories. In particular, opening up a conversation about how existential concerns and anxieties can make people more vulnerable to false beliefs and maladaptive responses to stress would likely give them a basic understanding of why they might find themselves endorsing conspiracy ideas (e.g., Jackson et al. 2017; Marchlewska et al. 2022). It would also be helpful to acknowledge and attempt to address a conspiracy believer’s frustrated psychological needs. Boosting their sense of worth and wellbeing may mean that they feel less inclined to turn to conspiracy theories to attempt to cope with difficult situations.

7.5 Knowing When to Back Off

The above four techniques relate back to the psychological needs that draw people to conspiracy theories in the first place (e.g., reassuring people if they feel uncertain, making them feel more in control when they feel powerless or insecure, and helping them make social connections if they feel isolated). On the one hand, this can be straightforward when discussing how the world may not be such a one-dimensional hellscape. On the other hand, it can be more difficult and sensitive to explain how the answers to genuine societal corruption and immorality are not as simple as identifying one small malevolent group responsible for these problems. Exposing this complexity could cause even stronger feelings of powerlessness and disillusionment and in turn strengthen conspiracy beliefs. Indeed, pushing too strongly against people’s beliefs can lead to psychological reactance, which is the perception that one’s freedom to make choices or express one’s beliefs is threatened (Steindl et al. 2015). For instance, during the COVID-19 pandemic it was suggested that psychological reactance increased conspiracy belief and played a role in people’s non-compliance with virus-mitigating behaviours (see Adiewna et al. 2020). We stress that these recommendations are intended for interactions in which persuasion and competitive debate are discouraged, and in which empathic understanding and acceptance of disagreement are preferred. Intellectual humility is vital on both sides.

There may be other potential downsides to engaging with conspiracy believers on an interpersonal level. For example, research has demonstrated that exposure to conspiracy theories can sometimes make people more receptive to them (Mulligan and Habel 2013), even without people being aware that their attitudes have changed (Douglas and Sutton 2008). This raises two important issues. First, talking to a person about their conspiracy beliefs and frequently repeating the conspiracy theories may only reinforce the person’s beliefs. That is, the very act of trying to convince a person to change their mind may only serve to strengthen their resolve. Second, having this type of conversation raises the possibility that the person trying to engage with the conspiracy believer is themselves influenced by the conspiracy theory, or that they end up spreading it unintentionally and potentially influencing others.

There are other concerns about engaging with conspiracy believers in interpersonal contexts. Specifically, evidence suggests that attempting to change people’s political beliefs can sometimes result in stronger entrenchment of those beliefs, dubbed the “backfire effect” (Nyhan and Reifler 2010; but see Wood and Porter 2019 for evidence that concerns over this are exaggerated). Furthermore, work on the “continued influence effect”—in which belief in false information persists despite it being categorically refuted (e.g., Ecker et al. 2010)—suggests that people may continue to believe conspiracy theories even when they are aware, to some extent, that they are factually incorrect. In some cases, therefore, people may be unresponsive to any attempt to engage with them, and trying to do so would be a waste of time and energy (Cassam 2019). Finally, there is the possibility that the person engaging with the conspiracy believer will come across as insincere, which may further alienate the believer and potentially strengthen their beliefs (Fantl 2018).

Taken together, there are several strategies that people can adopt to attempt to change the minds of conspiracy believers. Nevertheless, these strategies are not without their pitfalls, and empirical research is still needed to test their success. Research on larger-scale interventions against conspiracy theories, however, has a more solid scientific basis. We now turn to these strategies, which aim to reduce susceptibility to conspiracy beliefs and misinformation at the level of broader communities.

8 Large-scale Interventions

Much of our knowledge on how to engage with conspiracy theories at the community level comes from research on misinformation more broadly, which is defined as deceptive or misleading information—relative to the best available evidence—that is disseminated publicly, but may not necessarily be intended to mislead (Southwell et al. 2022). Misinformation often involves conspiracy theories but also includes outdated news that has since been found to be incorrect, half-truths, rumors, and “fake news” that is spread deliberately to achieve specific political objectives (also known as disinformation; e.g., Swire et al. 2017).

8.1 Inoculation and “Pre-bunking”

Like conspiracy theories, misinformation tends to be “sticky” and resistant to corrections (Johnson and Seifert 1994; Lewandowsky et al. 2012), and some efforts to address misinformation have therefore attempted to prevent it from “sticking” in the first place. This process is known as “pre-bunking” and borrows from research on attitudinal inoculation (e.g., McGuire 1964; see Banas and Rains 2010 for a meta-analysis of research on inoculation theory). By exposing people to a weak “dose” of the false information and making them aware of the manipulative argument techniques or precarious thought patterns involved in misinformation susceptibility, the logic is that people can develop psychological “immunity” against the misinformation (e.g., Traberg et al. 2023). In turn, this should make them better at resisting epistemic traps or manipulative persuasion attempts when they encounter them in the future (see also Compton et al. 2021).

There are many examples of the use of technique-based inoculation (i.e., refuting manipulative argument strategies) to confer resistance against misinformation in several domains, including climate change by pre-bunking the use of fake experts (e.g., Cook et al. 2017; see also Maertens et al. 2020 and van der Linden et al. 2017), vaccination intentions by pre-bunking anti-vaccine conspiracy theories (e.g., Jolley and Douglas 2017), and political misinformation by pre-bunking the use of polarising arguments (e.g., Roozenbeek and van der Linden 2020). While the long-term “immunity” that could be produced by this technique is still somewhat unclear (see Maertens et al. 2021), meta-analytic evidence indicates that it produces a moderate improvement in the detection of false information in general, and even leads to a slight improvement in the detection of reliable information (Lu et al. 2023). Furthermore, Mason et al. (2023) demonstrated that the efficacy of inoculation is similar among participants with high and low prior conspiracy beliefs.

More recently, scholars have designed creative ways to administer inoculation in different contexts. For example, researchers have developed online games that put players in the position of someone spreading disinformation and using various tactics to do so (Basol et al. 2020, 2021; Neylan et al. 2023; Roozenbeek and van der Linden 2019). Others have developed comic strips pre-bunking climate change misinformation (Cook et al. 2022), as well as online quizzes that improve players’ abilities to spot inauthentic social media “trolls” spreading misinformation (Lees et al. 2023). Finally, recent focus has turned to inoculation based on fostering logical skills as a method to confer resistance against the thinking styles that make people vulnerable to misinformation in the first place. For example, Biddlestone and colleagues (2023) used narrative storytelling to reduce belief in conspiracy theories by pre-bunking the conjunction fallacy—the tendency to incorrectly judge the co-occurrence of two events as more likely than either event alone. As a result, researchers have a trove of successful pre-bunking techniques to choose from, and the goal is to make them scalable in the real world.

8.2 Accuracy Nudges and Fact-checking

Other strategies have focused their efforts on scalability from the start. For example, researchers have developed accuracy nudges that prompt social media users, before they share posts, to consider the accuracy of what they are sharing (Pennycook et al. 2020). Meta-analytic evidence suggests that the efficacy of this technique in reducing the sharing of misinformation is small but significant (Pennycook and Rand 2022), and scholars have even tailored this approach for use by social media platforms (see Mosleh et al. 2022). Based on this evidence, researchers suggest that misinformation susceptibility can be explained by a lack of engagement with logical, reflective thinking processes rather than by biased interpretation of information (Pennycook and Rand 2019). Others argue that this process is more complex, however, and that identity concerns drive information seeking and perceptions of source credibility when people engage in reflective processes (e.g., Van Bavel et al. 2024).

More straightforward interventions have also investigated the efficacy of fact-checking—an easily scalable technique on social media platforms. Meta-analyses confirm the modest but significant improvement that debunking misinformation can produce (Chan et al. 2017; Walter and Murphy 2018; Walter et al. 2019). Recent work has even demonstrated that crowdsourced fact-checks on Twitter (now rebranded as “X”) are similarly effective to the efforts of fact-checking experts in refuting false information online (e.g., Saeed et al. 2022).

8.3 Issues with Large-scale Interventions

Aside from the reduced efficacy of implementing these techniques in the real world compared to laboratory settings (e.g., Roozenbeek et al. 2022), there are important caveats to addressing people’s strongly held convictions. Regarding the debunking of conspiracy theories and misinformation, science-relevant misinformation appears more difficult to correct than misinformation in general (Chan and Albarracín 2023), and even when corrections are effective, the influence of misinformation on one’s beliefs still lingers significantly (Walter and Tukachinsky 2020). Regarding accuracy nudges, researchers have raised concerns over the replicability of their overall efficacy (Roozenbeek et al. 2021), as well as the asymmetrical efficacy they can have depending on the political ideology of participants (Rathje et al. 2022). Finally, while inoculation appears to be reliably effective at improving the detection of false information, some scholars argue that the evidence indicates less promising improvement in the detection of real information, depending on the analyses conducted (e.g., Modirrousta-Galian and Higham 2023).

There are also new interventions that have not yet been tested on larger groups of people. For example, Jolley and colleagues (2023b) recently found that contact between groups may reduce belief in intergroup conspiracy theories, and that communication of social norms emphasising how others intend to vaccinate may reduce anti-vaccine conspiracy beliefs and improve vaccine uptake (see Cookson et al. 2021; see also Pummerer 2022; Pummerer et al. 2022; Winter et al. 2021). This is promising given existing evidence that interventions can be more effective when combined. Cook and colleagues (2017) showed that communicating scientific consensus alongside inoculation messages improved the intervention’s effect through a sort of “norm-enhancement”, and theoretical perspectives from the inoculation framework have been adopted to improve the efficacy of media literacy intervention techniques (e.g., Zhang et al. 2022). It should also be noted that the success of interventions could depend on a variety of factors, such as the nature of the sample, the measures of conspiracy beliefs used to test the success of the intervention, and the nature of the conspiracy theories being intervened upon (Stasielowicz 2024).

9 Future Research

Much work still needs to be done to understand when and how to engage with conspiracy believers, especially at the interpersonal level. In this article, we have offered insights from psychological research regarding (1) the interpersonal concerns that need to be considered when discussing conspiracy theories with believers, and (2) the success of large-scale strategies to reduce susceptibility to misinformation, including conspiracy theories. We hope that this review can pave the way for effective engagement techniques developed with sensitive interpersonal contexts in mind. For example, tailoring logic-based pre-bunking messages so that they are effective among individuals experiencing social exclusion or mental health concerns could be a promising avenue for future research on engagement strategies. Such a strategy could include a forewarning that psychotic delusions or social exclusion can lead vulnerable individuals to seek comforting answers to their difficult circumstances (e.g., Poon et al. 2020), and that awareness of this vulnerability may reduce one’s risk of endorsing false and harmful information. We hope that this review will also open up new avenues of research to assist people who are desperate for advice to help them understand and maintain their relationships with close others who have fallen down the “rabbit hole” of conspiracy theories (e.g., Sutton and Douglas 2022; Toribio-Flórez et al. 2023).

It is important to note that insights from large-scale interventions can inform engagement strategies at the interpersonal level. For example, individuals and communities could promote a culture of pre-bunking, in which the strategies used to manipulate people in political and social life become a common part of everyday conversation and vocabulary. Online competitions between friends playing the many available inoculation games could make these educational strategies more enjoyable for younger people. Finally, proactive discussions around important health and political topics could set the stage for communicating constructive norms that make people more accepting of pro-social behaviours such as vaccination.

We also hope that understanding more about how to remain sensitive while addressing conspiracy beliefs at an interpersonal level could inform large-scale, community-level ways to engage with conspiracy believers. For example, scholars could adjust their pre-bunking and debunking messages so that the messages build trust and affirm beliefs the receiver already holds, while also incorporating specifically curated material to persuade the believer otherwise. Highlighting the importance of critical thinking (Jolley et al. 2023a), affirming key psychological needs (Biddlestone et al. 2021; Douglas et al. 2017), and emphasising the prospective benefits to both the self and others (Cakanlar et al. 2022) could also be incorporated into community-level strategies to engage with conspiracy believers. Furthermore, recent research suggests that person-specific debunks delivered via AI chatbots can effectively reduce conspiracy beliefs, even amongst strong believers (Costello et al. 2024). Further developments in AI technology could therefore be leveraged to broaden interventions against conspiracy theories.

In future research, it will also be important to consider that some conspiracy theories could have positive consequences, such as providing a sense of community and belonging for people with marginal or stigmatised views (Franks et al. 2017; Nera et al. 2022), or inspiring people to push for social change (Mari et al. 2017; Wagner-Egger et al. 2022). Interventions oriented toward reducing the harmful consequences of conspiracy beliefs need to weigh carefully the risk of diminishing such positive outcomes. Researchers will need to think more generally about the balance between the benefits and pitfalls of engaging with conspiracy believers.

Researchers will also need to reflect carefully on the ethics of interventions oriented toward changing people’s personal beliefs. We argue that the societal consequences of conspiracy beliefs, such as reduced vaccine uptake and lack of climate engagement, warrant such interventions. Furthermore, we assert that personal pleas from relatives of QAnon believers, both for social support and for assistance in supporting the mental health of their loved ones, also warrant intervention. However, it will always remain important for researchers to be able to justify intervening in people’s personal beliefs.

10 Conclusions

In this article, we have outlined some evidence-based strategies to help people engage with conspiracy believers interpersonally. We have also outlined ongoing and successful attempts to engage with conspiracy believers on a larger scale. The question of when to engage with conspiracy theories remains an important one, however. We have argued that it is important to engage when individuals’ conspiracy beliefs are potentially harmful to themselves or others. However, it is not always straightforward to distinguish a harmful belief from a harmless one, and indeed some conspiracy beliefs could have benefits. More research on the properties of conspiracy theories themselves is needed to better understand their consequences (Douglas and Sutton 2023). Although these issues remain to be resolved in the growing literature on the psychology of conspiracy theories, we hope that the review presented here will assist ongoing investigations in this important area of enquiry.