1 Introduction

Conspiracy theorists believe that powerful agents are conspiring to achieve their nefarious aims and also to orchestrate a cover-up. People who suffer from impostor syndrome believe that they are not talented enough for the professional positions they find themselves in, and that they risk being revealed as inadequate. These are rough characterisations of two very different states of mind. The conspiracy theorist looks outwards to the world, whilst the sufferer from impostor syndrome looks inwards. A given conspiracy theory may have many adherents, whilst each sufferer from impostor syndrome is concerned with a different individual, herself. And conspiracy theorising can promote group cohesion and enhance self-regard, whilst impostor syndrome is typically isolating and stressful.

Nevertheless, there are also intriguing parallels between the two phenomena, insofar as they involve distinctive patterns of trust and distrust, distinctive interactions between the content of a belief and the evidence which tells for or against that belief, and, perhaps, distinctive epistemic practices. For example, both conspiracy theorists and sufferers from impostor syndrome think that they have special insight into what’s really going on, insight which most others lack; in contrast, ‘neutral’ observers typically regard such thinking as irrational or confused. Moreover both are primed to explain away new evidence seemingly contrary to their views. After all, conspiracy theorists are not perturbed by the fact that superficial appearances continue to reveal no conspiracy: to them, this is just more evidence of an effective cover-up. Likewise, sufferers from impostor syndrome are not moved by the fact that superficial appearances continue to suggest that they are talented high achievers: to them, this is just more evidence of their intellectual fraudulence.

The purpose of this paper is to explore these parallels, in light of philosophical and empirical research on the two separate phenomena, but without minimising the significant differences between them. The task is worth undertaking for reasons which go beyond any intrinsic interest it may have. First, so far as I can discover, the separate bodies of research have not previously been in dialogue, so cross-comparison may suggest further fruitful lines of empirical investigation. Second, the comparisons enable us as philosophers to think more carefully about a range of normative issues, for example about the (ir)rationality of conspiracy theorising and of impostor syndrome, about the types of social-epistemic environment which may encourage or discourage either phenomenon, and about the distinctive difficulties encountered when trying to ‘convert’ people away from either conspiracy theorising or impostor syndrome.

I will focus on the doxastic and epistemic aspects of both conspiracy theorising and impostor syndrome. That is, I will regard them primarily as patterns of belief, or more broadly as patterns of belief formation, maintenance, and updating in the light of available evidence and justifications. This is of course a simplification, most obviously for the case of impostor syndrome: we cannot fully understand impostor syndrome without understanding its relationship to behaviour, to anxiety, and to affective states more generally. After all, someone who believes she is not as talented as others take her to be, but who is quite happy with this situation, would not normally be classified as suffering from impostor syndrome. Conversely, there may be sufferers from impostor syndrome who do in fact believe that they are talented, but find themselves unable to relax into the affective attitudes and behaviour appropriate to that belief. I hope to explore these aspects of impostor syndrome in other work, but will stick to the doxastic for now.

In what follows, I first discuss conspiracy theories, then discuss impostor syndrome, before exploring the parallels between them.

2 Conspiracy theories

In 1994, sociologist Ted Goertzel wrote:

There has been no published information about the prevalence of belief in any of these conspiracies [concerning e.g. the assassination of JFK, or the claim that AIDS is a government-induced epidemic]. Nor has anyone addressed the question of to what extent belief in conspiracies is a generalized ideological trait, that is, how likely are people who believe in one conspiracy to believe in others. Nor has there been any previous attempt to discover the psychological or sociological correlates of belief in conspiracies. (Goertzel 1994: 732)

A quarter-century later things have changed: Goertzel’s article has classic status in a burgeoning empirical literature on these topics. In the same period, alongside developments in psychology and political science, a small but active group of philosophers has explored conceptual and normative issues surrounding conspiracy theories. (Dentith and Keeley (forthcoming) reviews recent literature, whilst Coady (2006) collects some important earlier papers.) Across all of these disciplines and beyond, discussion of conspiracy theories has come to seem horribly urgent during the past few years.

Within philosophy, one central question has been how best to define the term ‘conspiracy theory’, and in particular whether to assign it a neutral or a pejorative meaning. All parties agree that to qualify as a conspiracy theory, a claim or belief must have a distinctive content. Approximately speaking, this is that some significant phenomenon can best be explained by positing a conspiracy, a group of people working together for nefarious ends who have successfully covered their tracks by generating a false ‘received wisdom’.

If we define ‘conspiracy theory’ just in these terms, then it is an open question whether many or most conspiracy theories are false, and whether many or most are believed on insufficient evidence. To answer these questions, we might examine conspiracy theories individually and try to establish their various credentials. Or we might investigate a larger class of conspiracy theories, trying to understand whether it is in general likely that powerful individuals can and do control major events whilst subverting public perceptions of social reality; the plausibility of this claim will vary between different historical periods, and between different societies today.

The alternative theoretical option is to define ‘conspiracy theory’ such that conspiracy theories have that distinctive content, but in addition are by definition false, or, perhaps, by definition unjustified. On this picture, conspiracy theories are always pathologies of reason. So when we investigate the epistemic merits of some theory about a group of people working together secretly, we thereby investigate whether it is a conspiracy theory or whether instead it is a hypothesis worth taking seriously. This makes sense of the way in which ‘conspiracy theory’ is used as a pejorative in everyday discussion: few of us regard our own beliefs as conspiracy theories. A theoretically fruitful definition need not mirror ordinary usage closely. Nevertheless, the further we move from ordinary usage, the harder it is to engage with the political and practical debates which ultimately motivate philosophical work in this area.

Another issue in philosophical debate has been whether to focus on epistemic assessment of conspiracy beliefs, or instead to focus on the broader patterns of thinking associated with conspiracy theories. For example, Cassam (2016) argues that we need the notion of intellectual vice in order to understand what is problematic about conspiracy theorising. A given conspiracy belief may be justified in the light of the evidence actually possessed by its subject. Nevertheless we can criticise the believer’s shoddy epistemic practices, for example over-emphasising fringe sources and failing to seek out a wider range of reliable voices. (Sunstein and Vermeule (2009) make a related point.) Cassam argues, with reference to psychological research in this area, that conspiracy theorists tend to exemplify vices of gullibility, selective cynicism, and prejudice.

These intellectual vices do seem to be associated with conspiracy thinking, at least in the paradigmatically bad cases. But they cannot fully explain why people adopt conspiracy theories, since they make no reference to the specific content of conspiracy theories (indeed, Cassam does not suggest that they are the only explanatory factor). Gullibility, selective cynicism and prejudice can lead us astray in a much broader range of beliefs. For example, these vices may lead us to think of politicians as ‘only looking out for themselves’ even when the evidence does not support this; such a belief does not amount to a conspiracy theory. Likewise, these vices may lead us to believe that homeopathy is effective, again without endorsing a conspiracy theory.

Broadly speaking, academic research beyond philosophy has tended to focus on the question of why people believe in conspiracy theories, looking for answers in individual psychology or societal conditions, according to discipline. Such research presupposes that belief in conspiracy theories requires some kind of special explanation, that the phenomenon is both significant and prima facie puzzling. In turn, this seems to presuppose that belief in conspiracy theories is typically irrational. After all, when people form beliefs in ways which we endorse as rational, we usually don’t seek causal explanations in terms of motivated reasoning. [The analogous approach is criticised by advocates of the ‘strong program’ in the sociology of scientific knowledge: their symmetry thesis demands that we seek the same type of explanations for others’ beliefs, whether or not we endorse them as rational (Barnes and Bloor 1982).]

A review of recent psychological research into conspiracy theorising distinguishes three types of motives for belief in such theories (Douglas et al. 2017): epistemic, existential, and social. ‘Epistemic’ motives include desires to preserve one’s existing beliefs, and to avoid feelings of uncertainty especially in the face of large-scale complex phenomena. Conspiracy theories offer a ready catch-all explanation in terms of intentional human agency, an explanation which minimises arbitrariness and loose ends. ‘Existential’ motives arise from people’s need to feel safe and in control, e.g. by seeing themselves as perceptive and not easily fooled. The conspiracy theory says that there is a cover-up which has effectively deceived most people, but that believers in the theory are the exception to that rule. ‘Social’ motives include a desire to belong, and to think highly of one’s own in-group relative to others. Conspiracy theorists often strongly identify with others who share their beliefs, and again the theory spells out the way in which believers are more perceptive on this matter than are ordinary sheeple.

This body of research on the motives for conspiracy thinking gives a central role to the distinctive content of conspiracy theories, explaining their attractiveness in terms of the picture they paint both of the world at large and of the believer and his or her social group. Interestingly, the review authors suggest that although there is good evidence that conspiracy thinking is motivated by these features of conspiracy theories, it is much less clear whether belief in conspiracy theories actually does make believers feel more certain, safer, or higher in status, for example: ‘In this sense, conspiracy theories might be seen as an ironic or self-defeating manifestation of motivated social cognition’ (Douglas et al. 2017: 540).

The content of conspiracy theories is significant not just for explaining why such theories are adopted in the first place, but also because it underpins a range of epistemic practices associated with conspiracy theorists, practices which are more specific than vices such as gullibility. It is not unusual for us to seek out and value evidence which fits with our pre-existing beliefs. But when we believe in conspiracy theories, the content of what is believed has direct consequences for how we should handle evidence. Belief in a conspiracy theory is self-perpetuating in several respects. Someone who believes a conspiracy theory believes that powerful agents have conspired to keep the truth on this matter hidden. If this is the case, then readily available evidence is likely to be misleading, and it will take investigation and special insight to find undistorted evidence of the facts.

Thus believers in conspiracy theories avoid reliance on official sources of information, since they expect such sources to be misleading. Moreover they have a ready-made explanation for any purported counter-evidence which may come to their attention, for example when non-believers try to talk them round: of course it looks as if the conspiracy theory is false, that’s just what they want you to think! Finally, conspiracy theorists seek and prize anomalies which seem to conflict with the official version of events (Keeley (1999) refers to ‘errant data’ in this context). Conspiracy theories typically concern large-scale and complex social phenomena, and even a highly accurate, explanatorily rich official story will inevitably leave some loose ends. The non-conspiracy attitude is to regard those as marginal peculiarities: ‘I guess it’s just a coincidence’. For the conspiracy theorist, those loose ends are overwhelmingly important, since they provide rare opportunities for us to see through the cover-up to the real facts below.

One of many reasons why conspiracy theorising is an intriguing topic for epistemologists is that such patterns of thought are not inherently and inevitably unreasonable. It is indeed true that if there is a successful cover-up, then this will be difficult to detect, and occasional anomalies will have heightened epistemic significance. If official sources of information have been co-opted, knowingly or unknowingly, to assist with covering up the truth, then they are not reliable sources of information. There is an internal coherence between the content of conspiracy beliefs and the epistemic practices which they reinforce, at least once the belief is adopted in the first place. This coherence is part of what makes it difficult to simply reason people into rejecting their own conspiracy theory beliefs.
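This coherence can be made vivid with a schematic Bayesian illustration; the framing and the numbers here are mine, chosen only for concreteness, and nothing in the argument depends on them. Let $C$ be the hypothesis that there is a conspiracy with an effective cover-up, and let $E$ be the observation that official sources report nothing untoward. Someone who takes an effective cover-up to make such reports almost inevitable will assign $P(E \mid C)$ close to $P(E \mid \neg C)$, so that

\[
\frac{P(C \mid E)}{P(\neg C \mid E)} \;=\; \frac{P(E \mid C)}{P(E \mid \neg C)} \times \frac{P(C)}{P(\neg C)} \;\approx\; \frac{0.95}{0.98} \times \frac{P(C)}{P(\neg C)}.
\]

The posterior odds remain close to the prior odds: by the believer’s own lights, reassuring official reports are almost uninformative. By contrast, an anomaly which the official story leaves unexplained may receive a likelihood ratio well above one (say, five, on similarly illustrative assignments), and so does shift the odds, towards the conspiracy. The sketch involves no probabilistic incoherence; if anything has gone wrong, it has gone wrong in the likelihoods themselves, that is, in the believer’s prior convictions about how a covered-up world would look.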

There is much more to say about conspiracy theories on this front, and much has been said by philosophers in this area. But my key aim is to make the comparison with impostor syndrome, so it is time to move on. To summarise so far: conspiracy theorising involves a distinctive pattern of distrust in standard sources of information which other people rely upon, such as mainstream news media. It also involves a heightened degree of trust in one’s own capacities to see through the façade and understand what’s really going on; this is reflected in psychological research about the motives for conspiracy theorising. It may also involve an exaggerated degree of trust in certain non-standard sources of information, for example websites run by other conspiracy theorists, and in the evidential weight of residual anomalies.

3 What is impostor syndrome?

Roughly speaking, impostor syndrome is a condition suffered by people who have external markers of success, such as high grades and professional accolades, who nevertheless believe themselves to be inadequate. The term ‘impostor syndrome’ is used very widely in non-academic literature, but psychologists studying this topic prefer the term ‘impostor phenomenon’; in my discussion I will use the terms interchangeably. Psychologists Suzanne Imes and Pauline Rose Clance identified the phenomenon in their clinical work with professionally successful women, publishing a first article in 1978. Clance published a book aimed at sufferers, The Impostor Phenomenon (1985), and there is now a huge amount of popular writing on the topic; Valerie Young’s The Secret Thoughts of Successful Women (2011) is an impressive example of the genre. Neither impostor syndrome nor impostor phenomenon features as an identified disorder in the American Psychiatric Association’s Diagnostic and Statistical Manual. But psychologists have generated extensive research into its prevalence and correlates, reviewed for example by Sakulku and Alexander (2011) and by Calvard (2018).

Like conspiracy theories, impostor beliefs have a characteristic content: that the believer isn’t really talented enough for the professional position or accolades she finds herself with, and, typically, that others have not yet noticed her lack of talent. (Some sufferers from impostor syndrome believe that others ‘know’ of their incompetence and are merely pretending otherwise for political or compassionate reasons.) Impostor syndrome involves more than such beliefs: a comprehensive account would also include its affective and behavioural aspects. Moreover impostor syndrome may often be characterised by mere lack of belief in one’s own talent, as opposed to an outright belief that one lacks talent: mere doubt on this front can certainly be damaging. But for simplicity in this paper, I will ignore these subtleties, and discuss impostor syndrome as founded on the belief that one is inadequate for one’s position.

Impostor syndrome has not been a focus of philosophical enquiry, although there are interesting informal discussions of impostor syndrome as suffered by academic philosophers (e.g. Olberding 2018). Following the lead of philosophical writers on conspiracy theories, however, we can formulate a definitional question: should the term ‘impostor syndrome’ be reserved for impostor beliefs which are false and/or unjustified, or should we offer a neutral account of ‘impostor syndrome’ in terms of the content of such beliefs, leaving it open whether impostor syndrome is sometimes, often, or always irrational? My own view (Hawley 2019) is that the beliefs associated with impostor syndrome are often justified, even for people who have external markers of success: many of us grapple with very mixed feedback which can make it perfectly reasonable to doubt our own capabilities.

In any case, I will use the term ‘impostor thinking’ as a shorthand for the type of beliefs and thought processes characteristic of the sufferer from impostor syndrome, without prejudice as to whether such beliefs are false or unjustified, and I will use ‘impostor thinker’ as a term for someone who habitually thinks this way.

As with conspiracy theorising, the main thrust of empirical research on impostor thinking is to understand the environmental and psychological factors which seem to promote it. For example, impostor thinking may be associated with perfectionism or neuroticism, and with certain types of parental attitudes towards childhood achievements (Sakulku and Alexander 2011). As with conspiracy theories, the content of impostor beliefs plays a crucial role in such explanations. It seems that impostor thinkers are drawn to the belief that they are professionally inadequate, not primarily because of its source or its evidential base, but because of its content: for example, because it coheres with their own perfectionism, or with their ideas about talent, e.g. that talented people succeed through effortless superiority not through hard work.

I suggested earlier that conspiracy theorising is associated with a distinctive approach both to evidence-gathering and to the updating of beliefs in light of evidence. First, conspiracy theorists do not accept the reliability of standard or official sources of information about the events they are concerned with, since it is an element of the theory that such sources are knowingly or unknowingly reporting the cover-up story, rather than the real truth. Second, conspiracy theories provide resources for explaining away seeming counter-evidence: in the eyes of the conspiracy theorist, that’s just more indication of how widespread and successful the cover-up operation has been. Third, conspiracy theorists seize upon errant data or apparent anomalies, i.e. events and information which, while they may be logically consistent with the official story, seem nevertheless somewhat strange, or unexplained, in the light of that story.

We can recognise these three features in impostor thinking too. By definition, those who think this way do so in the face of at least some positive feedback about their own performance and talents. Impostor thinkers have passed exams, have been offered professional jobs or promotions, have been invited to publish or exhibit their creative work, and so on. Yet they do not believe that they have the underlying talent or capability normally associated with these external credentials. To make sense of this, impostor thinkers must regard external markers of success as untrustworthy guides to underlying talent in their own case; depending on the individual and her situation, this may be rationalised by thinking of others as poor or inattentive judges of talent, or by thinking of them as dishonestly willing to praise what they do not genuinely regard as good work.

So, just as conspiracy theorists reject standard or official sources as unreliable, impostor thinkers reject the standard external sources of information about their talents as unreliable guides to the truth. This is not just an incidental aspect of impostor thinking, but is built in from the start. Impostor thinkers know that they have high grades, for example—they are not oblivious to this evidence—but they nevertheless attribute such successes to their being likeable, or hard-working, or lucky, in ways which others ‘mistake’ for talent.

Second, impostor thinkers are not impressed by new evidence of their talents, as they get the next prize or promotion, or pass another exam. They have already established to their own satisfaction that such events are not a genuine guide to talent, at least in their own case, and so more of the same is unpersuasive. Oddly, whilst impostor thinkers regard themselves as inadequate in their professional roles, they simultaneously regard themselves as unusually perceptive about exactly this inadequacy, just as conspiracy theorists regard themselves as unusually perceptive about how the world really works. Thus impostor thinkers differ from those who are hesitant to believe in their own talents on the basis of early successes, but who are gradually led to self-belief through the accumulation of more such evidence. For impostor thinkers, further accolades may instead deepen the associated anxiety, by increasing the potential penalty when the supposed ‘incompetence’ is finally unmasked.

Third, as with conspiracy theorists, impostor thinkers may seize upon and prize ‘anomalies’, such as occasional failures or negative feedback. Someone not in the grip of impostor thinking will either disregard these as meaningless glitches, or else see in them opportunities to refine their talents and improve still further. But the impostor thinker will see these failures as revelatory of her true lack of talent: on such occasions the mask slips, confirming her suspicions about her true inadequacy.
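A Bayesian illustration parallel to the one given for conspiracy theories above, again with numbers chosen purely for the sake of the example rather than drawn from the empirical literature, shows how these last two features combine. Let $I$ be the hypothesis that one is genuinely inadequate for one’s role. If each new accolade $S$ is discounted, so that $P(S \mid I)/P(S \mid \neg I) \approx 1$ (accolades are expected either way, since they are taken to track likeability, effort, or luck rather than talent), whilst each setback $F$ is treated as revealing, say $P(F \mid I)/P(F \mid \neg I) \approx 3$, then a year containing ten accolades and two setbacks multiplies the odds on $I$ by roughly $1^{10} \times 3^{2} = 9$. Sheer accumulation of positive feedback cannot outweigh the occasional ‘anomaly’, because the relative weighting of the two kinds of evidence is fixed in advance by the content of the impostor belief itself.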

So, in summary, impostor thinking involves a distinctive pattern of distrust in certain standard sources of information, trust in one’s own capacities to perceive one’s own inadequacies, and exaggerated trust in external sources of negative feedback. Once adopted, impostor beliefs are hard to shake off, because they dictate a self-reinforcing attitude to encounters with new evidence.

4 Reflections

Conspiracy theorists and impostor thinkers have distinctive beliefs about contingent empirical facts, and these beliefs have consequences for their epistemic practices. Conspiracy theorists reject standard sources of information about the events which interest them, and have exaggerated trust both in their own perceptiveness, and in anomalous pieces of evidence. Impostor thinkers reject standard sources of information about their own talents, and have exaggerated trust both in their own capacities to judge their talents, and in anomalous pieces of evidence. This is a neat parallel, and it should be evident that this kind of pattern—substantive belief with self-reinforcing consequences for epistemic practice—will be found in other domains too. For example, perhaps creationism paired with the idea that God has created the fossil ‘record’ to test our faith fits this pattern. Perhaps the same is true of the beliefs of the victim of ‘gaslighting’, who believes that although it appears she is being badly treated by her partner, things only seem that way because she herself is over-sensitive (Abramson 2014).

What can we learn from such parallels? I do not suggest that there are substantive underlying psychological mechanisms in common between these various phenomena: I have found no reason to suppose that impostor thinking encourages conspiracy theorising, or vice versa. Indeed I have emphasised the way in which the substantive content of conspiracy beliefs or impostor beliefs is central to their adoption and maintenance; this substantive content is of course different in each case. Nevertheless, I do think that testing the extent and limits of these comparisons can prompt creative thinking both about impostor syndrome and about conspiracy theorising, partly by defamiliarising each phenomenon through abstracting away from the usual questions.

For example, I have already suggested that the philosophical debate about neutral versus pejorative definitions of ‘conspiracy theory’ should have a counterpart in debate about the definition of ‘impostor syndrome’, and about whether sufferers from that syndrome are by definition thinking irrationally. Likewise, Cassam’s discussion of the intellectual vices associated with conspiracy theorising should help us think more carefully about whether, when impostor syndrome does embody flawed thinking, this is best understood in terms of lack of justification for impostor beliefs, or in terms of intellectual vices of the impostor thinker.

Shifting focus, we can consider whether there is scope for cross-fertilisation of empirical research programmes. Recall that recent psychological research into conspiracy theorising can be framed as a search for ‘epistemic’, ‘existential’ and ‘social’ motives for what are regarded as otherwise-puzzling beliefs (Douglas et al. 2017). That is, conspiracy beliefs are seen as responses to various needs in the subject. Prima facie it might seem peculiar to regard sufferers from impostor syndrome as motivated to adopt self-critical beliefs, since those very beliefs cause them distress. But there are some suggestive connections.

‘Epistemic’ motives include a desire to make sense of one’s experiences: if the impostor thinker mistakenly believes that genuinely talented people never struggle to succeed, yet notices that things don’t always come easily for her, then she may be drawn to the explanation that she is not genuinely talented, but has somehow fooled others into thinking she is. ‘Existential’ motives include a desire for control: perhaps we can see impostor thinkers as reluctant to make their self-image hostage to the whims of other people’s judgements and praise. Conspiracy thinking seems more tempting when people lack control over aspects of their lives; perhaps impostor thinking is also tempting when we lack control. ‘Social’ motives include group identity: being amongst a minority in one’s professional context may be a trigger for impostor thinking.

Comparisons between conspiracy theorising and impostor thinking may also help us understand why it is so difficult to change these patterns of thought through dialogue. Each phenomenon involves doubting the reliability of standard sources of information, so there is little prospect of changing someone’s mind by providing more evidence from those same old types of source. It is difficult to find evidential common ground in either case, to agree a neutral pool of evidence from which to draw conclusions. For example, in speaking to someone prone to impostor thinking, it’s unlikely to be useful to draw her attention to her qualifications and achievements, since she is already primed to reject evidence of that type, regarding it as systematically misleading. Likewise, in speaking to someone prone to conspiracy theorising, it’s unlikely to be useful to draw her attention to news reports in the mainstream media or to the testimony of conventional experts, since she is already primed to reject all this.

If we understand both conspiracy theorising and impostor thinking in terms of trust and distrust, it is not surprising that ‘just more evidence’ is often not enough to shift a person’s perspective in these domains. One strategy would be to reduce our emphasis on ‘rational’ persuasion, and instead to look at the emotional and social circumstances which make these kinds of thinking attractive, or at least better than the alternatives.

But whether or not we are seeking to change others’ minds, in both contexts it is useful to consider the social-epistemic environment. What sort of conditions may encourage impostor thinking or conspiracy theorising? First, such thinking will be attractive where standard information sources seem to be mutually co-ordinated, or to be motivated by goals other than conveying the truth. Co-ordination amongst sources means that apparently ‘new’ information does not offer any independent reason to doubt either the conspiracy theory or the impostor belief. Meanwhile, if conspiracy theorists or impostor thinkers believe that sources of ‘information’ have some ulterior purpose, then they will not find them epistemically persuasive. Conspiracy theorists, of course, are likely to assume a negative ulterior purpose. Impostor thinkers may believe that people around them are trying to be kind or helpful, by not offering them the harsh criticism they really deserve; or they may believe that others are restrained by political correctness or fear of giving offence.

Second, an environment in which what’s said is not reflected in what’s done will likely encourage conspiracy theorising or impostor thinking, depending on the context. We form our beliefs on the basis of a large complex set of evidence, including what people and institutions do, as well as what they say. For an impostor thinker, if warm words about her performance or achievement are not made concrete in action, for example by invitations, promotions or reliance, it will be easy to dismiss them as just words. Likewise, it’s no wonder if conspiracy theorists are encouraged in their beliefs when governments or politicians profess to have citizens’ best interests at heart, yet do not seem to behave as if this were true.

Third, an environment where distrust in the official story is rewarded may promote conspiracy or impostor thinking. Conspiracy theorists seem to benefit from their attitudes, via enhanced self-regard, identification with a group, or a sense of control; or at least they may hope for such benefits even if these are often not realised (Douglas et al. 2017). For impostor thinkers, there may be social rewards for modesty, including perhaps continued or re-emphasised praise from others, and penalties for ‘boasting’ or self-assurance, in ways which are affected by perceived gender and race (Leonhardt et al. 2017).

Fourth, an environment in which anomalies or disinformation are not challenged, or not given proper contextualisation, will again feed a tendency to conspiracy or impostor thinking. In the context of conspiracy theorising, this brings us to difficult questions about the role of the media, education, and free speech. But for impostor thinking, the point is that it may make a significant difference how all of us frame and respond to negative comments, unfair criticism, or indeed fair criticism. Impostor thinkers are prone to giving undue weight to negative feedback, even when this is set in a context of generally positive feedback. My idea is not that negative feedback should be eliminated, but that it may matter more than we realise how such feedback is presented, since it cannot easily be balanced by sheer quantity of accompanying positive feedback.

Finally, the parallels between conspiracy theorising and impostor thinking should lead us to re-examine why the default mainstream response to the former is critical or dismissive, whilst to the latter it is more sympathetic. One difference is that impostor thinkers seem to suffer from their beliefs in a way which conspiracy theorists often do not. But this cannot be the whole story. Conspiracy theorists are not always made happy by their beliefs, and in general we do not resent people who are made happy by beliefs we do not share.

More likely, conspiracy thinking is perceived to be socially more dangerous than impostor thinking. For example, it seems to be easily spread, and to undermine trust in institutions and mainstream media in ways which are difficult to recover from. In particular instances, it is of course right and natural to feel outraged by conspiracy theorists who accuse grieving families of being ‘crisis actors’. (Sunstein and Vermeule (2009) portray conspiracy theorists as menaces to society.) But impostor thinking is also a damaging social phenomenon, insofar as it prevents people from enjoying their professional lives and achievements. Indeed, it is a problem for all of us if people hold themselves back from opportunities to contribute to collective endeavours, especially where such reluctance is disproportionately found in groups that are in any case socially disadvantaged.

Conspiracy theories and impostor syndrome are not the same thing. But I hope to have illustrated that each separately is a worthy topic for philosophical investigation, and that one way of pursuing such investigations is to explore the structural similarities between these phenomena.