Abstract
An epistemic agent A is a false epistemic authority for others if they falsely believe A to be in a position to help them accomplish their epistemic ends. A major divide exists between what I call epistemic quacks, who falsely believe themselves to be relevantly competent, and epistemic charlatans, i.e., false authorities who believe or even know that they are incompetent. Neither type of false authority covers what Lackey (2021) calls predatory experts: experts who systematically misuse their social-epistemic status as a cover for predatory behavior. Qua experts, predatory experts are competent and thus could (and maybe sometimes do) help their clients. But should we count them as genuine epistemic authorities? No. I argue that they are false epistemic authorities because in addition to their practical and moral misconduct, such experts systematically deceive their clients, thereby thwarting the clients’ epistemic ends.
1 Introduction
The world is full of error, evil, and confusion. Much of it originates with false epistemic authorities, i.e., agents who fail to be competent or trustworthy (or both), yet manage to recruit — sometimes significant numbers of — credulous devotees. Often, this has bitter consequences. For example, in 2021 an Austrian politician from the right-wing populist party FPÖ advised anti-vaccine activists to consume certain medical products containing Ivermectin as a Covid-19 antidote. Ivermectin is primarily used to deworm livestock or, in humans, to treat ectoparasites such as scabies. Nevertheless, people stormed the pharmacies, media reported serious poisoning, and for several weeks, horses could not be dewormed with Ivermectin compounds since these were sold out.Footnote 1 (My hunch is that enough might have been in stock had the anti-vaxxers who consumed the products recalculated the dosage and not taken quantities recommended for horses.) People put their faith in misguided politicians without medical expertise who fantasize about bizarre forms of medicationFootnote 2; sect leaders manage to drive their disciples into mass suicides or cajole them into committing brutal homicidesFootnote 3; conspiracy theorists set out to liberate the world from fictitious villains, like Don Quixote, who fought (and lost) a battle against windmills. Many other examples could be mentioned where confused or evil demagogues, gurus, populists, or other manipulative agents manage to trash people’s minds with humbug, bullshit,Footnote 4 or dangerous illusions. How come?
People frequently pick out and listen to the wrong epistemic authorities. Epistemic authority is a form of authority that we ascribe to agents or sources that enjoy, relative to us, an advanced epistemic position concerning relevant epistemic goods. Adopting a distinction from Dormandy (2020), we may divide these goods into “representational” and “recognitional” ones. Representational epistemic goods include true belief, justified belief, knowledge, understanding, and maybe also goods such as engaging in the right kind of inquiry (cf. Grasswick, 2020). We can call them “representational” because they constitute or produce cognitive states that represent, or aim at representing, the world as it is (Dormandy, 2020:13). When we pursue these goods, we can profit from the fact that genuine epistemic authorities have more of them than we do. If they share them with us, this can help us make progress in our own epistemic endeavors.
Recognitional epistemic goods, in contrast, are goods that givers of information — for short, speakers — receive from their interlocutors. These include being recognized as a knower, as someone with an advanced level of understanding, and so on. These are additional epistemic goods since, obviously, the mere fact that a speaker enjoys an advanced epistemic position relative to their hearer does not entail that the latter will realize this and treat the speaker accordingly.
I call epistemic authorities with both the relevant representational and recognitional goods recognized epistemic authorities. The problem is that all too often people fail to recognize their true or genuine epistemic authorities. They misidentify those whom they should believe and mistake false authorities for genuine ones. There may be various sociological or psychological reasons for this. However, in what follows, I will not discuss the psychology of false authority but instead develop analyses and taxonomies that illuminate the topic from a social–epistemological point of view.Footnote 5 I begin by arguing for what I call a social-role account of genuine epistemic authority according to which epistemic authorities are in a position to help their clients accomplish their epistemic ends (section 2). I then distinguish putative authorities of various kinds who fail to meet the requirements of genuine epistemic authority and constitute what I call “false (epistemic) authorities.” Among such false authorities, I discuss unintended false authorities — people who do not pretend to be in an advanced epistemic position — and fake authorities, i.e., people who, in spite of not fulfilling the desiderata of genuine authority, wish to be treated as being in a superior epistemic position. Fake authorities subdivide into what I call epistemic quacks, i.e., agents who falsely believe themselves to be competent, and epistemic charlatans, people who do not believe themselves to be competent or even know that they are incompetent (section 3).
On closer inspection, it turns out that this initial picture is still incomplete. Consider what Jennifer Lackey has dubbed predatory experts, i.e., “experts who use their epistemic authority as a cover for predatory behavior within the domain of their expertise” (Lackey, 2021:144). Lackey’s chief example is the case of the convicted serial child molester Larry Nassar, who served as a long-term team doctor of the US women’s national gymnastics team, and in this role (and elsewhere) sexually assaulted hundreds of girls and young women. Could such experts constitute genuine epistemic authorities? Lackey’s characterization suggests a positive answer. This view is supported by the fact that, concerning the relevant subject matters, people such as Nassar are in an advanced epistemic position relative to their clients. Nassar is a trained physician, after all; his patients were not. However, I argue that, nevertheless, predatory experts fail to constitute genuine epistemic authorities since they systematically deceive their clients, thereby thwarting the latter’s epistemic goals (section 4). In order to accommodate this observation, we need to supplement the competence-oriented account of epistemic authority so far developed with a performance constraint. I argue for a reliability condition to the effect that genuine epistemic authorities not only have the epistemic abilities to help their clients, but that these abilities also reliably manifest themselves in the authority’s social-epistemic behavior (section 5). A short conclusion summarizes my results and outlines questions for further research (section 6).
2 A Social-Role Account of Epistemic Authority
2.1 Epistemic Authority: First Steps
A natural way to start exploring false epistemic authority is to begin with true, or genuine, epistemic authority. This notion has sparked some controversy.Footnote 6 Some authors propose a purely epistemic or “subjective” definition. For example, in a classic study, C.S. Lewis claims that “[b]elieving things on authority only means believing them because you have been told them by someone you think trustworthy” (2009:62, my emphasis). More recently, Sofia Bokros, focusing on credences, maintains that “A is an epistemic authority for S with respect to p iff S judges A to have a higher expected accuracy with respect to p than S takes herself to have independently of following A’s authority” (2021:7, my emphasis).Footnote 7 However, that someone is thought or judged to be in an epistemically advanced position does not entail that they actually have that status. Relatedly, such definitions do not require epistemic authority to be an asymmetric relation. If it depends solely on the agent’s belief whether or not a given source is an epistemic authority for them, A could be S’s authority, while S is A’s authority (at the same time, and regarding the same topic and the same epistemic good). Arguably, this is an unwelcome result which sits ill with our pre-theoretical understanding of the notion.Footnote 8
In view of such problems of purely epistemic definitions, one may opt for a purely objective account according to which epistemic authorities are people (or, more generally, sources) who actually enjoy the relevant epistemic advantages over those who seek advice from them, whether or not the latter recognize this. This would preserve asymmetry, and it would allow us to disagree with people about their epistemic authorities. If someone who drank disinfectant or consumed horse dewormer justified this by deferring to Trump or Kickl as their authorities, the reply could simply be that these people are in reality not the agent’s authorities on medical matters, for the simple reason that they fail to enjoy the required epistemic competence in these fields.
Nevertheless, it may be conceded that proponents of purely subjective accounts rightly capture an agent-relative, subjective connotation of the notion of epistemic authority. We do say things such as: “A was your authority on the matter, all right, but he shouldn’t have been!” We can account for this by adopting a proposal from Jäger (forthcoming) which combines both the subjective and objective connotations of what he calls “recognized epistemic authority”:
EA1: A given epistemic source A is a recognized epistemic authority for an epistemic agent (or group of epistemic agents) S at time t, and relative to some set of propositions or subject matter p and set of epistemic goods G, iff S correctly believes A to be in a significantly advanced epistemic position relative to S at t, concerning p and G.Footnote 9
I shall work with this definition for now, yet argue below that further fine-tuning is called for.
Calling A a “source” is liberal enough to cover non-human players, such as intellectual or cultural traditions and AI systems. Whether and, if so, in which ways, the authority such systems might enjoy is parasitic upon human authority is a matter of debate. Such complexities are not immediately relevant for present purposes, however, so I shall set them aside and focus on the paradigm of human epistemic authority.
The qualification that S (correctly) believes A to be in an advanced epistemic position reflects the subjective aspects outlined above. Still, according to EA1, A fails to constitute an epistemic authority for S if A is not in fact epistemically better off than S; S’s belief to this effect must be true. Moreover, the source’s superiority is topic- and time-relative and relative to specific epistemic goods. Consequently, A may, at (or during) t, be S’s authority concerning some subject matter and epistemic good while S is, at the same time, A’s authority on another topic, or on the same topic but with respect to another epistemic good.Footnote 10 (For example, one scientist may harbor more true beliefs in a certain field of inquiry while another has deeper understanding.) Moreover, at some other time t’, things may have switched and S may have become A’s authority concerning p and G. Finally, requiring a significantly advanced epistemic position reflects that there must be some appropriate epistemic distance between the parties. For example, in cases where true belief is the central epistemic goal and A has 90% true beliefs in a given field while S is right 87% of the time, we would not normally consider A to be S’s epistemic authority.Footnote 11
2.2 Expertise Versus Authority and a Social-Role Account
EA1, I maintain, captures essential aspects of the notion of (genuine) epistemic authority, but it is still silent on one important detail. In order to see this, note, first, that it represents what may be called a competence approach to epistemic authority. The core idea is clear enough. If the epistemic goal is for example true belief, then according to the above definition, A is an epistemic authority vis-à-vis S only if A has more true beliefs (and, perhaps, fewer false onesFootnote 12) than S concerning the relevant subject matter. These may consist in not only object-level, first-order beliefs but also methodological beliefs about how to collect and assess evidence, how to draw appropriate conclusions, how to correctly base the relevant beliefs on the evidence, and so forth.
Second, note that according to EA1, not all authorities are experts and not all experts are authorities. Consider for instance one of the most influential accounts of expertise, the one proposed by Alvin Goldman. He argues that:
S is an expert about domain D if and only if (i) S has more true beliefs (or high credences) in propositions concerning D than most people do, and fewer false beliefs; and (ii) the absolute number of true beliefs S has about propositions in D is very substantial (Goldman, 2018:5).Footnote 13
According to this account, an expert’s true (and false) beliefs must meet certain quantitative thresholds relative to large epistemic communities. EA1 deliberately excludes such thresholds for the notion of epistemic authority. We should be able to say, for example, that parents with an average, non-expert education in algebra typically serve as authorities in algebra for their young children. Or to borrow an example from Croce (2019): In order for a grandmother to serve as an epistemic authority for her grandson about how fish breathe, she need not be an ichthyologist. Authority is a subject-relative notion. Whereas Goldman’s conditions (i) and (ii) make sense for expertise, they should not be incorporated in a definition of epistemic authority.
In addition to his threshold account, Goldman (2018) considers another take on expertise, and the ideas he lays out there, I submit, capture a feature that applies to epistemic authority as well. He suggests that:
S is an expert in domain D if and only if S has the capacity to help others (especially laypersons) solve a variety of problems in D or execute an assortment of tasks in D which the latter would not be able to solve or execute on their own. S can provide such help by imparting to the layperson (or other client) his/her distinctive knowledge or skills (Goldman, 2018:4).
Others concur. Quast (2018) says, following Goldman, that “the main function of expertise is sharing some knowledge for the benefit of someone else” (2018:13). David Coady argues that:
[an expert] is someone laypeople can go to in order to receive accurate answers to their questions (Coady, 2012:30).
The exact nature of this social-epistemic function is controversial. Croce (2018b, 2019) argues that we should not, or not exclusively, focus on expert-layperson relations but also consider experts as epistemic agents who promote disciplinary progress in the expert community. This fits Coady’s general description well: After all, receiving accurate answers to one’s questions is also a goal of expert-expert discourse; indeed, it is a goal of testimonial exchange generally.Footnote 14
I will not go into the general debate about the nature of testimony here; rather, I wish to apply this functional account of expertise to epistemic authority. Although not all experts are epistemic authorities and not all epistemic authorities are experts, the desideratum that epistemic superiors of this stripe be able to help others with their epistemic goals also applies to authorities. Inspired by Goldman’s (2001, 2018) account of expertise and Jäger’s (2016) notion of Socratic authority, which makes understanding a central goal of interacting with epistemic authorities, this point has been argued convincingly by Croce. He suggests that:
someone is an [epistemic] authority for another insofar as she responds to (some of) his epistemic needs, which may vary, depending on the circumstances, from getting knowledge to understanding a subject matter, and from receiving what he needs through the authority’s testimony to achieving epistemic goals on his own thanks to the authority’s guidance (Croce 2018a:479).
So let us incorporate a corresponding qualification:
EA2: A is a recognized epistemic authority for S at t, relative to a given set of propositions p and set of epistemic goods G, iff S correctly believes that A has the capacity to help S – in suitable circumstances and in virtue of being in a significantly advanced epistemic position at t relative to S, concerning p and G – accomplish S’s epistemic goals in G regarding p.
I mention epistemic goals concerning epistemic goods, since one may have different goals concerning the same goods. For example, one person may seek to acquire just as many true beliefs as she can get about some topic; another may strive for understanding.
Adopting terminology proposed by Joseph Raz in his theory of political authority (Raz 2006, cf. Croce 2018b), we may call this account a version of a service conception of epistemic authority. Someone is your authority on climate change, for example, iff you correctly regard that person as being able — in virtue of their advanced knowledge and understanding of climate change — to help you accomplish the epistemic goal of acquiring knowledge about, and/or advancing your understanding of, climate change.Footnote 15 Equipped with this picture of genuine epistemic authority, I now turn to false authority.
3 A Taxonomy of False Authority
The above analyses suggest that A is a false epistemic authority vis-à-vis S iff S falsely believes A to have the capacities that enable A to help S achieve S’s epistemic goals. False authorities lack some or all of the positive epistemic features ascribed to them by their clients. This characterization still covers different social-epistemic situations.
First, there are what may be called unintended false authorities, to be distinguished from fake authorities. I call agents “unintended false authorities” iff they constitute false authorities despite having no intention to be regarded as authorities, let alone actively promoting this. Fake authorities, by contrast, are false authorities who wish and intend to be regarded as authorities.
Unintended authority may occur in various contexts. Consider people who enjoy advanced knowledge or understanding in a given field of inquiry, yet are falsely credited with special competences in other, perhaps neighboring fields where they lack these competences. Davis (2016) discusses the example of Uma Narayan, an “Indian professor in a western academic space” who was frequently consulted by students on topics such as “Indian novels in English, … Goddess-worship rituals in South India,” etc., even though none of these topics remotely fell into Narayan’s area of expertise (Davis 2016:488). In such cases, clients illegitimately extend justified authority ascriptions to areas where they are unjustified. Davis considers this an epistemic injustice that potentially harms the authority, e.g., by consuming their time and straining their intellectual resources.
Fake authorities, by contrast, are false authorities who deliberately wish to be treated as authorities. “Fake” is used here as a privative adjective: a fake F is not a real F. Examples that spring to mind are fake weapons, or the popular use of the term to denote the pretense of certain physical reactions that do not in fact occur (e.g., during sex). Thus, whereas fake news is (a kind of) news, a fake gun is not a real gun and, let’s say, a fake yawn is not a real yawn.
However, fake authorities subdivide into two varieties. According to the taxonomy I suggest, some fake authorities suffer from a deluded self-concept about their epistemic capacities relative to their clients.Footnote 16 They falsely believe themselves to be sufficiently competent. I call such agents epistemic quacks. Famous examples include esoteric healers (such as Rasputin), visionaries (such as Baba Wanga) or — an interesting contemporary example — “modern druids” such as the German Karl Burghard Bangert alias “Burgos von Buchonia,” who says he was born 2500 years ago, raised by the great wizard Merlin, and dresses like the fictional druid Miraculix from the French comic book Asterix and Obelix. All this would be quite funny, were it not for the fact that Burgos is a member of the German sovereign citizen conspiracy movement, a Holocaust denier, an anti-Semite, and a hater of Muslims, who regularly gave hate speeches and was found hoarding weapons in preparation for taking over when the German state collapsed. “Nazi Gandalf,” as his friends also called him, managed to acquire followers of his own. (Meanwhile, following a court appearance, the authorities seem to have stopped him.)
In other cases, the fake authority does not believe herself to be sufficiently competent or even knows that she is incompetent. I call such agents epistemic charlatans. (Famous examples are confidence tricksters or imposters, such as Cagliostro or Frank Abagnale.) So, the difference between epistemic quacks and epistemic charlatans lies in their beliefs about themselves.Footnote 17 The following table (Table 1) summarizes the distinctions drawn so far:
I now turn to the phenomenon of predatory experts. Do they fit into our taxonomy of false authority? And if not, should they be counted, not as false, but as genuine authorities?
4 Are Predatory Experts Epistemic Authorities?
Lackey, as we heard in section 1, characterizes predatory experts as agents who misuse “their epistemic authority as a cover for predatory behavior within the domain of their expertise” (Lackey 2021:144). Consider her chief example, Larry Nassar, a sports-medicine physician who served for almost two decades as a team doctor for USA Gymnastics. In 2018, he was convicted of multiple counts of sexual assault and of possessing child pornography, whereupon he was sentenced to up to 175 years in prison.Footnote 18 In order to illuminate the kind of authority Nassar enjoyed, and the ways he abused it, it will be helpful to look at some of the statements his victims delivered in court in 2018.
“As it turns out,” McKayla MaroneyFootnote 19 maintains, “... Dr Nassar was not a doctor.”
[H]e in fact is, was, and forever shall be, a child molester and a monster of a human being. … He abused my trust, he abused my body and he left scars on my psyche that may never go away (quoted from Lutz 2018).
Another victim, Alexis Moore, says:
To me, he was like a knight in shining armor. But alas, that shine blinded me from the abuse. He betrayed my trust, took advantage of my trust and sexually abused me hundreds of times (quoted from Moghe and del Valle 2018).
Kate Mahon claims that:
Larry Nassar is a master manipulator. His conniving and calculating behavior not only tricked me, but he tricked my mom, who was present for all my appointments as a minor (quoted from Moghe and del Valle 2018).
Or consider a passage from the oft-quoted statement of Rachael Denhollander:
His ability to gain my trust and the trust of my parents, his grooming and carefully calculated brazen sexual assault was the result of deliberate, premeditated, intentional and methodological patterns of abuse (Denhollander 2018).
These quotations capture various aspects of the case. For the present discussion, the crucial point is that, according to the reports, Nassar abused his patients not only physically, but also by systematically exploiting and betraying their trust in him. But what kind of trust is at issue?
This is not the place to delve deeper into the extensive philosophical debate about the nature of trust.Footnote 20 However, it will be helpful to list some key aspects concerning the particular kind of trust involved in the present case. (i) First, the trust people placed in Nassar is so-called interpersonal trust, as opposed to mere reliance. To be sure, Nassar’s patients (and others) were also relying on him in the sense of having strong-enough predictive expectations toward him, that is, expectations that he would deliver sound medical practice. However, as many philosophers have argued in recent years, when expectations constitutive of reliance are frustrated, one may be disappointed but will not respond with reactive attitudes — such as feeling betrayed or let down — toward the person who failed to come through. (The FBI agent may rely on the criminal to take them to the location where it happened, but does not place trust in the criminal for doing so and will not feel let down when things go otherwise.)Footnote 21 Yet Nassar’s victims did have reactive attitudes toward him, which indicates that they had been engaged in an interpersonal trust relationship with him. Moreover, in virtue of Nassar’s social role as an expert and as a team doctor at national level, his victims’ trust in him was natural and well justified.
(ii) Second, the trust under consideration is so-called three-place trust, i.e., trust that people place in someone or something for something. In this case, Nassar’s patients (and others involved) trusted him, to begin with, to do the right things. Specifically, they trusted him to apply legitimate and appropriate diagnostic and therapeutic methods. This is a form of practical trust, i.e., of trusting an agent to act in appropriate ways. In cases like the present one, acting appropriately requires, first, choosing methods that are effective. When we see a doctor and there are diagnostic and therapeutic methods available that are more effective than others, then — if we can leave incentives in medicine and the pharma industry, financial considerations, etc. to one side — we trust the doctor to opt for one of these methods; if there is one most effective method, we trust them — other things being equal — to opt for the most effective one.
In addition, using legitimate or appropriate medical methods also means employing methods that are, given the circumstances, least costly in terms of pain, collateral physical damage, or collateral psychological damage or strain. It is morally obligatory for a physician to use such methods. For example, if we assume that monetary costs played no role, then, if our ankle fracture could be investigated by taking X-rays with a 40-year-old apparatus that would expose us to large doses of radiation, whereas taking an MRI image would deliver equally good or even better results without such costs, we would legitimately trust our physician to (a) know about this and (b) use MRI diagnostics. The doctor would, in these circumstances, be morally obligated to do so. Similarly, if a certain therapy T is as effective as another therapy T*, but T causes less physical or psychological pain or damage than T*, then we trust our physician to opt for T instead of T*, and they are morally obligated to do so. (Of course, in actual practice a typical challenge is to weigh effectiveness against cost.) The upshot is that, according to these criteria, Nassar deliberately applied inappropriate diagnostic or therapeutic methods.
(iii) Third, the situation involves epistemic trust. Epistemic trust in an epistemic authority involves, to begin with, trust to the effect that the authority possesses a sufficient amount of representational epistemic goods, as characterized in section 1. Thus, when we consult a physician, we trust them to have sufficient knowledge in their areas of expertise, enough well-grounded beliefs, understanding, and so on. In the Nassar case, his patients, and others involved, legitimately trusted him to know or at least have well justified beliefs about which diagnostic and therapeutic methods were appropriate. Was this trust in him misplaced?
As far as we can tell, Nassar probably had the relevant medical knowledge and thus did not, in this regard, fall short of fulfilling the (legitimate) normative expectations toward him. Yet, by acting as he did, Nassar deceived his victims and tricked them, with base motivations, into holding false beliefs about which methods were appropriate. Some reports suggest that he sometimes did this by explicitly lying to them and their parents; others suggest that he deployed other means of deception.Footnote 22 According to standard definitions, a speaker S lies to a hearer (or group of hearers) if S makes a statement which they believe to be false, with the intention that the hearer believe it to be true. But even in cases where Nassar did not lie in this sense (or a similar one), he still implied through his actions that the techniques he employed were appropriate.
According to more liberal definitions of lying, we may say that even in these cases, Nassar deceived by lying. According to David Smith, for example, “any form of behavior the function of which is to provide others with false information or to deprive them of true information” amounts to lying (Smith 2004:14; see also Vrij 2008:6). Since Nassar, simply by acting as he did, indirectly provided his patients with false information and deprived them of true information, he may, according to this definition, be said to have lied even when he did not state linguistically that what he did was right. In short, Nassar betrayed his clients’ epistemic trust either because he lied to them or because he led them in some other way to hold false beliefs about topics on which he was an expert.
So, did Nassar serve as a genuine or as a false epistemic authority for his patients? The core idea of the social-role account I have suggested is that epistemic authorities help their clients achieve their epistemic goals. Among the epistemic goals of Nassar’s patients, there surely was, at least implicitly, the goal of not being deceived about appropriate diagnostic and therapeutic techniques, but of acquiring correct beliefs about them. So, since Nassar systematically thwarted these goals, he should not be considered a genuine epistemic authority vis-à-vis the patients he deceived. In general, all this suggests that predatory experts fail to constitute genuine epistemic authorities. Although they are in an epistemically superior position relative to their clients, they fail to live up to one essential requirement of epistemic authority: that of responding to the epistemic needs of their interlocutors. I conclude that predatory experts are genuine experts but fail to constitute genuine epistemic authorities for those whose epistemic trust they betray.Footnote 23
However, on closer inspection, it turns out that the current wording of EA2 does not by itself adequately capture this point. As it stands, it only postulates that A has the capacity to help their clients, and that the latter correctly believe this. As the above discussion shows, predatory experts need not fall short of fulfilling this requirement. When Maroney, in the above quotation, claims that Nassar “was not a doctor,” presumably what she means is that he did not live up to the moral code of conduct and the self-professed norms that physicians are expected to respect (and usually do respect). That seems right. Yet Nassar did have special training and education as a sports-medicine physician and so had the capacity to help his clients get the right information about what would constitute appropriate diagnostic and therapeutic interventions. Unfortunately, then, the case cannot be subsumed under any of the three kinds of false authority so far distinguished. In all these cases, the agents — unintended false authorities, epistemic quacks, and epistemic charlatans — lack the relevant competences or abilities. What are we to say? Was the above conclusion premature? Do predatory experts constitute genuine epistemic authorities, after all?
5 A Reliability Constraint and False Authorities Who Are Systematic Trust Betrayers
The arguments discussed above suggest that we should not consider A to be a genuine epistemic authority vis-à-vis S if A merely fulfills the condition of having the capacity to help S pursue S’s epistemic goals, yet rarely or never exercises that capacity. I suggest, therefore, that we ameliorate EA2 by adding a reliability constraint to the effect that A’s competences or capacities reliably manifest themselves in A’s social-epistemic behavior. Call this a performance condition. We thus obtain:
EA3:
A given epistemic source A is a recognized epistemic authority for an epistemic agent (or group of epistemic agents) S at time t and relative to some set of propositions or subject matter p and set of epistemic goods G iff S correctly believes that:
(i) A has the epistemic capacity to help S — in suitable circumstances and in virtue of being in a significantly advanced epistemic position — accomplish S’s epistemic goals in G concerning p (competence condition), and
(ii) A reliably exercises this ability in suitable circumstances (performance condition).
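For readers who prefer their definitions regimented, EA3 can be given a rough logical gloss. The shorthand below (C for the competence condition, P for the performance condition, and the belief operator B_S) is my own notation, not part of the official formulation; "correctly believes" is unpacked as belief conjoined with truth:

```latex
% Shorthand (mine, not the paper's): C = competence condition (i),
% P = performance condition (ii), B_S = "S believes that".
% "S correctly believes that C and P" unpacks as belief plus truth:
\mathrm{Auth}(A,S,t,p,G) \;\leftrightarrow\; B_S(C \wedge P) \,\wedge\, C \wedge P
```

On this regimentation the failure modes discussed below line up cleanly: unintended false authorities, epistemic quacks, and epistemic charlatans falsify C (and with it P), while predatory experts satisfy C but falsify P.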
The concept of reliability is hotly debated in epistemology. Here we need not delve into traditional controversies about the notion, however. What I mean when I say that A “reliably exercises their ability in suitable circumstances” is that A exercises this ability with sufficient frequency, in suitable circumstances, and in a sufficiently large and varied run of deployments. The locution “sufficient frequency” is admittedly vague; yet, prospects for coming up with a precise threshold seem bleak. What this amounts to will heavily depend on the context, potentially including the agents’ specific goals, their practical stakes, and so on. For example, if a pedestrian topic is at issue, the threshold will be lower than if your life depends on whether the potential authority is mostly right in their judgments. It seems reasonable to require, though, that if an ability or a capacity is reliably exercised, the rate of actualization will range above 50 percent. In many kinds of situation, it may have to be considerably higher.Footnote 24
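The threshold talk in this paragraph can be made slightly more explicit. The following is only an informal gloss; the ratio r and the context-sensitive threshold function θ(c) are my notation, not the paper's:

```latex
% Informal gloss; m, n, and the threshold function theta are my notation.
% n = suitable circumstances for exercising the ability, m = actual exercises,
% counted over a sufficiently large and varied run of deployments.
r = \frac{m}{n}, \qquad
\text{$A$ reliably exercises the ability iff } r \ge \theta(c),
\qquad \theta(c) > 0.5 \text{ for every context } c
```

High practical stakes push θ(c) upward; the text fixes only the lower bound. Note that the ratio alone does not capture the "sufficiently large and varied run" requirement, which remains a side condition on how m and n are counted.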
EA3 excludes predatory experts, since their capacity to support their clients does not reliably manifest itself in their social-epistemic behavior. On the contrary, they systematically deceive and misinform their clients about important questions. This may happen implicitly or indirectly, or explicitly by flat-out lying. However, if predatory experts fail to constitute genuine authorities according to EA3, then into which category of false authority do they fall?
A moment of reflection reveals that none of the three types of false authority so far discussed covers predatory experts. Predatory experts act, as we may say, as systematic epistemic-trust abusers. But there is a difference between epistemic charlatans and predatory experts such as Nassar: charlatans fail to be epistemically competent. The same holds for epistemic quacks and unintended false authorities: in all three cases, the agents fail to constitute genuine authorities because they lack the relevant epistemic capacities. Not so with predatory experts who, as we have seen, have the relevant capacities. This suggests an ameliorated taxonomy which accounts for the performance condition of genuine authority discussed above. Predatory experts are fake authorities who fail to fulfil the performance condition.
Moreover, in the same way that we distinguished incompetent false authorities who believe themselves to be competent (epistemic quacks) from those who don’t believe themselves to be competent (epistemic charlatans), we may distinguish fake authorities who believe themselves to perform reliably (unwitting non-performers) from those who don’t believe this (witting non-performers).Footnote 25 Nassar, the evidence suggests, is a witting non-performer. Examples of unwitting non-performers are harder to come up with, but good candidates are people who are competent, yet fail to reliably exercise their competence and do not realize this because, e.g., they rationalize their unreliable performance. The following table (Table 2) summarizes these distinctions and lists what I believe are good examples:
False authorities, I said in section 3, lack some or all of the positive epistemic features ascribed to them by their clients. EA3 says that these features include (i) a social-epistemic competence condition and (ii) a performance condition. A false authority is someone whose clients falsely believe that they fulfil the competence condition EA3 (i) and the performance condition EA3 (ii). I introduced the notions of unintended false authority, epistemic quackery, and epistemic charlatanry to characterize agents who, in their own ways, fail to meet the competence condition EA3 (i). When this is so, they a fortiori cannot fulfil EA3 (ii): if they lack the relevant competences, they cannot reliably exercise them. Predatory experts, by contrast, fulfil the competence condition EA3 (i), but fail to fulfil the performance condition EA3 (ii). So, in this case, S correctly believes that the purported authority has the relevant capacities and could help S with S’s epistemic goals, but falsely believes that they reliably exercise these capacities.
6 Summary, Merits of the Account, and Questions for Future Research
I have argued for a social-role account of epistemic authority according to which epistemic authorities are agents who, in virtue of being in an advanced epistemic position relative to their clients, can help the latter achieve their epistemic ends. Such authorities can, but need not be, experts in a given field of inquiry. My arguments yielded a distinction between three kinds of false authority, namely (i) unintended false authority, (ii) epistemic quackery, and (iii) epistemic charlatanry. I then discussed what Lackey calls “predatory experts” and argued that, since experts of this stripe systematically deceive their clients and undermine the latter’s epistemic goals, they do not meet the requirements of my social-role account. However, in order to bring this point home, the suggested definition of epistemic authority EA2 needed fine-tuning. It had to be supplemented by a reliability constraint to the effect that the capacity to help others also reliably manifests itself in the authority’s social-epistemic behavior. Equipped with this constraint, I then introduced the category of non-performers, i.e., of false authorities who are competent, yet systematically exploit the epistemic trust they receive from their clients to deceive and take advantage of them.
The account I suggest clarifies the difference between experts and authorities. Predatory experts, naturally enough, constitute experts. But they fail to be epistemic authorities because they fail to reliably exercise their capacity to help others with their epistemic goals and hence fail to reliably fulfil the function the social-role account requires of genuine authorities. Moreover, and most importantly, the account provides a taxonomy of the most important kinds of false epistemic authority.
Questions for further research raise their heads. Perhaps the most pressing one is which additional skills and virtues epistemic authorities must have in order to care for the needs of their clients and to effectively help them achieve their epistemic ends. I have argued that epistemic authorities can help others “in virtue of their advanced epistemic position,” while this capacity must also reliably manifest itself in their social-epistemic behavior. But this is not the end of the story. Consider, for example, understanding. In order for an epistemic authority effectively to advance interlocutors’ understanding of a given subject matter, they must be able to interact in specific ways with them. This may involve, for example, a social-epistemic virtue that Jäger and Malfatti (2020) call “epistemic empathy” — the ability to project oneself into one’s interlocutor’s system of thought, in order to see how cognitive dissonances in it can rationally be resolved. When the interlocutors are laypersons or novices, it may require specific “novice-oriented abilities” (Croce 2018b), including intellectual generosity, maieutic abilities, and so on. Further exploration of such intellectual and didactic virtues must await another occasion.
This paper does not directly solve the practical problems that false authorities create in our societies and our Lebenswelt. Often it is hard to tell whether some person or source is a genuine or a false authority, and even when we suspect a false one, we may not immediately be able to tell whether it is an epistemic quack, a charlatan, a predator, or a source of yet another kind that we should shun. Conceptual clarifications and taxonomies, however, are important pieces in the toolbox to be carried along when we wish to identify fake authorities and prevent them from contaminating our social-epistemic environment.Footnote 26
Data Availability
Not Applicable.
Notes
See, e.g., Der Spiegel (online), 16.11.2021, “Entwurmungsmittel in Österreich ausverkauft – weil FPÖ-Chef es bei Corona empfiehlt.” The FPÖ politician in question is Herbert Kickl. For similar incidents in the USA, see Martin Pengelly’s “You are not a horse: FDA tells Americans Stop Taking Dewormer for Covid,” The Guardian (online), August 23, 2021.
See also Donald Trump’s (in)famous remark on injecting disinfectant against Covid-19 infections: “And then I see the disinfectant, where it knocks it out in a minute. One minute. And is there a way we can do something like that, by injection inside or almost a cleaning. Because you see it gets in the lungs and it does a tremendous number on the lungs.” (https://trumpwhitehouse.archives.gov/briefings-statements/remarks-president-trump-vice-president-pence-members-coronavirus-task-force-press-briefing-31/). It is hard to believe that people responded to these strings of words by injecting or swallowing hydroxychloroquine, but some did.
Consider, e.g., the Jonestown massacre in 1978, the Heaven’s Gate collective suicide in 1997, or the mass suicide in 2000 of members of the “Movement for the Restoration of the Ten Commandments” in Uganda. Well-known examples of homicide include the brutal murders committed by members of the “Manson family” in the 1960s, or the terrorist attack on a Tokyo subway in 1995 by the Ōmu Shinrikyō sect.
Nor will I discuss potential forms of epistemic injustice done to those who are not identified as genuine epistemic authorities on a given subject matter even though they should be. Such injustice may concern an epistemic authority’s capacity or activity as a knower, inquirer, epistemically reliable testifier, etc. On this topic, see Grasswick (2018).
For the debate about the notion, see, e.g., Zagzebski (2012, 2014, 2016), Keren (2007, 2014), McMyler (2011, 2014), Anderson (2014), Coady (2014), Jäger (2016, forthcoming), Lackey (2016), Wright (2016), Dormandy (2018), Croce (2018a, 2018b, 2019), Bungum (2018), Popowicz (2019), Constantin and Grundmann (2020), Jäger and Malfatti (2020), Stewart (2020), Bokros (2021), Grundmann (2021), Hauswald (2021), Watson (2022), and Wolkenstein (2024).
For a similar account, see also Constantin and Grundmann (2020).
Of course, if one is justified in thinking someone an authority, then one is justified in taking their beliefs to be authoritative, whether or not they actually are situated in a better epistemic position. Justified beliefs can be false.
An anonymous referee asked what my account says about agents who, in virtue of being in a demon world, hold false but justified beliefs about who their authorities are. However, my definition is clear on this: Since their beliefs are false, the sources they identify fail to be recognized epistemic authorities for them.
In earlier work, I have relativized epistemic authority to domains. Jonathan Matheson convinced me that relativizing to propositions or subject matters might be more helpful since there are cases where someone is an authority regarding just one proposition. (I call you in Kuala Lumpur in order to receive information about what the weather is like there.) Though “domain” could be read so as to cover such cases as well, I concede that this would stretch the term’s standard meaning.
Note, however, that what counts as a sufficient epistemic distance may vary with contextual factors. If the stakes are high, a fairly small distance may be significant.
I am hedging this condition since several complexities come up. Suppose that A has altogether not only more true beliefs than S about some topic, but also more false beliefs. A may nonetheless be the authority, e.g., if A still has more true and fewer false methodological beliefs pertaining to the domain or field of inquiry. For this would put A in a better position to improve their system of thought in the long run and to help S improve their system. For a detailed discussion of this topic, see Hauswald (2024), chapters 6.3 and 6.4, who argues that A must have sufficiently more true beliefs and not disproportionately many false beliefs.
I have substituted “(i)” and “(ii)” for “(A)” and “(B)” in the original; see also a slightly different formulation in the same spirit in Goldman (2001).
Cf. for example Hinchman (2012), who argues that being “responsive to the addressee’s context-sensitive epistemic needs” is a normative feature of “warranted testimonial belief” (2012:49). Dormandy (2020) concurs that “what hearers trust speakers for is not just information in the abstract, but epistemic care in the form of information tailored to their needs.”
EA2 allows us to account for experts who also constitute authorities for others. My distinction between experts and authorities is not meant to rule out that there are, in addition to authorities who fail to be experts and experts who fail to constitute authorities, expert authorities.
This may be due to overestimation of their own competences or underestimation of their clients’ competences, or both.
The terminology here is partly stipulative. In some contexts, “quack” or “quackster” may be used synonymously with “charlatan.”
The central fact is that Nassar sexually abused hundreds of girls and young women by performing medically unnecessary and sexually motivated pelvic examinations on them. Coaches and colleagues who thought highly of him regularly sent him patients, thereby “feeding him victims” (Mack 2018).
The real names of many of the victims are public and mentioned in the media, and the evidence suggests that they have consented to their names being associated with their statements.
It is controversial whether or not interpersonal trust, though not being reducible to mere reliance, still involves it. The standard counterexample against making such reliance a requirement of interpersonal trust is so-called therapeutic trust, where the trustor’s confidence does not meet the relevant threshold. The standard example (see McGeer 2008:241) is a parent’s trust in their teenage children that they won’t trash the house when the parents are away for the weekend. The parents may hand over the house to the kids for the weekend even though they are far from confident that the kids won’t trash it.
Scott Hill raised the question whether Nassar may not have deceived himself about which methods were appropriate. I agree that self-deception is generally to be taken into account regarding predatory experts. The public information we have about the Nassar case suggests, in my eyes, that he was aware that he was deceiving others and abusing his authority. Below I’ll return to this point by distinguishing between witting and unwitting non-performers.
This argument does not rule out that, due to their expertise, predatory experts still serve as authorities for other agents on whom they do not prey, or even for agents on whom they do prey, but on other occasions. They may, e.g., write papers that correctly and authoritatively inform others about topics from their area of expertise.
An anonymous referee raised the question whether the authority’s “reliable exercise” of their ability to help others amounts to exercising that ability successfully or whether, again, in an evil-demon scenario where A tries their best but falsely thinks they are successful, this would still constitute a reliable exercise of their social-epistemic abilities. My answer is: no, it would not; that’s the very point of the concept of reliability and of (new) evil demon arguments. They concede that agents in a demon world do not form their beliefs (or perceptual contents, etc.) reliably (in some standard reliabilist sense), even though they may still be epistemically justified in these beliefs.
Thanks to Rico Hauswald for suggesting this distinction to me.
For helpful discussions I am grateful to the audiences at the Bled Epistemology Conference, June 5–9, 2023, organized by Sarah Wright, and at the conference New Waves in the Philosophy of Epistemic Authority and Expert Testimony, organized by Rico Hauswald and Pedro Schmechtig, October 5–6 in Dresden. Special thanks go to Anne Bartsch, Michel Croce, Katherine Dormandy, Thomas Grundmann, Scott Hill, Arnon Keren, Pedro Schmechtig, Johanna Stüger, Jamie Watson, an anonymous referee for Acta Analytica, and especially to Rico Hauswald who extensively commented on two earlier drafts of this paper.
References
Anderson, C. (2014). Epistemic authority and conscientious belief. European Journal for Philosophy of Religion, 6, 91–99.
Black, M. (1983). The prevalence of humbug. In id., The prevalence of humbug, and other essays (pp. 115–145). Ithaca, NY: Cornell University Press.
Bokros, S. E. (2021). A deference model of epistemic authority. Synthese, 198, 12041–12069.
Bungum, D. J. (2018). Preemptionism and epistemic authority. Quaestiones Disputatae, 8, 36–67.
Coady, D. (2012). What to believe now. Malden/Oxford: John Wiley & Sons.
Coady, D. (2014). Communal and institutional trust: Authority in religion and politics. European Journal for Philosophy of Religion, 6, 1–23.
Constantin, J., & Grundmann, T. (2020). Epistemic authority: Preemption through source sensitive defeat. Synthese, 197(9), 4109–4130.
Croce, M. (2018a). Expert-oriented abilities vs. novice-oriented abilities: An alternative account of epistemic authority. Episteme, 5, 476–498.
Croce, M. (2018b). Epistemic paternalism and the service conception of epistemic authority. Metaphilosophy, 49(3), 305–327.
Croce, M. (2019). On what it takes to be an expert. Philosophical Quarterly, 69, 1–21.
Davis, E. (2016). Typecasts, tokens, and spokespersons: A case for credibility excess as testimonial injustice. Hypatia, 31(3), 485–501.
Denhollander, R. (2018). Rachael Denhollander’s full victim impact statement about Larry Nassar. CNN (online). https://edition.cnn.com/2018/01/24/us/rachael-denhollander-full-statement/index.html. Accessed 23 Sept 2023.
Dormandy, K. (2018). Epistemic authority—Preemptive reasons or proper basing? Erkenntnis, 83, 773–791.
Dormandy, K. (2020). Introduction: An overview of trust and some key epistemological applications. In K. Dormandy (Ed.), Trust in epistemology. Routledge.
Frankfurt, H. G. (2005). On bullshit. Princeton: Princeton University Press.
Goldman, A. (2001). Experts: Which ones should you trust? Philosophy and Phenomenological Research, 63(1), 85–109. Reprinted in E. Selinger & R. C. Crease (Eds.), The philosophy of expertise (pp. 14–38). New York: Columbia University Press, 2006.
Goldman, A. (2018). Expertise. Topoi, 37(1), 3–10.
Grasswick, H. (2018). Understanding epistemic trust injustices and their harms. Royal Institute of Philosophy Supplements, 84, 69–91.
Grasswick, H. (2020). Reconciling epistemic trust and responsibility. In K. Dormandy (Ed.), Trust in epistemology (pp. 161–188). Routledge.
Grundmann, T. (2021). Preemptive authority: The challenge from outrageous expert judgments. Episteme, 18, 407–427.
Hauswald, R. (2021). The weaknesses of weak preemptionism. The Philosophical Quarterly, 71(1), 37–55.
Hauswald, R. (2024). Epistemische Autoritäten: Individuelle und plurale. Springer/Metzler.
Hinchman, E. (2012). Can trust itself ground a reason to believe the trusted? Abstracta 6 (Special Issue VI), 47–83.
Jäger, C. (2016). Epistemic authority, preemptive reasons, and understanding. Episteme, 13(2), 167–185.
Jäger, C., & Malfatti, F. I. (2020). The social fabric of understanding: Equilibrium, authority, and epistemic empathy. Synthese, 199(1–2), 1185–1205.
Jäger, C. (forthcoming). Epistemic authority. In: J. Lackey, & A. McGlynn (Eds.), The Oxford handbook of social epistemology, Oxford, New York: Oxford University Press.
Keren, A. (2007). Epistemic authority, testimony and the transmission of knowledge. Episteme, 4, 368–381.
Keren, A. (2014). Zagzebski on authority and preemption in the domain of belief. European Journal for Philosophy of Religion, 6, 61–76.
Lackey, J. (2016). To preempt or not to preempt. Episteme, 13(4), 571–576.
Lackey, J. (2021). Preemption and the problem of the predatory expert. Philosophical Topics, 49(2), 133–150.
Lewis, C.S. (2009). The case for Christianity, New York: HarperCollins Publishers, originally published 1943/44.
Lutz, T. (2018, January 24). Victim impact statements against Larry Nassar: “I thought I was going to die”. The Guardian (online). https://www.theguardian.com/sport/2018/jan/24/victim-impact-statements-against-larry-nassar-i-thought-i-was-going-to-die. Accessed 23 Sept 2023.
Mack, J. (2018, January 29). Larry Nassar’s ‘perfected excuse’: He was a doctor. MLive (online). https://www.mlive.com/news/2018/01/why_nassar_got_away_with_it_fo.html. Accessed 27 Sept 2023.
McGeer, V. (2008). Trust, hope and empowerment. Australasian Journal of Philosophy, 86(2), 237–254.
McLeod, C. (2020). Trust. In E. N. Zalta & U. Nodelman (Eds.), The Stanford Encyclopedia of Philosophy (Fall 2023 Edition). https://plato.stanford.edu/cgi-bin/encyclopedia/archinfo.cgi?entry=trust
McMyler, B. (2011). Testimony, trust, and authority. Oxford: Oxford University Press.
McMyler, B. (2014). Epistemic authority, preemption, and normative power. European Journal for Philosophy of Religion, 6, 101–119.
Moghe, S., & del Valle, L. (2018, January 17). Larry Nassar’s abuse victims, in their own words. CNN (online). https://edition.cnn.com/2018/01/16/us/nassar-victim-impact-statements/index.html. Accessed 23 Sept 2023.
Pengelly, M. (2021, August 23). You are not a horse: FDA tells Americans stop taking dewormer for Covid. The Guardian (online). https://www.theguardian.com/us-news/2021/aug/23/fda-horse-message-ivermectin-covid-coronavirus. Accessed 27 Sept 2023.
Popowicz, D. M. (2019). Epistemic authority, autonomy, and humility. Dissertation University of California Irvine, UC Irvine Electronic Theses and Dissertations, https://escholarship.org/uc/item/99d4t6zq
Quast, C. (2018). Expertise: A practical explication. Topoi, 37(1), 11–27.
Raz, J. (2006). The problem of authority: Revisiting the service conception. Minnesota Law Review, 90, 1003–1044.
Smith, D. L. (2004). Why we lie: The evolutionary roots of deception and the unconscious mind. St. Martin’s Press.
Der Spiegel (2021, November 16). Entwurmungsmittel in Österreich ausverkauft, weil FPÖ-Chef es bei Corona empfiehlt. https://www.spiegel.de/wirtschaft/unternehmen/ivomectin-entwurmungsmittel-in-oesterreich-ausverkauft-weil-fpoe-chef-es-bei-corona-empfiehlt-a-42689153-8d8b-4e52-bab1-e86a293fd1b8. Accessed 27 Sept 2023.
Stewart, C. (2020). Expertise and authority. Episteme, 17, 420–437.
Vrij, A. (2008). Detecting lies and deceit: Pitfalls and opportunities (2nd ed.). Wiley.
Watson, J. C. (2022). A history and philosophy of expertise: The nature and limits of authority. Bloomsbury.
Wolkenstein, A. (2024). Healthy mistrust: Medical black box algorithms, epistemic authority, and preemptionism. Cambridge Quarterly of Healthcare Ethics, 2024, 1–10.
Wright, S. (2016). Epistemic authority, epistemic preemption, and the intellectual virtues. Episteme, 13(4), 555–570.
Zagzebski, L. T. (2012). Epistemic authority—A theory of trust, authority, and autonomy in belief. Oxford University Press.
Zagzebski, L. T. (2014). Epistemic authority and its critics. European Journal for Philosophy of Religion, 6, 169–187.
Zagzebski, L. T. (2016). Replies to Christoph Jäger and Elizabeth Fricker. Episteme, 13(2), 187–194.
Funding
Open access funding provided by University of Innsbruck and Medical University of Innsbruck.
Ethics declarations
Conflict of interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Jäger, C. False Authorities. Acta Analytica (2024). https://doi.org/10.1007/s12136-024-00594-3