Abstract
Groups, like individuals, can communicate. They can issue statements, make promises, give advice. Sometimes, in doing so, they lie and deceive. The goal of this paper is to offer a precise characterisation of what it means for a group to make an assertion and to lie. I begin by showing that Lackey’s influential account of group assertion is unable to distinguish assertions from other speech acts, explicit statements from implicatures, and lying from misleading. I propose an alternative view, according to which a group asserts a proposition only if it explicitly presents that proposition as true, thereby committing to its truth. This proposal is then put to work to define group lying. While scholars typically assume that group lying requires (i) a deceptive intent and (ii) a belief in the falsity of the asserted proposition, I offer a definition that drops condition (i) and significantly broadens condition (ii).
1 Group Action and Group Misconduct
Some actions are performed individually: I am writing this paper, and you are reading it. Others are performed together, by multiple agents, as a group. We say, for instance, that the Department of Geography hired a new professor, that the European Union has made huge investments in a new recovery plan, or that Amazon Watch strives to protect the rights of indigenous people. Group action is pervasive in our society, and not much would be accomplished without it.
Not all group action, however, is good or selfless. On the contrary, group actions often have atrocious consequences. Here are three telling examples that will accompany us for the rest of the paper:
(1) Between the thirties and the seventies, several gas companies (prominently Dupont, General Motors, and Standard Oil) made a concerted effort to promote the addition of tetraethyl lead to gasoline, leading to worldwide lead poisoning. This resulted in many premature deaths and in the widespread development of cognitive dysfunctions.
(2) Between 2008 and 2015, Volkswagen developed a “defeat device” to cheat the emission tests performed on their cars. This allowed them to market cars that emitted up to 40 times more pollutants than was allowed in the US, with adverse effects on pollution levels and on the health of US citizens.
(3) During the First World War, religious tensions between Muslim Turks and Christian Armenians progressively evolved into the persecution and systematic killing of Armenians in the Ottoman Empire. By the end of the War, it is estimated that up to one and a half million Armenians had been killed (Morris and Ze’evi 2019), and many others were victims of persecution and acts of barbarity.
When groups behave immorally, they often do what’s in their power to cover their tracks: they lie and deceive. Turkey still officially denies that the genocide of the Armenians ever occurred, and has taken active measures to promote denialist narratives internationally (Mamigonian 2015; Hovannisian 2015). Dupont, General Motors, and Standard Oil conspired to bury evidence of the deadly risks of lead poisoning, and lied for decades about the availability of healthier alternatives (Kovarik 2012). Volkswagen first lied about the emissions of their cars; then, when evidence of fraud emerged, they lied again by denying that they had cheated the emission tests.Footnote 1
It’s natural to say that Turkey, General Motors, and Volkswagen claimed that they didn’t do anything wrong, and that in doing so they lied to the public. But what do we mean, exactly, when we say that a group makes a claim, and when we say that a group lies? While the analysis of group action, belief, justification, and knowledge has attracted a lot of philosophical attention,Footnote 2 we have seen so far only a few attempts to characterise group assertion (Meijers 1999, 2007; Tollefsen 2020; Lackey 2017, 2020b; Townsend 2018; Ludwig 2019; 2020) and group lying (Staffel 2018, Lackey 2018a; 2020a, 2020b, Hormio 2022).
Building on scholarship in collective epistemology, linguistics, and speech act theory, this paper attempts to provide a precise characterisation of what it means for a group to make an assertion and lie. I begin by considering what group lying is, and then discuss how it relates to group assertion, group insincerity, and group deception.
2 What is Group Lying?
To approach the subject of group lying, it’s helpful to first consider what it means for an individual to lie. Scholars tend to agree that lying necessarily involves saying what you believe to be false.Footnote 3 So, the following are assumed to be necessary conditions for lying:
Necessary Conditions for Lying
A speaker S lies to an audience A only if:
(1) S asserts that p
(2) S believes that p is false
Many scholars also think that lying is essentially a form of intended deception, so that a further condition should be addedFootnote 4:
(3) S intends to thereby deceive A about p
Quite straightforwardly, this account of individual lying can be “collectivised”. In other words, we can derive a definition of group lying from it (Staffel 2018; Lackey 2018a; 2020a; 2020b; Hormio 2022):
The Group Lie Schema
A group, G, lies to B iff
(G-Assert) G asserts that p
(G-Insincere) G believes that p is false
(G-Deceive) G intends to thereby deceive B about p
The Group Lie Schema is a useful starting point to guide discussion on the nature of group lying. To understand what group lying is, we need to establish whether it really requires satisfying these three conditions, and what satisfying each condition amounts to. Specifically, this paper will tackle three questions:
(i) What does G-Assert require, exactly? In other words, what is group assertion?
(ii) Does group lying really require that the group intends to deceive the audience, as required by G-Deceive?
(iii) Does group lying really require that the group believes that p is false, as required by G-Insincere?
I will deal with each question in this order, starting from (i).Footnote 5
3 Group Assertion
3.1 Two Kinds of Group Assertion
Some authors before me (Hughes 1984; Tollefsen 2007; 2009; 2020; Meijers 2007; Fricker 2012; Lackey 2017; 2020b; Ludwig 2017, chaps. 13–14; 2019; 2020; Townsend 2018) have attempted to characterise group assertion and group testimony. Rather than tiring the reader by reviewing the strengths and weaknesses of each view, I will take Lackey’s proposal as a point of reference to develop mine. This is because Lackey (2017, 2018a, 2020b) has already highlighted problems with every account other than her own.Footnote 6 To keep the discussion concise, then, I will begin by presenting Lackey’s proposal, and discuss alternative views whenever they become relevant.
There is substantial consensus (e.g. Lackey 2017; Ludwig 2019; Tollefsen 2020) that groups can make assertions in at least two ways: (i) directly and together, through coordinated effort, or (ii) less directly, through a spokesperson. When two scientists write a journal article together, or when two demonstrators shout their slogans in unison, they make a joint assertion of the first kind—let’s call this a collective assertion.Footnote 7 The second form of group assertion concerns speech acts performed by a spokesperson who has the authority to speak for a group. Following Ludwig (2019, 2020), I call these proxy assertions. Official statements issued by companies and governments belong to this category.Footnote 8
To accommodate both kinds of group assertions, Lackey proposes a disjunctive account. A group can assert in either of the following two waysFootnote 9:
- (CA): A group G makes a collective assertion that p iff the members of G coordinate individual acts a1, … an, so that they all reasonably intend to convey that p together in virtue of these acts.
- (PA): A group G makes a proxy assertion that p iff p belongs to a domain d, and a spokesperson(s) S:
(a) Reasonably intends to convey the information that p in virtue of the communicable content of an individual act (or individual acts) of communication,
(b) Has the authority to convey the information in d, and
(c) Acts in this way in virtue of S’s authority as a representative of GFootnote 10
3.2 Problems with Lackey’s account
CA and PA handle previous examples efficiently. The joint manuscript and the demonstration chant involve coordinated actions designed to convey a message, so CA classifies them as (collective) group assertions. The statements issued by governments and corporations (like Turkey and Volkswagen) are typically delivered by a spokesperson (e.g. a representative, or a media agency) with the authority to speak on behalf of the institution, so they are also classified as (proxy) group assertions. As we are about to see, however, both characterisations face difficulties. I will start by highlighting a minor hitch for PA, before moving to a more pressing problem for both definitions.
PA ties proxy assertions to a particular domain d: the spokesperson must have the “authority to convey the information in d”. The idea is that spokespersons don’t merely repeat what they are told. They often enjoy some autonomy: they can address questions on the spot and improvise their declarations, within defined constraints. So, for instance, a legal representative of Volkswagen may be allowed to make impromptu statements about legal matters on behalf of the company (say, during interviews with the media), but they may have no authority to make comments on the company’s financial plans.
This seems right. By tying authority to domains of speech, however, PA risks losing track of the fact that there are many other factors that can make or unmake a spokesperson’s authority to speak for a group. Some have to do with context. For instance, the same lawyer may be allowed to speak for the company only on weekends, when the usual legal team is not operating. Some other factors may have to do with content. A company’s lawyer may be allowed to speak freely, but only if they don’t contradict previous statements made by the company. None of these restrictions have to do with domain. And we could expand the list indefinitely: the spokesperson’s authority might depend on the way in which the assertion is produced (e.g. Diego can speak on behalf of the Police Department on any matter, but only when he is wearing his uniform), the presence of defeaters (e.g. the Vice Dean can speak for the Department only when the Dean is absent), and so forth.
The bottom line is that domain is just one of the many factors that determine whether a spokesperson has the authority to assert for a group. Attempting to list them all in the definition would be hopeless. Instead, it’s preferable to simply require that the spokesperson does in fact have the authority to say what they said on behalf of the group, in the context in which they said it. Conditions (b) and (c) in PA can therefore be replaced by the more concise (b*)Footnote 11:
- (b*): S has the authority to convey p on behalf of G, and conveys p in virtue of that authority
This leads us to a second, more pressing problem for both PA and CA. Both definitions require that the group intend to convey a proposition. As we are about to see, however, it’s not clear that this condition is able to differentiate between a group asserting something and a group communicating something without asserting it. To understand why this is a problem, let’s first consider what asserting is, and how it differs from merely communicating something.
As with most definitions, there is substantial philosophical disagreement on what an assertion exactly is (Pagin and Marsili 2021). However, consensus is wide enough on some points. One is that “assertion” designates a direct and explicit speech act, so that a good definition should acknowledge the difference between assertions and implicatures (Gluer 2001; Stainton 2016; Pagin 2014, sec. 2; Alston 2000; Searle 1969; Borg 2019, but cf. García-Carpintero 2018). We might say, then, that a minimum desideratum for a definition of group assertion is to be able to distinguish between assertions and implicata.
Drawing this distinction is especially important given that our ultimate goal is to characterise group lying. A good definition of lying should distinguish lying from merely misleading (Saul 2012; Stokke 2013b; 2018).Footnote 12 To draw this distinction, condition G-Assert (from the Group Lie Schema) must be able to differentiate what a group asserts from what a group merely implies. Definitions like CA and PA, however, conflate the two notions, and therefore cannot distinguish misleading from lying.Footnote 13 An imaginary example can illustrate the point:
Misleading Smoke
Nicotella has invented a new kind of e-cigarette. Just before the launch, an internal review establishes that there is a problem with the e-cigarette’s battery: it can explode if it overheats. Nicotella’s executive board convenes to assess the risk. They decide to bury evidence of the problem and go ahead with the launch, while they attempt to fix the defect. To maintain plausible deniability, they also instruct their marketing division to refrain from making claims about the e-cigarette’s safety—ads can only claim that the smoke is safe. As a result, Nicotella’s ads state that the smoke of the e-cigarettes is safe, and poses no threat to one’s health.
The example illustrates how a group can mislead without lying. Let’s imagine that Nicotella’s official statement is strictly speaking true: inhaling the smoke of their e-cigarettes is safe. Nicotella is still aware that their e-cigarettes may explode, posing a much more immediate threat to their customers’ health. Nicotella’s assertion that the smoke is safe, while literally true, is designed to communicate something false: namely, that their devices pose no threat to their users.
Misleading Smoke illustrates why PA is inadequate as an account of proxy assertion. Nicotella only asserted that the e-cigarette’s smoke is safe, because this is all they explicitly stated in their ads. However, since the executive board’s reasonable intention was to convey that the e-cigarettes (and not only their smoke) are safe, PA incorrectly classifies the implied proposition as asserted. Crucially, the objection can be extended. Whenever a group implies something without asserting it, Lackey’s definition will incorrectly classify the implicature as an assertion.
Plugged into the Group Lie Schema, PA leads to similarly incorrect classifications. Since the board believes that it is false that the e-cigarettes are safe (and was aiming to deceive the public), the resulting definition classifies Misleading Smoke as a lie. But Nicotella misled the public without technically lyingFootnote 14: PA’s verdict is incorrect. The same problem arises with any misleading implicature. Crucially, these objections (about assertion and lying) apply also to CA, which (like PA) only requires that the group intend to convey some content, not that the group assert it.Footnote 15
This is already a damning objection to Lackey’s definition. But the problem runs deeper than this. PA and CA classify as group assertions every group speech act other than assertion (suggestions, hypotheses, assumptions, orders, etc.), since virtually any speech act is meant to convey a certain proposition. If Lackey’s account captures every group speech act, it is about as incorrect as a definition of group assertion can be. To illustrate the problem, consider an example involving advising:
Misleading Advice
Beautella is a company that sells beauty products. All their conditioners have the following statement printed on the label: We advise customers to use the Beautella conditioner with the Beautella shampoo. Beautella’s executive board is aware that there is no advantage in using their products together. In fact, some of Beautella’s studies even proved that using a Beautella conditioner with other companies’ shampoos leads to better results. However, admitting it wouldn’t be in Beautella’s interest: they prefer to deceive customers and maximise sales.
In Misleading Advice, Beautella doesn’t make an assertion at all: the label only contains advice concerning how to combine conditioner and shampoo. Nonetheless, the label implies that Beautella’s conditioner and shampoo are more effective together, which is false. Since Beautella intends to convey this message, PA incorrectly classifies this misleading implicature as an assertion, and hence as a lie. This is an important failure, since the company made no assertion at all. Both for the purpose of defining assertion and for the purpose of defining lying, PA and CA deliver verdicts that are dramatically off the mark.
3.3 Asserting as Taking Responsibility
PA and CA should be narrowed down. Ideally, a good definition should specify that an assertion must explicitly state something (as opposed to implying it), and incorporate a criterion that allows us to tell assertions apart from other speech acts.
Tollefsen (2020, p. 339) recently proposed an account of group assertion that might do the trick. She considers various definitions of individual assertionFootnote 16 that could be “collectivised” into an account of group assertion, concluding that commitment-based accounts fare best at this task. According to commitment-based accounts, a speaker asserts a proposition p only if they undertake responsibility for p being true.Footnote 17 From this characterisation, Tollefsen (2020, p. 339) derives the following account of group assertion:
- (GA): For a group G to assert that p, G must be committed to the truth of p.
This characterisation leaves open some questions. First, this is not a definition: GA only states a necessary condition for group assertion, but no sufficient condition(s). Second,Footnote 18 the nature of the commitment involved is left unspecified in GA.
Concerning the second issue, Tollefsen clarifies that she understands commitment in a Brandomian fashion (Brandom 1983; 1994). Broadly, on this view assertoric commitment is modelled as a responsibility to provide reasons in support of one’s claim, if challenged. Although this represents an improvement, it has been pointed out that assertions also give rise to another kind of responsibility: speakers are sanctionable for what they assert. If you assert that p, you stand to incur certain social sanctions if p turns out to be false: your reputation may be damaged, your audience might be entitled to criticise you, and so forth (Peirce, CP, MS; Green 2007; 2009; Graham 2020).
To accommodate these observations, assertoric commitment can be characterised as involving both components: (i) a responsibility to provide reasons if the assertion is challenged, and (ii) liability to social sanctions in case the proposition turns out to be false. This account of commitment can be further refined (see Tanesini 2016; 2020; Shapiro 2018; Marsili 2020b, 2021b), but for our purposes this constitutes a sufficient degree of approximation.
We can now go back to the problem of identifying sufficient conditions for asserting. Is being committed to a proposition (in the sense just established) sufficient for asserting? No, and for two reasons. The first is that a requirement of explicitness is missing. To assert that p, you must explicitly say that p—you must undertake commitment to p by stating that p, rather than implying that p.Footnote 19
The second reason is that asserting also involves presenting a proposition as true: unless you are communicating that p is true, you are not asserting that p (Frege 1948; Wright 1992, 34; Adler 2002, 274; Marsili 2018a; Marsili and Green 2021). Pagin (2004; 2009) argues that a speaker can undertake commitment to a proposition p by stating p explicitly without thereby communicating that p is true.Footnote 20 If this is right, it’s preferable to explicitly require that assertions communicate that their content is true.
In previous work (Marsili 2020b; 2021b; Marsili and Green 2021), I have offered an account of individual assertion that incorporates all these requirements. It reads as follows:
- (AC): A speaker S asserts that p iff (1) S utters an expression with content p, thereby (2) presenting p as true, and (3) undertaking assertoric commitment to p.Footnote 21
There are two ways to translate this definition into an account of group assertion. We might simply replace “a speaker S” with “a group G”. Or we might follow Lackey, and offer two separate accounts for collective and proxy assertion. Let me clarify how we might go about the latter.
For collective assertions, collectivising the account is straightforward: we simply replace the requirement that all group members “reasonably intend to convey that p” with the requirement that they all intentionallyFootnote 22 satisfy AC: each member must intentionally act so that they collectively utter an expression with content p, thereby presenting p as true and taking responsibility for its truth.
- (CAC): A group G makes a collective assertion that p iff the members of G coordinate individual acts a1, … an, so that they intentionally satisfy AC(1–3) together in virtue of these acts.
Adapting PA to the commitment-based model is a similarly easy feat. Condition (a) (the spokesperson reasonably intends to convey… etc.) should be replaced with conditions (1–3) from AC. Condition (b) (the authority requirement) can be adjusted to highlight that the relevant authority is the authority to undertake commitment to the proposition on behalf of the group. We get the following definition:
- (PAC): A group G makes a proxy assertion that p iff a spokesperson(s) S:
(a) (1) utters an expression with content p, thereby (2) presenting p as true, and (3) undertaking assertoric commitment to pFootnote 23 on G’s behalf, and
(b) has the authority to undertake commitment to p on G’s behalf, and undertakes commitment to p in virtue of that authority
This definition has no difficulty dealing with the examples encountered so far. Clause (a-1) rules out all content that is not conveyed literally, but merely implied. This correctly classifies the believed-false proposition in Misleading Smoke (that the e-cigarettes are indeed safe) as implied, not asserted—and therefore as misleading, not lying.Footnote 24 Similarly, non-assertoric speech acts (like Misleading Advice) are ruled out by condition (a): only genuine assertions explicitly present their propositional content as true, thereby committing the group to that content.
This definition represents an improvement on both Tollefsen and Lackey’s proposals. Unlike Tollefsen’s GA, it offers sufficient conditions for group assertion, and supplements this characterisation with a well-defined account of what assertoric commitment is. Unlike Lackey’s PA and CA, it distinguishes group assertions from other group speech acts and from group implicatures. Finally, unlike Lackey’s account, it discriminates between group misleading and group lying.Footnote 25
4 Group intention to deceive
4.1 Does Lying Require an Intent to deceive?
The Group Lie Schema states that a group assertion is a lie only if it is meant to deceive. However, this requirement is controversial. Consider the following example:
Impossible deception.Footnote 26
Pete took part in a robbery. He knows that his involvement in the crime was unmistakably recorded on CCTV camera, so that there is no chance that anybody will believe him if he denies that he was there. However, he also knows that (because of an odd regional law) if he denies being involved in the robbery the judge will set a lower bail. So he claims that he wasn’t present at the scene of the crime. He has no intention to deceive anyone. He only says this because he is planning to skip bail.
Intuitively, Pete is lying, even if he lacks an intention to deceive his audience. The example shows that you can lie without intending to deceive. Crucially, this is not an exception: countless examples purporting to show that not all lying is deceptive have been discussed in the literature.Footnote 27 This led most contemporary scholars to conclude that lying typically, but not necessarily, involves deception—so that asserting what you believe to be false is sufficient for lying.
4.2 Intending to Deceive and Intending to be Deceptive
Not all philosophers are happy with the recent divorce between lying and deception, however. Lackey (2013) once again plays a central role, for she devised a strategy to re-establish a link between lying and deceptive intent. Liars need not aim to deceive, she concedes, but a case can be made that liars always aim to be deceptive. “Intending to be deceptive” is a stipulative notion, which is meant to cover both attempts to make someone believe something and attempts to conceal information from someone. Lackey’s (2013, p. 241) alternative deception-condition can therefore be phrased as follows:
- (Deceptive): The speaker intends her assertion to make someone believe that p is true, or to conceal information from someone as to whether p
About the Impossible Deception example, Lackey would say that although Pete has no intention to make the police believe that he did not commit the crime, he intends to conceal information from the police as to whether he was present at the scene of the crime. If this is right, Pete’s assertion is intended to be deceptive after all.
As Fallis (2015) rightly pointed out, however, this reconstruction of the scenario is problematic. You cannot conceal information from someone who already has that information. In Impossible Deception, for example, there is no meaningful sense in which Pete can conceal from the police the information that he was at the robbery by denying that he was there. The police already possess that information: they have a video of Pete committing the robbery.Footnote 28
Of course, out of irrational wishful thinking, Pete could nonetheless intend to conceal information that the police already have, and intention is all that the Deceptive condition requires (Lackey 2020b). But what matters for our purposes is whether we can conceive a version of the scenario in which Pete reasons like a rational agent, and lacks this intention. Clearly, we can. Call this version of the scenario Impossible Deception*. In Impossible Deception*, Pete’s only motivation to make his statement is to skip bail, and he lacks an intention to conceal information from the police: Deceptive is not satisfied. Since Pete is clearly lying (just as much as he is lying in the scenario in which he forms the irrational intention), the unavoidable conclusionFootnote 29 is that Deceptive, too, is subject to the counterexamples that affect ordinary “deceptionist” definitions.
4.3 Group Lying Without Intended Deception
If individual lying need not involve deception, the same is likely true of group lying. Here’s an example that supports this generalisation, inspired by the Volkswagen scandal:
Impossible Group Deception
Volkswagen has been exposed. There’s unquestionable proof that the board of directors commissioned a “defeat device” to fake emission tests. But the company’s lawyers have identified a loophole in the law that will prevent anyone from being convicted. Thanks to an obscure technicality, as long as Volkswagen doesn’t admit wrongdoing, and maintains (in public and at trial) that the executive board was not aware of the defeat device, no one can be convicted. Accordingly, Volkswagen issues various statements claiming that the executive board was unaware of the defeat device. None of these statements is meant to convince the public or the jury: after all, the executive board’s involvement has been proven beyond doubt. Volkswagen simply needs to go on the record as stating this, if the board of directors is to avoid conviction.
In Impossible Group Deception, we have a situation that is comparable to its individual counterpart, Impossible Deception. Volkswagen’s statement is intuitively a group lie. However, it doesn’t involve an intention to be deceptive—deception is unattainable in the scenario. Ex hypothesi, the company’s statement is not motivated by a vain hope to conceal the board of directors’ involvement (since its involvement has already been proven beyond doubt). The goal is simply to avoid the social sanctions that would ensue if the false statement wasn’t made.
A strong case has been made that the “intention to deceive condition” can simply be scratched off definitions of lying,Footnote 30 as long as the remaining conditions require not only that the speaker say something that they believe to be false, but that they genuinely assert it (Fallis 2009; 2012; 2013; Stokke 2013a; 2018; Marsili 2021b). This is just what our definition of group lying requires, once we replace Lackey’s PA (which doesn’t fulfil this requirement), with PAC (which requires that the speaker literally assert a proposition). This solution matches a growing number of accounts of individual lying that drop the intention to deceive condition in favour of a commitment-based assertion condition (Marsili 2014; 2016; 2018b; 2020a; 2021b; Leland 2015; Viebahn 2017; 2021; Marsili and Löhr forthcoming, cf. also Carson 2006, 2010; Saul 2012). Of course, the insincerity condition (G-Insincere) will play a crucial role in the resulting definition of group lying (it will have to discriminate, alone, between lies and sincere assertions). Let’s have a look, then, at how group insincerity is best characterised.
5 Group Insincerity
According to G-Insincere (the second condition of the Group Lying Schema), a group lies only if that group believes that what they assert is false. To illustrate with an example: since General Motors was aware of the deadly effects of tetraethyl lead, they lied when they claimed that they were not aware of any health hazard. A question immediately arises: what do we mean when we say that General Motors was aware that what they claimed was false? What do we mean, more generally, when we say that a group believes something?
There is little scholarly consensus on these matters, and two factions dominate the debate. Summative views understand group belief as the ‘summation’ of the beliefs of the members of the group.Footnote 31 Non-summative views take group belief to be irreducible to its members’ beliefs: what a group believes is determined at the collective level, for instance by a collective agreement to accept a proposition as true.Footnote 32
In what follows, I will not take a stance on what constitutes group belief. My goal is more modest. I want to show that a group can lie even if the group does not believe that the proposition it asserted is false (regardless of how one understands group belief), and consider some alternative accounts of group insincerity.
5.1 Group lying and group uncertainty
Most existing discussion of group insincerity implicitly presupposes an on/off conception of belief: either you believe something, or you don’t. But doxastic states often come in shades. Your confidence in a proposition can fall on a spectrum that goes from high confidence to complete uncertainty, and these graded attitudes will often fall short of a full belief (or full disbelief).
This has implications for how lying should be defined. To illustrate: suppose that you’re somewhat confident that Carolina went to the gym today, but (given your uncertainty) you don’t quite believe that she did. You would be lying if you told someone that Carolina didn’t go to the gym, even if you don’t outright believe that what you said is false (there is consensus on this; see Isenberg 1964; Carson 2006; Whyte 2013; Marsili 2014; 2018b; 2021a; Marsili and Löhr forthcoming; Krauss 2017; Benton 2018; Trpin et al. 2021). I call this kind of lie a “graded-belief” lie. Definitions that require outright belief in the falsity of the proposition (like the Group Lying Schema) don’t classify graded-belief lies as lies.
This is a problem, because groups, like individuals, often operate under conditions of uncertainty. A food company may have to determine whether consumption of a certain additive is safe, given the minimal (but not null) risk that it might cause long-term health issues. Similarly, a government may need to determine whether a foreign country really represents a threat. In each of these cases, the process of deliberation may not lead to a neat yes/no verdict. The food company may conclude that the evidence is inconclusive, and that the additive is probably safe. And the government may determine that the foreign country almost surely doesn’t represent a threat, without ruling out the possibility that it might. In both cases, the group is somewhat confident that the relevant proposition is true, but (given the salient doubts) it would be incorrect to say that the group simply believes that the proposition is true.
I have not taken a stance on which conditions must be met for a group to hold a belief. But unless we adopt a strongly revisionist account of group belief (one that goes against our intuitive doxastic ascriptions), these examples suggest that there are cases where a group’s confidence in a proposition falls short of full belief. Graded-belief lies are therefore a possibility for groups, too, as illustrated by the following example (inspired by real events, cf. Hersh 2003):
Iraq War
It’s 2003, right before the US invasion of Iraq. The United States National Security Council (USNSC) is meeting to discuss whether military action would be warranted. The CIA has just received reports indicating that Iraq has bought uranium from Niger and is storing Weapons of Mass Destruction (WOMD) in secret facilities in the North-West of the country. But more reliable evidence indicates that the reports were fabricated, most likely by the Italian Secret Services (SISMI). After some discussion, the members of the USNSC agree that, although the possibility cannot be ruled out, it’s unlikely that Iraq possesses WOMD. However, the members of the USNSC also agree that this is a great chance to justify military intervention in Iraq. They decide to bury all the evidence contradicting the CIA reports, and to seal the records of the meeting. An official memorandum is then produced, with recommendations for the US Congress and for the public. The memorandum states that immediate action should be taken, because “Iraq possesses weapons of mass destruction”.
In the scenario, none of the members of the USNSC has ruled out the possibility that Iraq possesses WOMD: their doxastic attitude falls short of belief, so that it would be incorrect to say that the USNSC outright believes that its official statement is false.Footnote 33 Intuitively, however, the USNSC lied to the public.
Graded-belief group lies are not uncommon. To go back to two of our initial examples, it’s not unlikely that the heads of General Motors thought (perhaps driven by wishful thinking) that there was a chance that the effects of tetraethyl lead would turn out to be negligible for human health. Similarly, Philip Morris (who asserted for decades that tobacco is safe to smoke) may have thought that while the evidence of carcinogenic effects was convincing, conclusive proof was yet to be found. Group lies involving uncertainty are not merely a conceptual possibility. They are central cases, and as such they call for a refinement in our definition of group lying (cf. Ludwig 2019, fn 12).
Luckily, fine-grained accounts of insincerity that can handle these cases are available. We could take inspiration from Carson (2006; 2010) and Sorensen (2007, p. 256; 2011, p. 407), who have characterised insincerity as the absence of a belief in the truth of the asserted proposition (as opposed to an active belief in its falsity). The refined insincerity condition would read:
- (G-Insincere-weak):
-
G does not believe that p is true.
Plugged into the Group Lying Schema, this condition classifies Iraq War as a group lie. Arguably, however, it’s unable to track the intuitive distinction between lying and bullshitting, as shown by the following example (modified from Lackey 2020a, pp. 197–98):
Oil Company
After the oil spill in the Gulf of Mexico, BP began spraying dispersants in the clean-up process that were criticized by environmental groups for their level of toxicity. In response to this outcry, the executive management team of BP convened and its members jointly accepted to reply that the dispersants being used are safe and pose no threat to the environment, a view that was then made public through all the major media outlets. It later turned out that BP’s executive management team had no idea whether the claim was true, and they never even attempted to check whether it was. Their reply simply served their purpose of financial and reputational preservation.
Since BP didn’t even check whether the claim was true, its statement should arguably be classified as bullshit, rather than lying (Frankfurt 2005, p. 55; Saul 2012, p. 20; Marsili 2014; 2018b, p. 175; Falkenberg 1988, p. 93; Meibauer 2014, p. 162). However, G-Insincere-weak classifies it as a lie, since BP lacks a belief in the asserted proposition. If we want to keep these notions apart (cf. Lackey 2018a, p. 278; 2020a), G-Insincere-weak will not do, because it conflates group bullshit with group lying.Footnote 34
In earlier work (Marsili 2014, 2018b, 2021a, 2022a), I proposed an alternative way to characterise graded insincerity: an assertion is insincere if the speaker finds it is more likely to be false than true. Applied to groups:
- (G-Insincere-graded):
-
G is more confident in the falsity of p than in its truth.
According to G-Insincere-graded, Oil Company is not a lie: since BP didn’t assess the veracity of its claim, there is no sense in which BP is more confident in the falsity of p than in its truth. This criterion also correctly classifies Iraq War as a lie, since the council judged that their statement was more likely to be false than true.Footnote 35 However, as we are about to see, there are cases that this account, too, has trouble handling.
5.2 Group Lying and Group Intentions
It has been pointed out that an individual can lie by misrepresenting their intentions, rather than their beliefs (Marsili 2016). Suppose I promise my jealous fiancée that I’ll never kiss Gelsomina. I promise this because I’m aware that I have virtually no chance of kissing Gelsomina: she doesn’t reciprocate my infatuation with her—in fact, she finds me repulsive. However, I have not been honest with my fiancée: I fully intend to kiss Gelsomina should she suddenly change her disposition and try to kiss me. Intuitively, my promise is insincere: I lack an intention to act as I promise. Most people would agree that I lied to my fiancée.Footnote 36
Generalising, lying is not always a matter of misrepresenting doxastic states: sometimes it involves misrepresenting your intentions without misrepresenting your beliefs. And groups can lie in this way, too:
University Funding
Every year, the Faculty of Tetrapiloctomy receives funding from various donors. One of them, E-Corp, has just been involved in a terrible scandal. Students wrote a petition asking the faculty not to accept E-Corp funding in the future. But the faculty desperately needs money. They meet to consider their options. A professor notes that E-Corp has been going through financial hardship since the scandal, and will almost certainly cut all funding to academic institutions. All faculty members agree: they all believe that it is virtually impossible that E-Corp funding will be available in the future. At the same time, they collectively agree that funding is needed. They deliberate that if (against all likelihood) E-Corp funding were to become available, the faculty would accept it. But they also decide that it’s preferable to let the students believe that the faculty would not accept E-Corp funding. The next day, the faculty’s spokesperson sends an email to all students, solemnly pledging that “the faculty will never accept funding from E-Corp in the future”.
Intuitively, the Faculty of Tetrapiloctomy is lying about their funding plans. The case is analogous to the individual lie example, which involved a promise that was sincere in terms of belief (I believe I won’t kiss Gelsomina) but not in terms of intention (I intend to kiss her if possible). Similarly, although the faculty believes that funding won’t be available, it intends to accept it (if, against all odds, their predictions turn out to be mistaken). Their pledge is clearly mendacious. But the faculty is aware that E-Corp funding will almost certainly never be available, so the statement is not classified as a lie by G-Insincere-graded.
The example shows that a good insincerity condition should capture both insincere intentions and insincere beliefs. To incorporate both into the definition, we can take inspiration from existing speech-act theoretic analyses. Speech act theorists typically define insincerity as a discrepancy between the psychological state in which the speaker is and the state expressed by their utterance (Searle 1969; Falkenberg 1988; Marsili 2014; 2016; Stokke 2014; 2018). The underlying idea is that, by performing a speech act, a speaker can expressFootnote 37 psychological states, thereby representing themselves as being in those states (e.g., by shouting “ouch!”, I represent myself as being in pain). Whenever the speaker misrepresents the state that they are in, the speaker is insincere. Accordingly, group insincerity can be defined as follows:
A group G is insincere iff:
(Insincere-1) By performing a speech act F(p), G expresses a psychological state Ψ(p).
(Insincere-2) G is not in state Ψ(p).
This works for University Funding. The utterance “we solemnly pledge that the faculty will never accept funding from E-Corp” expresses two psychological states: an intention not to accept funding, and a belief that the faculty will not accept funding.Footnote 38 According to Insincere-2, the faculty is insincere if the faculty believes that they will accept funding, or if they intend to accept funding, or both. Since the faculty intends to accept funding, their statement is correctly classified as insincere.
This definition, however, cannot accommodate the difference between group lying and group bullshitting. In Oil Company, BP doesn’t believe that the dispersants they used are safe (they have no idea whether they are), so their statement would be classified as a lie. To rule out group bullshitting, the definition should be narrowed down. A solution is to replace Insincere-2 with Insincere-2*, which is derived from G-Insincere-graded:
- (Insincere-2*):
-
G is closer to Ψ-ing(¬p) than to Ψ-ing(p).
The notion of “being closer to one mental state than another” introduced here is a term of art. We saw that graded beliefs can be thought of as falling on a spectrum, with full conviction in a proposition at one end and full conviction in its negation at the other. For instance, in Iraq War, the USNSC was more confident in the falsity of their statement than in its truth, and therefore closer to believing that what they said is false. The notion of closeness is meant to apply to binary mental states like intentions, too. In University Funding, the faculty intends to accept funding: they are closer to intending to accept it than they are to not intending to accept it—so their statement is insincere by Insincere-2*’s lights.
So construed, Insincere-2* meets all the desiderata. Since Ψ-ing can be replaced with believing, it can draw all the distinctions that are drawn by G-Insincere-graded. And since Ψ-ing can be replaced with intending, it correctly classifies University Funding as a genuine lie.Footnote 39
6 Group Lying
It is now time to draw our conclusions. I argued that the Group Lying Schema should be refined, and I proposed three adjustments. First, against what was previously suggested in the literature, we should understand group assertion along the lines of PAC and CAC. Second, group lying does not require an intention to deceive. Third, group insincerity is best modelled in a way that allows for graded-belief lies and insincere intentions. The resulting definition reads as follows:
A group, G, lies iff
(G-1) G asserts that p (in the sense defined by PAC or CAC).
(G-2) By asserting that p, G expresses a psychological state Ψ(p).
(G-3) G is closer to Ψ-ing(¬p) than to Ψ-ing(p).
A qualification is in order at this point. While the present definition improves over previous ones, it’s still wanting in many respects. Many questions remain open—and depending on how we answer them, new difficulties will arise.
I did not attempt to settle, for instance, how group beliefs should be modelled. The examples introduced in this paper (University Funding, Iraq War, etc.) further complicate the task of modelling the attitudes relevant to group lying, as they bring more attitudes (credences, intentions) into the picture. While previous work assumed that an account of group belief would suffice (the only relevant possibilities being that the group believes that p is true or false), it turned out that a good account of group lying should also tell us something about how group credences and group intentions should be modelled.
Group credences in particular raise tricky questions. To keep the discussion manageable, this paper focused on cases involving consistent group credences: in Iraq War, every member of the USNSC agreed that their assertion was likely false. But how should we model cases in which the members of a group hold inconsistent attitudes towards the asserted proposition? We might have a case where 20% of the group believes that p is likely true, 10% that it is certainly true, 15% that it is certainly false, 25% that it is likely false, and the remaining 30% is simply uncertain. And the picture could be complicated further, by considering more fine-grained attitudes (and the graded attitudes that a group agrees to adopt). How we should aggregate attitudes in these cases, and under which conditions graded criteria like (G-3) would be satisfied, remain open questions.
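To make the aggregation problem concrete, here is a minimal sketch of one candidate rule, linear opinion pooling, applied to the mixed group just described. The numeric credences assigned to the verbal attitudes, and the 0.5 threshold used to test a doxastic reading of (G-3), are illustrative assumptions of mine, not part of the account defended in this paper.

```python
# Illustrative only: one possible aggregation rule (linear opinion pooling).
# The credence values assigned to each verbal attitude are assumptions.

ATTITUDE_CREDENCE = {
    "certainly true": 1.0,
    "likely true": 0.75,
    "uncertain": 0.5,
    "likely false": 0.25,
    "certainly false": 0.0,
}

def pooled_credence(distribution):
    """Linear pool: the weighted average of members' credences in p.

    distribution maps an attitude label to the fraction of members
    holding it; the fractions must sum to 1.
    """
    assert abs(sum(distribution.values()) - 1.0) < 1e-9
    return sum(frac * ATTITUDE_CREDENCE[att]
               for att, frac in distribution.items())

def satisfies_g3(distribution):
    """(G-3), read doxastically: the group is closer to believing ¬p
    than p iff its pooled credence in p falls below 0.5."""
    return pooled_credence(distribution) < 0.5

# The mixed group from the text:
group = {
    "likely true": 0.20,
    "certainly true": 0.10,
    "certainly false": 0.15,
    "likely false": 0.25,
    "uncertain": 0.30,
}

print(round(pooled_credence(group), 4))  # 0.4625
print(satisfies_g3(group))               # True
```

On these assumptions the group’s pooled credence in p comes out just below the threshold, so the doxastic reading of (G-3) would be satisfied; a different attitude-to-credence mapping, or a non-linear pooling rule, could easily reverse the verdict, which is precisely why the question is left open here.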
Many challenges, then, are still to be met before we get a firm grasp on what group lying is—and the ones I highlighted here are by no means the only ones. The phenomenon remains understudied: this paper has tied up some loose ends, but much work still needs to be done in order to understand what group assertion and group lying are.
Notes
‘Volkswagen Cheated Emissions Tests, UK Court Rules’, 06/04/2020. Euronews. 6 April 2020. https://www.euronews.com/my-europe/2020/04/06/volkswagen-used-defeat-devices-to-cheat-emissions-tests-uk-court-rules.
We might consider a fourth, more fundamental question: what is a ‘group’? Here I will simply rely on an ordinary language understanding of what a group is—taking it to refer to any social group, ranging from informal social groups to institutions. Different conceptions are available. For philosophical discussion, see e.g. List and Pettit (2011); Tuomela (2013); Epstein (2015, chap. 10; 2021, sect. 5); Ludwig (2016, 2017).
Tollefsen (2020) draws a further distinction here. She reserves the term “collective” for the former case (the demonstrators’ slogan), and calls the latter a “joint assertion”. This is to highlight that the former assertion involves a collection of individual speech acts, whereas the latter is one joint speech act performed by both speakers, together. This distinction is helpful, but not needed for our purposes.
An earlier discussion of proxy speech is found in Goffman (1981, pp. 144–46), who distinguishes between the animator (the person who physically produces the utterance), the author (who selects “the sentiments that are being expressed and the words in which they are encoded”, Goffman 1981, p. 145), and the principal (who is responsible for the resulting illocution; for discussion, see Dynel 2011, sec. 2). When a group makes a proxy assertion, the spokesperson instantiates the role of the animator (and can take the role of the author, cf. §3.2), and the group always plays the role of the principal (since the group is responsible for the resulting assertion).
Lackey adopts a different vocabulary: she uses the term authority-based assertions for proxy assertions, and coordinated assertions for collective assertions. Here I have adapted her definitions to my terminology.
Ludwig (2020, p. 307) proposes a simpler definition: “In proxy assertion one person or group (the principal) asserts something through another (the proxy) who speaks on the principal’s behalf”. While this characterization is arguably correct, it’s not very informative; PA represents a better starting point for our discussion.
Of course, the qualification “that p belongs to a domain d” should also be excised from the definiendum in PA. Concerns about domain aside, I find myself mostly in agreement with Lackey’s characterisation of authority (2017, §2): what matters for proxy assertion is de facto authority, which can be acquired in a variety of ways (through explicit agreement, force, inheritance, etc.). Conflicts of authority can surely arise. In such cases, to the extent that it is unclear whether the spokesperson had the authority to assert that p on G’s behalf, it will be equally unclear whether G has asserted that p. I regard this as a correct prediction of the account.
While there is disagreement as to whether misleading is morally preferable to lying or on a par with it (Higgs 1985; Jackson 1991; Adler 1997; Saul 2000; Williams 2002; Strudler 2009; Webber 2013; Berstler 2019), virtually everybody agrees lying is a narrower concept than misleading, and that a good definition should reflect that (but cf. Meibauer 2005; 2011; 2014). Crucially, Lackey accepts this desideratum (2018a, pp. 267–8, 2020b, ch. 5).
To be sure, this would be an advantage if PA and CA were accounts of group testimony. Lackey (2006; 2008) has convincingly argued that one can testify not only by asserting a proposition, but also by implying it. I concur, but repurposed as definitions of group assertion, PA and CA are simply incorrect, since they classify implicatures as assertions. Incidentally, whether these definitions work as accounts of group testimony is currently a matter of debate (Townsend 2020).
This is not to deny that the executive board’s behaviour is objectionable: it obviously is. Nicotella is guilty of intentional deception, and of putting its customers’ lives at risk. The point here is simply that (assuming that there is a meaningful conceptual distinction between lying and misleading) this statement should be classified as misleading, not lying.
Could this objection be blocked by noting that PA includes the clause “in virtue of the communicable content of an act”? The clause would help if it only captured expressions that literally mean p. But it doesn’t: implicatures are also conveyed in virtue of the communicable content of the act. Lackey (2006, 188; 2017, 31) explicitly stresses that this clause doesn’t exclude non-literal content. It’s instead meant to exclude cases in which the truth of a proposition is “illustrated” rather than “expressed”. To wit, cases like: “I sing “I have a soprano voice” in a soprano voice and I intend to convey the information that I have a soprano voice in virtue of the perceptual content of this assertion” (2006, 188), not in virtue of the propositional content of the assertion.
Tollefsen (2020) considers Neo-Gricean accounts, constitutive-rule accounts, and common-ground accounts, concluding that none would offer a good basis for an account of group assertion. I concur, although partly for independent reasons (see Marsili 2019; 2020a; 2020b). Although only in passing, Meijers (2007, pp. 104–5) also characterises group assertion in terms of commitment.
This view has been articulated in various ways. See Peirce, (CP, MS); Searle (1969; 1975); Grice (1978, p. 126); Brandom (1983; 1994); Searle and Vanderveken (1985); Green (1999; 2000; 2017); Alston (2000); Rescorla (2009); MacFarlane (2011); Krifka (2014; 2019); Marsili (2015; 2020b; 2021b); Tanesini (2016; 2019).
A third issue, that I will not address here, concerns the dynamics by means of which a group can undertake a commitment. Engaging in this broader debate would lead us astray; for an overview, the reader can refer to Schweikard and Schmid (2021).
To explicitly allow for the possibility of subsentential and elliptical assertions (such as “For you” said while handing over a letter, or an affirmative nod given in answer to a question), minor amendments can be introduced in this characterisation. I won’t consider them here, but the reader can refer to Alston (2000, 114–119), Cull (2019), and Marsili (2020b) for available solutions.
Ludwig notes that sometimes spokespersons represent the group “not with respect to the particular words and phrase, but rather with respect to the gist of what is conveyed” (2019: 320–1, cf. also Goffman 1981). It might be argued that in these cases the spokesperson does not undertake responsibility for p, but for a looser “bundle” of propositions p1-pn (cf. Bowker 2019). We would have a situation that resembles familiar cases of semantic underdetermination, but in which the source of the ambiguity is a division of labour between communicative agents, as opposed to underdetermination at the sentential level. While I have no space to discuss this complication here, I am confident that PAC can deal with this difficulty (either by applying existing accounts of semantic underdetermination, or by loosening condition (a3), or both).
Among them, lies under coercion (Siegler 1966, 129; Carson 1988), ‘bald-faced lies’ (Carson, Wokutch, and Murrmann 1982; Sorensen 2007), ‘knowledge lies’ (Sorensen 2010), ‘tell-tale sign lies’ (Krstić 2019; 2022; Krstić and Wiegmann 2022), ‘alternative motivation’ lies (Rutschmann and Wiegmann 2017; Sneddon 2020), and many others (Marsili 2016, sec. 9.2; 2021b, sec. 2.3; Sorensen 2018; 2022).
In addition to this, it is disputable that concealing information is a form of deception (Mahon 2007, p. 187; cf. Dynel 2020; Fallis 2020). Even if it turned out that Lackey’s refinement accommodates all counterexamples, then, one could still dispute that this restores a link between lying and intended deception.
Lackey (2020b, p. 172, fn 17) has since insisted that refusing to confess something that your audience already knows does amount to concealing information about p (because in failing to confess you are “concealing evidence which would be conveyed through [your] confession”). I find this unpersuasive: you can’t conceal a confession that has not been made (at most, you can withhold it—but Lackey explicitly acknowledges that withholding falls short of concealing, cf. Lackey 2018a, p. 64). Furthermore, even if we ignored this difficulty, Lackey’s definition can’t handle the Impossible Deception* scenario (where we stipulate that Pete lacks such intention). Nor can it handle many of the counterexamples referenced in footnote 24—in this respect, Lackey’s most recent book (2020b) simply refuses to engage with the relevant literature and the existing counterevidence.
Against this move, Lackey (2013) has argued that a definition lacking an intention to deceive condition would be unable to account for selfless assertions (as defined in Lackey 2007; 2018b). But this objection has since lost its force, for various reasons. First, Fallis (2015, 13) has argued that selfless assertions can be accommodated without introducing a requirement like Deceptive. Second, these putative counterexamples are too complex and controversial to be decisive: how we should interpret selfless assertions is currently a matter of lively debate (Engel 2008; Montminy 2013; Turri 2014; Milić 2017; Gaszczyk 2019), and many interpretations don’t support Lackey’s counterargument. Third, it’s not easy to see how a group can make selfless assertions. I have no space to fully justify this third claim, so I will simply note that some accounts of group belief seem to implicitly rule out the possibility of selfless group assertion.
See Quinton (1976). The main challenge for Summative views is to explain how individual beliefs aggregate into a collective belief: is there a threshold to be met? If so, which is it? If not, how do we tell whether a group believes a proposition? Unsatisfied by the answers available, most scholars now endorse non-summative views.
There are various versions of this view (Gilbert 1987; 2004; Tuomela 1992; 1995; Tollefsen 2003; Pettit 2003; List 2005). One of its problems is that it allows for voluntary beliefs (Wray 2001; 2003; Meijers 2002; Hakli 2006; 2007; Gilbert and Pilchman 2014). This runs counter to the idea that belief is a mental state that (unlike intention) is not the fruit of intentional deliberation. Furthermore, Lackey (2018a; 2020a; 2020b) notes that this feature makes Non-summative views ill-suited for characterising group lying, since groups could then avoid accusations of mendacity simply by strategically accepting any false proposition as true. Lackey’s point is questioned by Bright (2020), who suggests that a notion of group lying with this implication could still be warranted, if it fits some explanatory purposes. The relevance of this objection can be questioned, however, given that the goal here is to characterise and refine our ordinary concept of lying, not to stipulate an artificial one (but see Bright 2020 on whether our epistemological inquiry should pursue different goals).
I am stipulating this ex hypothesi, but the scenario might need to be adjusted depending on one’s account of group belief. Whatever the conditions for being in a certain doxastic state are, the scenario should be interpreted (and, if needed, amended) so that the USNSC meets the conditions for being confident that the statement is false, but not the conditions for full belief.
Arguably, G-Insincere-weak has a further problem: sincere assertions for which a group has low confidence are incorrectly treated as lies. Imagine a group G such that (i) G asserts that it will rain tomorrow, (ii) G is confident that it will most likely rain, but (iii) G isn’t confident enough to believe that it will. G’s statement is surely misleading (since G communicated more confidence than appropriate), but (against G-Insincere-weak) it would be odd to call it lying, since G is confident that what G said is most likely true.
Two caveats are in order here. First, as already noted in Marsili (2014; 2018b; 2021a; 2022a), criteria like G-Insincere-graded should not be understood as positing sharp boundaries. “Sincere” is a gradable adjective, whose strength depends (among other factors) on how confident the speaker is in the falsity of a proposition. The more confident the group is in the falsity of what it asserts, the more straightforward its lie. Second, a refinement might be needed to handle cases of misspeaking and self-deception (more generally: cases where G honestly believes that G is asserting a true proposition, but G actually believes p to be false). I consider refinements that might help in Marsili (2017, chap. A, 1.2; 2018b, 174, fn8; 176; 2021c), but alternative options are available (Sorensen 2011; Saul 2012, 15–19; Fallis 2012; Stokke 2014; Pepp 2018). Both caveats apply also to the derivative criterion discussed in the next section, Insincere-2*.
This claim is empirically supported: people overwhelmingly classify promises of this kind as lies (Marsili 2016).
The Faculty of Tetrapiloctomy has made a promise, rather than an assertion. Aren’t promises still ruled out by the G-assertion condition? Only if promises are not assertions. Against this assumption, it has been argued that whenever you promise that you will p, you also thereby assert that you will p. This point extends to any speech act that “illocutionary entails” an assertion: whenever you guarantee, swear, admit (etc.) that p, you are thereby asserting that p (Searle and Vanderveken 1985, pp. 24–25, pp. 129–37, pp. 182–92; Marsili 2015, sec. 3; 2016, sec. 3; 2020a, sec. 2; 2020b, sec. 5.1; 2021b, sec. 2). Indeed, all the conditions set by PAC are satisfied here: the faculty explicitly presented the proposition that they will not accept funding from E-Corp as true, thereby undertaking commitment to it (cf. Marsili 2021b, p. 3264). In other words, the definition of group assertion provided in this paper captures any illocution that illocutionary entails an assertion, including promises (Searle and Vanderveken 1985, pp. 129–37). What is a group promise, then? It would be too ambitious to provide a full answer in this paper, but the reader can refer to Meijers (2007, p. 106) for some tentative remarks (cf. also Hughes 1984, pp. 391–2).
References
Adler JE (1997) Lying, deceiving, or falsely implicating. J Philos 94(9):435–452
Adler JE (2002) Belief’s own ethics. MIT Press, Cambridge
Alston WP (2000) Illocutionary acts and sentence meaning. Cornell University Press, Ithaca
Benton MA (2017) Lying, belief, and knowledge. In: The Oxford handbook of lying. Oxford University Press, Oxford
Benton MA (2018) Lying, accuracy and credence. Analysis 78(2):195–198. https://doi.org/10.1093/analys/anx132
Berstler S (2019) What’s the good of language? On the moral distinction between lying and misleading. Ethics 130(1):5–31. https://doi.org/10.1086/704341
Betz-Richman N (2022) Lying, hedging, and the norms of assertion. Synthese 200(2):176. https://doi.org/10.1007/s11229-022-03644-8
Borg E (2019) Explanatory roles for minimal content. Noûs 53(3):513–539. https://doi.org/10.1111/nous.12217
Bowker M (2019) Saying a bundle: meaning, intention, and underdetermination. Synthese 196(10):4229–4252. https://doi.org/10.1007/s11229-017-1652-0
Brady MS, Fricker M (eds) (2016) The epistemic life of groups: essays in the epistemology of collectives. Mind association occasional series. Oxford University Press, Oxford. https://doi.org/10.1093/acprof:oso/9780198759645.001.0001
Brandom R (1983) Asserting. Noûs 17(4):637–650
Brandom R (1994) Making it explicit: reasoning, representing, and discursive commitment. Harvard University Press, Cambridge
Bright LK (2020) Group lies and reflections on the purpose of social epistemology. Aristot Soc Suppl 94(1):209–224. https://doi.org/10.1093/arisup/akaa011
Carson TL (1988) On the definition of lying: a reply to Jones and revisions. J Bus Ethics 7(7):509–514
Carson TL (2006) The definition of lying. Noûs 40(2):284–306
Carson TL (2010) Lying and deception. Oxford University Press, Oxford
Carson TL, Wokutch RE, Murrmann KF (1982) Bluffing in labor negotiations: issues legal and ethical. J Bus Ethics 1(1):13–22
Cull M (2019) When Alston met Brandom: defining assertion. Rivista Italiana Di Filosofia Del Linguaggio 13:36–50. https://doi.org/10.4396/09201902
Cullison A (2010) On the nature of testimony. Episteme. https://doi.org/10.3366/E1742360010000857
Davis WA (2003) Meaning, expression and thought. Cambridge University Press, Cambridge
Dynel M (2011) Revisiting Goffman’s postulates on participant statuses in verbal interaction. Lang Linguist Compass 5(7):454–465. https://doi.org/10.1111/j.1749-818X.2011.00286.x
Dynel M (2018) Irony, deception and humour: seeking the truth about overt and covert untruthfulness. De Gruyter Mouton, Berlin/Boston
Dynel M (2020) To say the least: Where deceptively withholding information ends and lying begins. Top Cogn Sci 12(2):555–582. https://doi.org/10.1111/tops.12379
Engel P (2008) In what sense is knowledge the norm of assertion? Grazer Philosophische Studien 77(1):99–113
Epstein B (2015) The ant trap. Oxford University Press, Oxford
Epstein B (2021) Social ontology. In: Zalta EN (ed) The Stanford encyclopedia of philosophy, Winter 2021 edition. Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/win2021/entries/social-ontology/
Falkenberg G (1988) Insincerity and disloyalty. Argumentation 2(1):89–97. https://doi.org/10.1007/BF00179143
Fallis D (2009) What is lying? J Philos 106(1):29–56
Fallis D (2012) Lying as a violation of Grice’s first maxim of quality. Dialectica 66(4):563–581. https://doi.org/10.1111/1746-8361.12007
Fallis D (2013) Davidson was almost right about lying. Australas J Philos 91(2):337–353. https://doi.org/10.1080/00048402.2012.688980
Fallis D (2015) Are bald-faced lies deceptive after all? Ratio 28(1):81–96. https://doi.org/10.1111/rati.12055
Fallis D (2020) Shedding light on keeping people in the dark. Top Cogn Sci 12(2):535–554. https://doi.org/10.1111/tops.12361
Frankfurt HG (2005) On bullshit. Princeton University Press, Princeton
Frege G (1948) Sense and reference (Über Sinn Und Bedeutung). Philos Rev 57(3):209–230
Fricker M (2012) Group testimony? The making of a collective good informant. Philos Phenomenol Res 84(2):249–276. https://doi.org/10.1111/j.1933-1592.2011.00565.x
García-Carpintero M (2013) Explicit performatives revisited. J Pragmat 49:1–17
García-Carpintero M (2018) Sneaky assertions. Philos Issues 28(1):40–57
Gaszczyk G (2019) Are selfless assertions hedged? Rivista Italiana Di Filosofia Del Linguaggio 13(1):92–99. https://doi.org/10.4396/09201909
Gilbert M (1987) Modelling collective belief. Synthese 73(1):185–204. https://doi.org/10.1007/BF00485446
Gilbert M (2004) Collective epistemology. Episteme 1(2):95–107. https://doi.org/10.3366/epi.2004.1.2.95
Gilbert M, Pilchman D (2014) Belief, acceptance, and what happens in groups: some methodological considerations. In: Lackey J (ed) Essays in collective epistemology. Oxford University Press, Oxford
Glüer K (2001) Dreams and nightmares: conventions, norms, and meaning in Davidson’s philosophy of language. In: Pagin P, Segal G, Kot̓átko P (eds) Interpreting Davidson. CSLI Publications, Stanford
Goffman E (1981) Forms of talk. University of Pennsylvania Press, Philadelphia
Graham PJ (2020) Assertions, handicaps, and social norms. Episteme 17(3):349–363. https://doi.org/10.1017/epi.2019.53
Green M (1999) Illocutions, implicata, and what a conversation requires. Pragmat Cognit 7(1):65–91
Green M (2000) Illocutionary force and semantic content. Linguist Philos 23:435–473
Green M (2007) Self-expression. Oxford University Press, Oxford. https://doi.org/10.1111/j.1467-9213.2009.618_7.x
Green M (2009) Speech acts, the handicap principle and the expression of psychological states. Mind Lang 24(2):139–163. https://doi.org/10.1111/j.1468-0017.2008.01357.x
Green M (2017) Assertion. Oxford Handb Online 1:1–25. https://doi.org/10.1093/oxfordhb/9780199935314.013.8
Hakli R (2006) Group beliefs and the distinction between belief and acceptance. Cogn Syst Res Cogn Joint Act Collect Intent 7(2):286–297. https://doi.org/10.1016/j.cogsys.2005.11.013
Hakli R (2007) On the possibility of group knowledge without belief. Soc Epistemol 21(3):249–266. https://doi.org/10.1080/02691720701685581
Hare RM (1952) The language of morals. Oxford Paperbacks, Oxford
Hersh SM (2003) The Stovepipe: how conflicts between the bush administration and the intelligence community marred the reporting on Iraq’s Weapons. The New Yorker, 19 October 2003. https://www.newyorker.com/magazine/2003/10/27/the-stovepipe
Higgs R (1985) On telling patients the truth. In: Lockwood M (ed) Moral dilemmas in modern medicine. Oxford University Press, Oxford
Holguín B (2019) Lying and knowing. Synthese. https://doi.org/10.1007/s11229-019-02407-2
Hormio S (2022) Group lies and the narrative constraint. Episteme. https://doi.org/10.1017/epi.2022.12
Hovannisian RG (2015) Denial of the Armenian genocide 100 years later: the new practitioners and their trade. Genocide Stud Int 9(2):228–247. https://doi.org/10.3138/gsi.9.2.04
Hughes J (1984) Group speech acts. Linguist Philos 7:379–395
Isenberg A (1964) Deontology and the ethics of lying. Philos Phenomenol Res 24(4):463–480
Jackson J (1991) Telling the truth. J Med Ethics 17:5–9. https://doi.org/10.1111/tct.12155
Kölbel M (2010) Literal force: a defence of conventional assertion. In: Sawyer S (ed) New waves in philosophy of language. Palgrave Macmillan, London, pp 108–137
Kovarik W (2012) Ethyl leaded gasoline. Environmental History (blog). 23 September 2012. https://environmentalhistory.org/about/ethyl-leaded-gasoline/
Krauss SF (2017) Lying, risk and accuracy. Analysis 77(4):726–734. https://doi.org/10.1093/analys/anx105
Krifka M (2014) Embedding illocutionary acts. Recursion. https://doi.org/10.1007/978-3-319-05086-7_4
Krifka M (2019) Layers of assertive clauses: propositions, judgements, commitments, acts. In: Propositionale Argumente im Sprachvergleich: Theorie und Empirie
Krstić V (2019) Can you lie without intending to deceive? Pac Philos Q 100(2):642–660. https://doi.org/10.1111/papq.12241
Krstić V (2022) On the connection between lying, asserting, and intending to cause beliefs. Inquiry. https://doi.org/10.1080/0020174X.2022.2111344
Krstić V, Wiegmann A (2022) Bald-faced lies, blushing, and noses that grow: an experimental analysis. Erkenntnis. https://doi.org/10.1007/s10670-022-00541-x
Lackey J (2006) The nature of testimony. Pac Philos Q 87(2):177–197. https://doi.org/10.1111/j.1468-0114.2006.00254.x
Lackey J (2007) Norms of assertion. Noûs 41(4):594–626. https://doi.org/10.1111/j.1747-9991.2007.00065.x
Lackey J (2008) Learning from words: testimony as a source of knowledge. Oxford University Press, Oxford
Lackey J (2013) Lies and deception: an unhappy divorce. Analysis 73(2):236–248. https://doi.org/10.1093/analys/ant006
Lackey J (ed) (2014) Essays in collective epistemology. Oxford University Press, Oxford. https://doi.org/10.1093/acprof:oso/9780199665792.001.0001
Lackey J (2017) Group assertion. Erkenntnis. https://doi.org/10.1007/s10670-016-9870-2
Lackey J (2018a) Group lies. In: Michaelson E, Stokke A (eds) Lying: language, knowledge, ethics, politics. Oxford University Press, Oxford, pp 1–43
Lackey J (2018b) Selfless assertions. In: Meibauer J (ed) The Oxford handbook of lying, Oxford University Press, pp. 244–251. https://doi.org/10.1093/oxfordhb/9780198736578.013.18
Lackey J (2020a) Group belief: lessons from lies and bullshit. Aristot Soc Suppl 94(1):185–208. https://doi.org/10.1093/arisup/akaa007
Lackey J (2020b) The epistemology of groups. Oxford University Press, Oxford
Langton R (2018) Blocking as counter-speech. New work on speech acts. Oxford University Press, Oxford
Leland PR (2015) Rational responsibility and the assertoric character of bald-faced lies. Analysis. https://doi.org/10.1093/analys/anv080
List C (2005) Group knowledge and group rationality: a judgment aggregation perspective. Episteme 2(1):25–38. https://doi.org/10.3366/epi.2005.2.1.25
List C, Pettit P (2011) Group agency. Oxford University Press, Oxford. https://doi.org/10.1093/acprof:oso/9780199591565.001.0001
Ludwig K (2016) From individual to plural agency: collective action, vol 1. Oxford University Press, Oxford, New York
Ludwig K (2017) From plural to institutional agency, first edition. Collective action, vol 2. Oxford University Press, Oxford, New York
Ludwig K (2019) What Are Group Speech Acts? Lang Commun. https://doi.org/10.1016/j.langcom.2019.04.004
Ludwig K (2020) Proxy assertion. In: Goldberg S (ed) The Oxford handbook of assertion, pp. 305–326. https://doi.org/10.1093/oxfordhb/9780190675233.013.13
MacFarlane J (2011) What is assertion? In: Brown J, Cappelen H (eds) Assertion: new philosophical essays. Oxford University Press, Oxford, pp 79–96
Mahon JE (2007) A definition of deceiving. Int J Appl Philos 21(2):181–194
Mahon JE (2008) Two definitions of lying. Int J Appl Philos 22(2):211–230
Mahon JE (2015) The definition of lying and deception. In: Zalta EN (ed) Stanford encyclopedia of philosophy. https://plato.stanford.edu/archives/win2016/entries/lyingdefinition/
Mamigonian MA (2015) Academic denial of the Armenian genocide in American scholarship: denialism as manufactured controversy. Genocide Stud Int 9(1):61–82
Marsili N (2014) Lying as a scalar phenomenon. In: Cantarini S, Abraham W, Leiss E (eds) Certainty-uncertainty—and the attitudinal space in between. John Benjamins Publishing Company, Amsterdam, pp 153–173
Marsili N (2015) Normative accounts of assertion: from Peirce to Williamson, and back again. Rivista Italiana Di Filosofia Del Linguaggio. https://doi.org/10.4396/26SFL2014
Marsili N (2016) Lying by promising. Int Rev Pragmat 8(2):271–313. https://doi.org/10.1163/18773109-00802005
Marsili N (2017) You don’t say! Lying, asserting and insincerity. PhD Dissertation, University of Sheffield. https://etheses.whiterose.ac.uk/19068/
Marsili N (2018a) Truth and assertion: rules versus aims. Analysis 78(4):638–648. https://doi.org/10.1093/analys/any008
Marsili N (2018b) Lying and certainty. In: Meibauer J (ed) The Oxford handbook of lying. Oxford University Press, Oxford, pp 169–182. https://doi.org/10.1093/oxfordhb/9780198736578.013.12
Marsili N (2019) The norm of assertion: a “constitutive” rule? Inquiry. https://doi.org/10.1080/0020174X.2019.1667868
Marsili N (2020a) Lies, common ground and performative utterances. Erkenntnis. https://doi.org/10.1007/s10670-020-00368-4
Marsili N (2020b) The definition of assertion. SSRN. https://doi.org/10.2139/ssrn.3711804
Marsili N (2021a) Lying: knowledge or belief? Philos Stud. https://doi.org/10.1007/s11098-021-01713-1
Marsili N (2021b) Lying, speech acts, and commitment. Synthese 199:3245–3269. https://doi.org/10.1007/s11229-020-02933-4
Marsili N (2021c) Review of Michaelson E, Stokke A (eds), Lying: language, knowledge, ethics, and politics (Oxford: Oxford University Press, 2018). Utilitas 33(4):502–505. https://doi.org/10.1017/S0953820821000182
Marsili N (2022a) Immoral lies and partial beliefs. Inquiry 65(1):117–127. https://doi.org/10.1080/0020174X.2019.1667865
Marsili N (2022b) Fictions that purport to tell the truth. Philos Q. https://doi.org/10.1093/pq/pqac035
Marsili N, Green M (2021) Assertion: a (partly) social speech act. J Pragmat 181(August):17–28. https://doi.org/10.1016/j.pragma.2021.03.016
Marsili N, Löhr G (forthcoming) Saying, commitment, and the lying–misleading distinction. J Philos
Marušić B (2012) Belief and difficult action. Philosophers’ Imprint 12(18):1–30
Marušić B (2015) Evidence & agency: norms of belief for promising and resolving. Oxford University Press, Oxford
Meibauer J (2005) Lying and falsely implicating. J Pragmat 37(9):1373–1399. https://doi.org/10.1016/j.pragma.2004.12.007
Meibauer J (2011) On lying: intentionality, implicature, and imprecision. Intercult Pragmat 2(8):277–292. https://doi.org/10.1515/IPRG.2011.013
Meibauer J (2014) Lying at the semantics-pragmatics interface. Lying at the semantics-pragmatics interface. De Gruyter, Berlin, Boston. https://doi.org/10.1515/9781614510840
Meibauer J (2018) The linguistics of lying. Annu Rev Linguist 4(1):357–375. https://doi.org/10.1146/annurev-linguistics-011817-045634
Meijers A (1999) Belief, cognition, and the will. Tilburg University Press, Tilburg
Meijers A (2002) Collective agents and cognitive attitudes. ProtoSociology 16(April):70–85. https://doi.org/10.5840/protosociology20021621
Meijers A (2007) Collective speech acts. In: Tsohatzidis SL (ed) Intentional acts and institutional facts: essays on John Searle’s social ontology. Springer, Dordrecht, pp 93–110. https://doi.org/10.1007/978-1-4020-6104-2_4
Milić I (2017) Against selfless assertions. Philos Stud 174(9):2277–2295. https://doi.org/10.1007/s11098-016-0798-9
Montminy M (2013) The single norm of assertion. Perspectives on pragmatics and philosophy. Springer, Cham, pp 35–52
Morris B, Zeevi D (2019) The thirty-year genocide: Turkey’s destruction of its Christian minorities, 1894–1924. Harvard University Press, Cambridge, MA
O’Brien D (2007) Testimony and lies. Philos Q 57(227):225–238. https://doi.org/10.1111/j.1467-9213.2007.481.x
Pagin P (2004) Is assertion social? J Pragmat 36(5):833–859. https://doi.org/10.1016/j.pragma.2003.10.004
Pagin P (2009) Assertion not possibly social. J Pragmat 41(12):2563–2567. https://doi.org/10.1016/j.pragma.2008.12.014
Pagin P (2014) Assertion. In: Zalta EN (ed) The Stanford encyclopedia of philosophy (Winter 2014 edition)
Pagin P, Marsili N (2021) Assertion. In: Zalta EN (ed) The Stanford encyclopedia of philosophy (Winter 2021 edition). https://plato.stanford.edu/archives/win2021/entries/assertion/
Pegan P (2009) Why assertion may yet be social. J Pragmat 41(12):2557–2562. https://doi.org/10.1016/j.pragma.2008.12.009
Peirce CS (n.d.) (CP) Collected papers of Charles Sanders Peirce. Hartshorne C, Weiss P, Burks AW (eds). Harvard University Press, Cambridge, MA
Peirce CS (n.d.) (MS) The Charles S. Peirce papers. Harvard University Library, Cambridge, MA
Pepp J (2018) Truth serum, liar serum, and some problems about saying what you think is false. Lying: language, knowledge, ethics, politics. Oxford University Press, Oxford
Pettit P (2003) Groups with minds of their own. In: Schmitt F (ed) Socialising metaphysics: the nature of social reality. Rowman & Littlefield, London, pp 167–193. https://doi.org/10.1111/josp.12295
Quinton A (1976) Social objects. Proc Aristot Soc 76:1–27
Reins LM, Wiegmann A (2021) Is lying bound to commitment? Empirically investigating deceptive presuppositions, implicatures, and actions. Cognit Sci. https://doi.org/10.1111/cogs.12936
Rescorla M (2009) Assertion and its constitutive norms. Philos Phenomenol Res. https://doi.org/10.1111/j.1933-1592.2009.00268.x
Rutschmann R, Wiegmann A (2017) No need for an intention to deceive? Challenging the traditional definition of lying. Philos Psychol. https://doi.org/10.1080/09515089.2016.1277382
Saul J (2000) Did Clinton say something false? Analysis 60(3):255–257
Saul J (2012) Lying, misleading, and the role of what is said. Oxford University Press, Oxford
Schmid HB, Sirtes D, Weber M (2013) Collective epistemology. De Gruyter. https://doi.org/10.1515/9783110322583
Schweikard DP, Schmid HB (2021) Collective intentionality. In: Zalta EN (ed) The Stanford encyclopedia of philosophy. https://plato.stanford.edu/archives/sum2017/entries/collective-responsibility/
Searle JR (1969) Speech acts: an essay in the philosophy of language. Cambridge University Press, Cambridge
Searle JR (1975) The logical status of fictional discourse. New Literary Hist 6(2):319–332
Searle JR, Vanderveken D (1985) Foundations of illocutionary logic. Cambridge University Press, Cambridge
Shapiro L (2018) Commitment accounts of assertion. In: Goldberg S (ed) The Oxford handbook of assertion, Oxford University Press, pp 73–97. https://doi.org/10.1093/oxfordhb/9780190675233.013.3
Siegler FA (1966) Lying. Am Philos Q 3(2):128–136
Sneddon A (2020) Alternative motivation and lies. Analysis. https://doi.org/10.1093/analys/anaa027
Sorensen R (2007) Bald-faced lies! Lying without the intent to deceive. Pac Philos Q 88:251–264. https://doi.org/10.1111/j.1468-0114.2007.00290.x
Sorensen R (2010) Knowledge-lies. Analysis 70(4):608–615. https://doi.org/10.1093/analys/anq072
Sorensen R (2011) What lies behind misspeaking. Am Philos Q 48(4):399–410
Sorensen R (2018) Lying to mindless machines. In: Michaelson E, Stokke A (eds) Lying: language, knowledge, ethics, politics. Oxford University Press, Oxford, pp 1–23. https://doi.org/10.1093/oso/9780198743965.001.0001
Sorensen R (2022) Lie for me: the intent to deceive fails to scale up. Synthese 200(2):130. https://doi.org/10.1007/s11229-022-03603-3
Staffel J (2018) Knowledge-lies and group lies. In: Meibauer J (ed) The Oxford handbook of lying, Oxford University Press
Stainton RJ (2016) Full-on stating. Mind Lang 31(4):395–413. https://doi.org/10.1111/mila.12112
Stokke A (2013a) Lying and asserting. J Philos 110(1):33–60. https://doi.org/10.5840/jphil2013110144
Stokke A (2013b) Lying, deceiving, and misleading. Philos Compass 8(4):348–359. https://doi.org/10.1111/phc3.12022
Stokke A (2014) Insincerity. Noûs 48(3):496–520. https://doi.org/10.1111/nous.12001
Stokke A (2018) Lying and insincerity. Oxford University Press, Oxford
Strudler A (2009) The distinctive wrong in lying. Ethical Theory Moral Pract 13(2):171–179. https://doi.org/10.1007/s10677-009-9194-2
Tanesini A (2016) “Calm down, dear”: intellectual arrogance, silencing and ignorance. Proc Aristot Soc Suppl 90(1):71–92. https://doi.org/10.1093/arisup/akw011
Tanesini A (2020) Silencing and assertion. In: Goldberg S (ed) The Oxford handbook of assertion. Oxford University Press, Oxford. https://doi.org/10.1093/oxfordhb/9780190675233.013.31
Tollefsen DP (2003) Rejecting rejectionism. ProtoSociology 18(19):389–405. https://doi.org/10.5840/protosociology200318/1916
Tollefsen DP (2007) Group testimony. Soc Epistemol 21(3):299–311. https://doi.org/10.1080/02691720701674163
Tollefsen DP (2009) Wikipedia and the epistemology of testimony. Episteme 6(1):8–24. https://doi.org/10.3366/e1742360008000518
Tollefsen DP (2020) Can groups assert that P? In: Goldberg SC (ed) The Oxford handbook of assertion. Oxford University Press, Oxford, pp 326–344. https://doi.org/10.1093/oxfordhb/9780190675233.013.34
Townsend L (2018) Group assertion and group silencing. Lang Commun 2014:1–26
Townsend L (2020) The epistemology of collective testimony. J Soc Ontol 6(2):187–210. https://doi.org/10.1515/jso-2019-0044
Trpin B, Dobrosovestnova A, Sebastian JG (2021) A computer simulation study of graded lies and trust dynamics. Synthese 199(1):991–1018
Tuomela R (1992) Group beliefs. Synthese 91(3):285–318. https://doi.org/10.1007/BF00413570
Tuomela R (1995) The importance of Us: a philosophical study of basic social notions, vol 108. Stanford University Press, Stanford
Tuomela R (2013) Social ontology: collective intentionality and group agents. Oxford University Press, New York
Turri J (2014) Selfless assertions: some empirical evidence. Synthese. https://doi.org/10.1007/s11229-014-0621-0
Viebahn E (2017) Non-literal lies. Erkenntnis. https://doi.org/10.1007/s10670-017-9880-8
Viebahn E (2021) The lying/misleading distinction: a commitment-based approach. J Philos CXVIII(6)
Webber J (2013) Liar! Analysis 73(4):651–659. https://doi.org/10.1093/analys/ant081
Whyte J (2013) Review of “Lying, Misleading & What Is Said”, by J. M. Saul. Philos Q 64(254):209–210
Williams BAO (2002) Truth and truthfulness: an essay in genealogy. Princeton University Press, Princeton
Wray KB (2001) Collective belief and acceptance. Synthese 129(3):319–333. https://doi.org/10.1023/A:1013148515033
Wray KB (2003) What really divides Gilbert and the rejectionists? ProtoSociology 18/19:363–376. https://doi.org/10.5840/protosociology200318/1914
Wright C (1992) Truth and objectivity. Harvard University Press, Cambridge
Funding
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. Funding was provided by Ministerio de Ciencia e Innovación (Grant No. PID2020-119588GB-I00).
Ethics declarations
Conflict of interest
The author declares that they have no conflict of interest.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Marsili, N. Group Assertions and Group Lies. Topoi 42, 369–384 (2023). https://doi.org/10.1007/s11245-022-09875-1