
The Varieties of Disinformation

Part of the Synthese Library book series (SYLI, volume 358)

Abstract

Intentionally misleading information (a.k.a. disinformation) is ubiquitous and can be extremely dangerous. Emotional, financial, and even physical harm can easily result if people are misled by deceptive advertising, government propaganda, doctored photographs, forged documents, fake maps, internet frauds, fake websites, and manipulated Wikipedia entries. In order to deal with this serious threat to Information Quality, we need to improve our understanding of the nature and scope of disinformation. One way that work in philosophy can help with this task is by identifying and classifying the various types of disinformation, such as lies, spin, and even bullshit. If we are aware of the various ways that people might try to mislead us, we will be in a better position to avoid being duped by intentionally misleading information. Toward this end, this essay surveys and extends classification schemes that have been proposed by several noted philosophers—including Saint Augustine, Roderick Chisholm, and Paul Grice.

Keywords

  • True Belief
  • Information Quality
  • Representational Content
  • Misleading Information
  • Epistemic Goal



Notes

  1.

    This problem arises with any new information technologies. For instance, when new printing technology first made books widely available, there was often a question of whether or not you held in your hands the authoritative version of a given text (see Johns 1998, 30–31). Techniques eventually developed for assuring ourselves of the authority and reliability of books. But such techniques are not always immediately available with new information technologies.

  2.

    Some philosophers (e.g., Floridi 2011, 80) claim that representational content only counts as information if it is true. This paper will use the term information more broadly, to refer to representational content that is false as well as representational content that is true (cf. Fallis 2011, 202–203).

  3.

    In “The Adventure of the Norwood Builder,” Oldacre uses a little of his own blood, and McFarlane’s thumbprint from a wax seal on an envelope, to place a bloody thumbprint on the wall.

  4.

    Many philosophers (e.g., Chisholm and Feehan 1977, 152; Fallis 2009, 34) claim that lies do not have to be false. Also, some philosophers (e.g., Fallis 2009, 34; Carson 2010, 30) claim that lies do not even have to be intended to mislead. However, if there are such lies, they would clearly not count as disinformation. So, they can safely be set aside for purposes of this paper.

  5.

    Floridi’s (2011, 260) analysis of disinformation is very similar to Fetzer’s. He claims that “misinformation is ‘well-formed and meaningful data (i.e. semantic content) that is false.’ ‘Disinformation’ is simply misinformation purposefully conveyed to mislead the receiver into believing that it is information.”

  6.

    Some people (e.g., O’Neill 2003, §4; Ekman 2009, 28) have a broader notion of lying that counts such statements as lies. But the traditional analysis of lying requires that the statement is believed by the speaker to be false (see Mahon 2008, §1.2).

  7.

    Although Floridi’s analysis does not rule out visual disinformation, it does rule out true disinformation and side effect disinformation.

  8.

    It may be no accident that a piece of information is misleading even if it is not intended to be misleading (see Skyrms 2010, 80; Fallis 2011, 211–212). For instance, false rumors can spread even when everybody passing them along believes what they are saying. But this paper will focus exclusively on intentionally misleading information.

  9.

    Skyrms does not say this explicitly. But it is a consequence of his analysis of deception (see Fallis forthcoming).

  10.

    The fact that accuracy is critical does not mean that the other dimensions are not also critical. For instance, in addition to misleading people by disseminating inaccurate information, governments often keep people in the dark by making accurate information inaccessible.

  11.

    Even if a piece of information is intended to be misleading about the accuracy of the content, it might—unbeknownst to the source—actually be accurate. (As the protagonist of Oscar Wilde’s The Importance of Being Earnest laments when he learns that his name really is Ernest, “it is a terrible thing for a man to find out suddenly that all his life he has been speaking nothing but the truth.”) But since it is not misleading, such accurate information would clearly not count as disinformation.

  12.

    A little bit of controversy actually remains about the genuineness of the map (see Monmonier 1995, 102).

  13.

    The Vinland Map is another example where a piece of information was intended to be misleading about the identity of the source.

  14.

    If the type of person that you impersonate is one that never existed before, Bell and Whaley (1991, 58) call the technique “inventing” rather than “mimicking.”

  15.

    This is different from the feature (announced on April Fools’ Day) that supposedly allowed Gmail users to actually send messages into the past (see Google 2008).

  16.

    There are special cases where the content being accurate might imply that the source believes the content. For instance, the claim that the source believes the content might be part of the content. Alternatively, it might be that this particular source is likely to have a lot of true beliefs on this particular topic. So, if the content is accurate it means that the source is likely to know it.

  17.

    It might be suggested that you can only show someone something if that thing is true. But the sense of showing (or “deliberately and openly letting someone know”) that Grice has in mind is not factive.

  18.

    As we learn in Martin’s (2011) A Dance with Dragons, the Imp is actually alive and well across the Narrow Sea.

  19.

    But this technique can also be used to get someone to believe something true. Regardless of whether p is true or false, you might create fake evidence for p or you might just arrange it so that someone will run across existing evidence for p.

  20.

    In House of Cards, Francis Urquhart regularly tells people something without actually saying it by using his catchphrase, “You might very well think that; I couldn’t possibly comment.”

  21.

    When I say that my friend has been hanging around the Nevada a lot, I am not telling the villain that my friend is there now. In fact, my reply suggests that I do not know for sure where my friend is. However, there are examples of false implicature where the speaker does tell his audience precisely what they ask for. For instance, suppose that François is a Frenchman who does not know how to cook. When someone asks him whether he knows how to cook, François might haughtily reply, “I am French!” In that case, he has told his audience (falsely) that he knows how to cook (see Recanati 2004, 5).

  22.

    The Three Stooges fall victim to a fake treasure map in the 1937 film Cash and Carry. Such disinformation can be quite convincing because of “the alluring believability of cartographic images. Maps have an authority that even scholars are reluctant to question” (Monmonier 1995, 103).

  23.

    Let us assume that it is common knowledge that Cracow and Lemberg are the only two possible destinations. A more common version of this joke uses the Russian cities of Minsk and Pinsk.

  24.

    The obvious reading of Freud’s joke is that it was an unsuccessful double bluff. But it might have been a successful triple bluff. As is suggested by the “battle of wits” between Vizzini and the Dread Pirate Roberts in The Princess Bride, even higher-order bluffs are possible.

  25.

    It is not common knowledge between Leonato and Benedick that Benedick is eavesdropping. But it could be completely open that the person that you want to mislead is listening in. For instance, you might make (or just pretend to make) a phone call in her presence. Since you are not addressing her, you are still not assuring her that what you say is true.

  26.

    As a result, this technique of allowing yourself to be overheard can also be effective at getting someone to believe something true. For instance, Adele Faber and Elaine Mazlish (1980, 205) recommend that, in order to promote self-esteem, you should let your children overhear you saying something positive about them.

  27.

    Violations of Gricean norms of conversation are only misleading if the audience assumes that the speaker is being cooperative. But it is also possible (e.g., with a double bluff) to mislead someone even if she assumes that you are not being cooperative.

  28.

    In addition to misleading them about what you say, you are likely to mislead your audience about your having good evidence for what you say.

  29.

    Athanasius also misled them into thinking that he did not know exactly where Athanasius was. If you provide fewer details than your audience clearly would like, it is legitimate for them to conclude that you do not know any more details (see Grice 1989, 33; Fallis forthcoming, §3.2).

  30.

    This technique of being selective about the information that you provide can also be used to get someone to believe something true. This seems to be what happened during the “Climategate” controversy. In public reports, scientists left out data that might have suggested to people that temperatures were not increasing (see Tierney 2009).

  31.

    Floridi (1996, 510) is also interested in cases where the process is accidentally defective. However, the focus here is on activities that are intentionally misleading.

  32.

    Depending on your ontology of databases, it might be argued that this actually is a case of creating misleading information rather than just a case of removing information. Dooku arguably created a new database that lacks certain information (cf. Renear and Wickett 2009).

  33.

    During the 2012 Presidential election, Bing created a special website for news about the election (see Bing 2012). This website put some of the personalization back in the hands of internet users. It had a “thermometer” that allowed the user to change the list of featured stories by selecting strongly left leaning, moderately left leaning, center, moderately right leaning, or strongly right leaning. This feature probably increased biasing, since left (right) leaning internet users likely ended up selecting left (right) leaning stories. But it could also be used to fight biasing, since epistemologically savvy users could select right leaning stories if they were left leaning (or vice versa).

  34.

    Bell and Whaley actually classify decoying as a way to show the false rather than as a way to hide the real. But while it might involve showing the false (e.g., a bird faking a broken wing), it might not (e.g., a bird with a real broken wing might also attempt to distract predators from her nest). In any event, the ultimate goal of decoying is clearly to hide the real.

  35.

    Since the Germans had to intercept the documents, the Mincemeat ruse is also an example of misleading by pretending to tell someone else. In fact, it might even be an example of a triple bluff. Despite many flaws in the Mincemeat ruse (e.g., the body was much more decomposed than it should have been), the Germans did not notice them and were simply fooled by the documents. But as Gladwell (2010) points out, if the Germans had noticed these flaws …

    maybe they would have found the flaws in Mincemeat a little too obvious, and concluded that the British were trying to deceive Germany into thinking that they were trying to deceive Germany into thinking that Greece and Sardinia were the real targets—in order to mask the fact that Greece and Sardinia were the real targets.

  36.

    I would like to thank Tony Doyle, Phyllis Illari, and Kay Mathiesen for extremely helpful feedback. This research was supported by a Research Professorship from the Social and Behavioral Sciences Research Institute at the University of Arizona.



Correspondence to Don Fallis.


Copyright information

© 2014 Springer International Publishing Switzerland


Fallis, D. (2014). The Varieties of Disinformation. In: Floridi, L., Illari, P. (eds) The Philosophy of Information Quality. Synthese Library, vol 358. Springer, Cham. https://doi.org/10.1007/978-3-319-07121-3_8
