
Dretske and Informational Closure

  • General Article
Minds and Machines

Abstract

Christoph Jäger (Erkenntnis 61:187–201, 2004) has argued that Dretske’s (Knowledge and the Flow of Information, MIT Press, Cambridge, 1981) information-based account of knowledge is committed to closure under known entailment for both knowledge and information. In a reply to Jäger, however, Dretske (Erkenntnis 64:409–413, 2006) defended his view by appealing to a discrepancy between the relation of information and the relation of logical implication. This paper shares Jäger’s criticism that Dretske’s externalist notion of information implies closure, but provides an analysis on different grounds. By means of a distinction between two perspectives, the mathematical and the epistemological, I present, in the former, a notion of logical implication that is compatible with the notion of information in the mathematical theory of information, and I show how, in the latter, Dretske’s logical reading of the closure principle is incompatible with his information-theoretic epistemological framework.


Notes

  1. This refers to Shannon (1948), Shannon and Weaver (1964), Reza (1961), Pierce (1980), Ash (1980), Cover (1991), Kåhre (2002), and Gray (2011).

  2. D’Alfonso (2014) has offered an interesting modal analysis of Dretske’s ITEF; the more modest approach adopted here, however, suffices for the purposes of this paper.

  3. As Lewis (1976) showed with his triviality results, if one equates the probability of a conditional with a conditional probability, then the probabilistic language becomes trivial (a posteriori probabilities are reduced to a priori probabilities). The proposed reading of the conditional is not exposed to Lewis’ critique, since the probability of the conditional is not equivalent to the conditional probability. Furthermore, for a conditional \(Pr(p \supset q),\) Lewis’ proof requires that both \(Pr(P \cap Q)\) and \(Pr(P \cap \overline{Q})\) be positive, but according to the proposed reading, in the case of PIC, \(Pr(P \cap \overline{Q}) = 0.\)
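The gap that note 3 relies on can be made concrete with a toy computation (not from the paper; all names illustrative): over four equiprobable truth-value assignments to p and q, the probability of the material conditional \(Pr(p \supset q)\) comes apart from the conditional probability \(Pr(q \mid p)\), which is the distinction Lewis’s triviality results turn on.

```python
# Four equiprobable worlds, one per truth-value assignment to (p, q).
worlds = [(p, q) for p in (False, True) for q in (False, True)]

def pr(event):
    """Probability of an event under the uniform distribution on worlds."""
    return sum(1 for w in worlds if event(w)) / len(worlds)

# Pr(p ⊃ q): the material conditional is true except when p and not-q.
pr_material = pr(lambda w: (not w[0]) or w[1])

# Pr(q | p): conditional probability, Pr(p and q) / Pr(p).
pr_q_given_p = pr(lambda w: w[0] and w[1]) / pr(lambda w: w[0])

print(pr_material, pr_q_given_p)  # 0.75 0.5
```

Since 0.75 ≠ 0.5, equating the two, as Lewis shows, cannot be done without trivializing the probabilistic language.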

  4. If one wants to preserve the idea that \(0\leqslant Pr(p \supset q)\leqslant 1,\) which means that \(P \subseteq Q\) to some degree, one could use, for instance, a richer probabilistic interpretation, such as \(Pr(p \supset q) = |P\cap Q |/|P|\) (provided that \(P \ne \varnothing\)), or a graded semantics for the membership relation as in fuzzy set theory. A fuzzy set A is defined by a membership function \(\mu\) such that, for all \(x \in A,\;\mu _{A}: x \mapsto [0,1],\) and the containment relation is defined by \(A \subseteq B\) iff \(\mu _{A}(x) \le \mu _{B}(x)\) for all \(x \in A\) (Klir & Yuan, 1995; Zadeh, 1965). For a measure of the degree of subsethood, see formula (1.19) in Klir and Yuan (1995). Such refinements are not necessary for the sake of the analysis.
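The richer probabilistic reading in note 4 can be sketched in a few lines (illustrative names only): \(Pr(p \supset q) = |P \cap Q|/|P|\), the degree to which P is contained in Q, computed over finite sets of truth conditions.

```python
def degree_of_implication(P, Q):
    """Degree to which P is contained in Q: |P ∩ Q| / |P|, P non-empty."""
    if not P:
        raise ValueError("P must be non-empty")
    return len(P & Q) / len(P)

# Full containment (P ⊆ Q) is the limiting case with degree 1.
print(degree_of_implication({1, 2}, {1, 2, 3}))        # 1.0

# Partial containment yields an intermediate degree in [0, 1].
print(degree_of_implication({1, 2, 3, 4}, {1, 2, 3}))  # 0.75
```

As the note observes, such refinements are not needed for the analysis; the sketch only shows that the graded reading is well defined.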

  5. Since this concerns sets of truth conditions, there is no need to refer to background knowledge explicitly.

  6. D’Alfonso (2014) has also underlined that consequence.

  7. When Dretske refers to his own theory as a semantic theory of information, he means a theory of information for which the quantity of information under consideration is correlated with the relevant de re content (Dretske, 1981, p. 66). This is essentially a semantic constraint imposed upon the mathematical theory of information, in which part of the structure of the informational content of the signal is viewed indexically. There is no measure of informational content per se involved, and Dretske has explicitly underlined that information and meaning are two different concepts (Dretske, 1981, p. 72). Dretske’s theory of information is therefore to be distinguished from other semantic theories of information, such as those of Carnap (1952) and Floridi (2011). It would have been interesting to have Dretske’s views on the semantic theory proposed by Carnap and Bar-Hillel (1952) and Bar-Hillel and Carnap (1953), but, apart from a footnote simply noting the existence of their work, he remained silent on this matter in both 1981 and 2006. In any case, let us underline that the semantic theory of Carnap and Bar-Hillel relies on an adequacy condition that is incompatible with our reading in Section 1. This adequacy condition stipulates that the informational content of a statement \(\phi\) includes the informational content of a statement \(\psi\) if \(\phi\) logically implies \(\psi\) (Bar-Hillel & Carnap, 1953, p. 149).

  8. Floridi (2011) exploits the notion of relevance in a somewhat different information-theoretic framework in what he calls a strongly semantic approach to information. However, Floridi’s position is also exposed to a logical difficulty, as Wheeler (2015) underlines. I am indebted to an anonymous reviewer for drawing this article to my attention.

  9. Parts of the analyses of Jäger, Baumann, Shackel, and Luper depend on this condition. Since Dretske has clarified his stance with respect to the difference between a probability-raising signal and an information-carrying signal, there is no need to discuss this point at length.

  10. Dretske’s Xerox principle (Dretske, 1981, p. 57) seems to lead to such a relation between entailment and information: “It is easy to suppose, then, that this principle [Xerox principle] commits me to closure. All one need assume is that if A entails B, A carries the information that B. That assumption, however, is false” (Dretske, 2006, p. 410).

  11. Dretske writes: “a channel condition generates no information for the signal [...] to carry” (Dretske, 1981, p. 115).

  12. For Dretske (1981, p. 71), the information in (15) and (16) is analytically nested, and in (17) the information is nomically nested.

  13. Following Dretske’s (2005) terminology, the former conditionals are lightweight implications, and the latter are heavyweight implications.

  14. In calling these conditionals ‘skeptical’, the aim is to indicate that they are the conditionals the skeptic typically uses against closure.

  15. If one understands not being a painted mule as being definitionally implied by being a zebra (or informationally nested), then the conditional is a truth-functional one, but this is not the understanding the skeptic has of (18). The skeptic wants to challenge the reliability of perceptual knowledge, and this is a channel condition in Dretske’s sense.

  16. A monotonic consequence relation does not allow for informational gain (Kåhre, 2002; Makinson, 2005).

  17. If there were some way of conceiving any informational dependence in a skeptical conditional by some truth-conditional means, then, in the case of (19) for instance, the probability of the reality of the past would increase as the number of past events increased. There would be an informational dependence between the reality of the past as a whole and each and every event that has occurred. This would be similar to accepting the (problematic) idea that each natural number one could think of or write down would increase the probability of the existence of the set of natural numbers. Another way to look at this is to consider that informational dependence can be made explicit through conditional probability (as in Bayesian networks). In that regard, even though it makes sense to evaluate the probability of a past event given the reality of the past, it is dubious, to say the least, to evaluate the probability of the reality of the past given a past event. On the contrary, when informational dependence can be captured truth-conditionally, the probabilistic evaluation becomes straightforward; all other things being equal, the probability of Joe being at home given that Joe is either at home or at the office is \(\frac{1}{2},\) and the probability of Joe being either at home or at the office given that Joe is at home is 1.
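The closing computation of note 17 can be checked directly over a toy equiprobable space of Joe’s possible locations (the third location is an assumption added here so that conditioning is non-trivial).

```python
# Toy sample space: three equiprobable locations for Joe.
worlds = ["home", "office", "elsewhere"]

def pr(event, given=lambda w: True):
    """Conditional probability under the uniform distribution on worlds."""
    relevant = [w for w in worlds if given(w)]
    return sum(1 for w in relevant if event(w)) / len(relevant)

at_home = lambda w: w == "home"
home_or_office = lambda w: w in ("home", "office")

print(pr(at_home, given=home_or_office))  # 0.5: Pr(home | home or office)
print(pr(home_or_office, given=at_home))  # 1.0: Pr(home or office | home)
```

Both directions of conditioning are well defined here precisely because the dependence is truth-conditional, which is the contrast the note draws with the skeptical conditionals.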


Acknowledgements

I want to thank two anonymous reviewers whose comments and suggestions helped improve and clarify this paper.

Author information


Corresponding author

Correspondence to Yves Bouchard.



About this article


Cite this article

Bouchard, Y. Dretske and Informational Closure. Minds & Machines 32, 311–322 (2022). https://doi.org/10.1007/s11023-021-09587-2
