Abstract
Christoph Jäger (Erkenntnis 61:187–201, 2004) has argued that Dretske’s (Knowledge and the Flow of Information, MIT Press, Cambridge, 1981) information-based account of knowledge is committed to both knowledge and information closure under known entailment. In a reply to Jäger, however, Dretske (Erkenntnis 64:409–413, 2006) defended his view by appealing to a discrepancy between the relation of information and the relation of logical implication. This paper shares Jäger’s criticism that Dretske’s externalist notion of information implies closure, but provides an analysis on different grounds. By distinguishing two perspectives, the mathematical and the epistemological, I present, in the former, a notion of logical implication that is compatible with the notion of information in the mathematical theory of information, and I show, in the latter, how Dretske’s logical reading of the closure principle is incompatible with his information-theoretic epistemological framework.
Notes
D’Alfonso (2014) offered an interesting modal analysis of Dretske’s ITEF. A more modest approach, however, suffices for the purposes of this paper.
As Lewis (1976) showed with his triviality results, if one equates the probability of a conditional with a conditional probability, then the probabilistic language becomes trivial (a posteriori probabilities are reduced to a priori probabilities). The proposed reading of the conditional is not exposed to Lewis’ critique, since the probability of the conditional is not equivalent to the conditional probability. Furthermore, for a conditional \(p \supset q,\) Lewis’ proof requires that both \(Pr(P \cap Q)\) and \(Pr(P \cap \overline{Q})\) be positive, but according to the proposed reading, in the case of PIC, \(Pr(P \cap \overline{Q}) = 0.\)
If one wants to preserve the idea that \(0\leqslant Pr(p \supset q)\leqslant 1,\) which means that \(P \subseteq Q\) to some degree, one could use, for instance, a richer probabilistic interpretation, such as \(Pr(p \supset q) = |P\cap Q |/|P|\) (provided that \(P \ne \varnothing\)), or a graded semantics for the membership relation as in fuzzy set theory. A fuzzy set A on a universe X is defined by a membership function \(\mu _{A}: X \rightarrow [0,1],\) and the containment relation is defined by \(A \subseteq B\) iff \(\mu _{A}(x) \le \mu _{B}(x)\) for all \(x \in X\) (Klir & Yuan, 1995; Zadeh, 1965). For a measure of the degree of subsethood, see formula (1.19) in Klir and Yuan (1995). Such refinements are not necessary for the sake of the analysis.
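The two graded readings mentioned in this note can be made concrete computationally. The following is a minimal sketch (my illustration, not from the paper), assuming finite sets of truth conditions for the crisp case, and the standard min-based subsethood degree \(\sum _x \min (\mu _A(x), \mu _B(x)) / \sum _x \mu _A(x)\) for the fuzzy case:

```python
# Sketch of the two graded readings of P ⊆ Q discussed in the note.

def pr_conditional(P, Q):
    """Crisp reading: Pr(p ⊃ q) = |P ∩ Q| / |P| over finite sets
    of truth conditions (requires P to be non-empty)."""
    if not P:
        raise ValueError("P must be non-empty")
    return len(P & Q) / len(P)

def subsethood(mu_a, mu_b):
    """Fuzzy reading: degree to which A is contained in B, where
    mu_a and mu_b map elements of the universe to [0, 1]."""
    num = sum(min(mu_a[x], mu_b.get(x, 0.0)) for x in mu_a)
    den = sum(mu_a.values())
    return num / den if den else 1.0

# Full containment yields 1; partial overlap yields a degree.
P, Q = {1, 2, 3, 4}, {1, 2, 3, 4, 5}
print(pr_conditional(P, Q))          # 1.0, since P ⊆ Q
print(pr_conditional({1, 2, 6}, Q))  # 2/3, partial containment
```

Both functions return 1 exactly when containment is full, which is the case the note's PIC reading singles out; intermediate values measure how far short of \(P \subseteq Q\) the sets fall.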
Since this concerns sets of truth conditions, there is no need to refer to background knowledge explicitly.
D’Alfonso (2014) has also underlined that consequence.
When Dretske refers to his own theory as a semantic theory of information, he means a theory of information for which the quantity of information under consideration is in correlation with the relevant de re content (Dretske, 1981, p. 66). This is essentially a semantic constraint imposed upon the mathematical theory of information, in which part of the structure of the informational content of the signal is viewed indexically. There is no measure of informational content per se involved, and Dretske has explicitly underlined that information and meaning are two different concepts (Dretske, 1981, p. 72). So Dretske’s theory of information is to be distinguished from other semantic theories of information such as those of Carnap and Bar-Hillel (1952) and Floridi (2011). It would have been interesting to have Dretske’s views on the semantic theory proposed by Carnap and Bar-Hillel (1952) and Bar-Hillel and Carnap (1953), but, apart from a footnote simply stating the existence of their work, he remained silent on this matter both in 1981 and in 2006. In any case, let us underline that the semantic theory of Carnap and Bar-Hillel relies on an adequacy condition that is incompatible with our reading in Section 1. This adequacy condition stipulates that the informational content of a statement \(\phi\) includes the informational content of a statement \(\psi\) if \(\phi\) logically implies \(\psi\) (Bar-Hillel & Carnap, 1953, p. 149).
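The adequacy condition can be pictured set-theoretically. A minimal sketch (my illustration, not from the paper), taking the content of a statement to be the set of possible states it excludes, in the spirit of Carnap and Bar-Hillel:

```python
# If φ logically implies ψ, every state excluded by ψ is also
# excluded by φ, so Cont(ψ) ⊆ Cont(φ): content inclusion under
# entailment, which is the adequacy condition the note mentions.

from itertools import product

# All valuations of two atoms (p, q).
WORLDS = list(product([False, True], repeat=2))

def content(sentence):
    """States excluded by the sentence (sentence: world -> bool)."""
    return {w for w in WORLDS if not sentence(w)}

p_and_q = lambda w: w[0] and w[1]
p = lambda w: w[0]

# p ∧ q implies p, so Cont(p) ⊆ Cont(p ∧ q).
assert content(p) <= content(p_and_q)
print(sorted(content(p_and_q) - content(p)))  # the extra state φ excludes
```

The inclusion runs from the weaker statement's content into the stronger statement's content, which is exactly the direction that conflicts with the reading of Section 1.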
Floridi (2011) exploits the notion of relevance in a somewhat different information-theoretic framework in what he calls a strongly semantic approach to information. However, Floridi’s position is also exposed to a logical difficulty, as Wheeler (2015) underlines. I am indebted to an anonymous reviewer for drawing this article to my attention.
Parts of the analyses of Jäger, Baumann, Shackel, and Luper depend on this condition. Since Dretske has clarified his stance with respect to the difference between a raising-probability signal and an information-carrying signal, there is no need to discuss this point at length.
Dretske’s Xerox principle (Dretske, 1981, p. 57) seems to lead to such a relation between entailment and information: “It is easy to suppose, then, that this principle [Xerox principle] commits me to closure. All one need assume is that if A entails B, A carries the information that B. That assumption, however, is false” (Dretske, 2006, p. 410).
Dretske writes: “a channel condition generates no information for the signal [...] to carry” (Dretske, 1981, p. 115).
For Dretske (1981, p. 71), the information in (15) and (16) is analytically nested, and in (17) the information is nomically nested.
Following Dretske’s (2005) terminology, the former conditionals are lightweight implications, and the latter are heavyweight implications.
In calling these conditionals ‘skeptical’, the aim is to indicate that they are the conditionals the skeptic typically uses against closure.
If one understands not being a painted mule as being definitionally implied by being a zebra (or informationally nested), then the conditional is a truth-functional one, but this is not the understanding the skeptic has of (18). The skeptic wants to challenge the reliability of perceptual knowledge, and this is a channel condition in Dretske’s sense.
If there were some way of conceiving any informational dependence in a skeptical conditional by some truth-conditional means, then, in the case of (19) for instance, the probability of the reality of the past would increase as the number of past events increased. There would be an informational dependence between the reality of the past as a whole and each and every event that has occurred. This would be similar to accepting the (problematic) idea that each natural number one could think of or write down would increase the probability of the existence of the set of natural numbers. Another way to look at this is to consider that informational dependence can be made explicit through conditional probability (as in Bayesian networks). In that regard, even though it makes sense to evaluate the probability of a past event given the reality of the past, it is dubious, to say the least, to evaluate the probability of the reality of the past given a past event. On the contrary, when informational dependence can be captured truth-conditionally, the probabilistic evaluation becomes straightforward; all other things being equal, the probability of Joe being at home given that Joe is either at home or at the office is \(\frac{1}{2},\) and the probability of Joe being either at home or at the office given that Joe is at home is 1.
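The closing arithmetic of this note can be checked directly. A small sketch (my illustration, not from the paper), reading "all other things being equal" as equiprobable possibilities for Joe's location:

```python
# Conditional probability over equiprobable worlds: the truth-conditional
# case from the note, where Joe's possible locations are equally likely.

from fractions import Fraction

WORLDS = ["home", "office", "elsewhere"]  # assumed equiprobable

def pr(event, given=None):
    """Pr(event | given) by counting worlds; given=None means no condition."""
    base = [w for w in WORLDS if given is None or given(w)]
    return Fraction(sum(1 for w in base if event(w)), len(base))

home = lambda w: w == "home"
home_or_office = lambda w: w in ("home", "office")

print(pr(home, given=home_or_office))  # 1/2
print(pr(home_or_office, given=home))  # 1
```

Both directions of conditioning are well defined here precisely because the dependence is truth-conditional, in contrast with the past-events case, where only one direction of conditioning makes clear sense.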
References
Adams, E. W. (1975). The logic of conditionals. An application of probability to deductive logic (Vol. 86). Synthese Library. D. Reidel. https://doi.org/10.1007/978-94-015-7622-2_1
Ash, R. B. (1980). Information theory. Dover.
Bar-Hillel, Y., & Carnap, R. (1953). Semantic information. The British Journal for the Philosophy of Science, 4, 147–157.
Baumann, P. (2006). Information, closure, and knowledge: On Jäger’s objection to Dretske. Erkenntnis, 64, 403–408. https://doi.org/10.1007/s10670-005-6193-0
Bennett, J. (2003). A philosophical guide to conditionals. Oxford University Press. https://doi.org/10.1093/0199258872.001.0001
Carnap, R., & Bar-Hillel, Y. (1952). An outline of a semantic theory of information. Technical report. Massachusetts Institute of Technology.
Cover, T. M., & Thomas, J. A. (1991). Elements of information theory. Wiley.
D’Alfonso, S. (2014). The logic of knowledge and the flow of information. Minds and Machines, 24, 307–325. https://doi.org/10.1007/s11023-013-9310-x
Dretske, F. I. (1970). Epistemic operators. The Journal of Philosophy, 67, 1007–1023. https://doi.org/10.2307/2024710
Dretske, F. I. (1981). Knowledge and the flow of information. MIT Press.
Dretske, F. I. (2006). Information and closure. Erkenntnis, 64, 409–413. https://doi.org/10.1007/s10670-005-5815-x
Dretske, F. I., & Hawthorne, J. (2005). Is knowledge closed under known entailment? In M. Steup & E. Sosa (Eds.), Contemporary debates in epistemology (pp. 13–46). Blackwell.
Floridi, L. (2011). The philosophy of information. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199232383.001.0001
Gray, R. M. (2011). Entropy and information theory (2nd ed.). Springer. https://doi.org/10.1007/978-1-4419-7970-4
Jackson, F. (1991). Classifying conditionals II. Analysis, 51, 137–143. https://doi.org/10.1093/analys/51.3.137
Jäger, C. (2004). Skepticism, information, and closure: Dretske’s theory of knowledge. Erkenntnis, 61, 187–201. https://doi.org/10.1007/s10670-004-9283-5
Kåhre, J. (2002). The mathematical theory of information. Springer. https://doi.org/10.1007/978-1-4615-0975-2
Klir, G. J., & Yuan, B. (1995). Fuzzy sets and fuzzy logic. Theory and applications. Prentice-Hall.
Lewis, D. (1976). Probabilities of conditionals and conditional probabilities. The Philosophical Review, 85, 297–315. https://doi.org/10.1007/978-94-009-9117-0_6
Makinson, D. (2005). Bridges from classical to nonmonotonic logic. King’s College.
Pierce, J. R. (1980). An introduction to information theory: Symbols, signals and noise (2nd ed.). Dover.
Reza, F. M. (1961). An introduction to information theory. McGraw-Hill.
Shackel, N. (2006). Shutting Dretske’s door. Erkenntnis, 64, 393–401. https://doi.org/10.1007/s10670-006-9002-5
Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423, 623–656.
Shannon, C. E., & Weaver, W. (1964). The mathematical theory of communication. The University of Illinois Press.
Wheeler, G. (2015). Is there a logic of information? Journal of Experimental and Theoretical Artificial Intelligence, 27, 95–98. https://doi.org/10.1080/0952813X.2014.941680
Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8, 338–353. https://doi.org/10.1016/S0019-9958(65)90241-X
Acknowledgements
I want to thank two anonymous reviewers whose comments and suggestions helped improve and clarify this paper.
Cite this article
Bouchard, Y. Dretske and Informational Closure. Minds & Machines 32, 311–322 (2022). https://doi.org/10.1007/s11023-021-09587-2