Abstract
This book presents a new foundation for information theory in which information is defined in terms of distinctions, differences, distinguishability, and diversity. The direct measure is logical entropy, the quantitative measure of the distinctions made by a partition. Shannon entropy is a transform or re-quantification of logical entropy for Claude Shannon’s “mathematical theory of communication.” The logical entropy of a partition is interpreted as the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition), so it realizes a dictum of Gian-Carlo Rota: \(\frac{\text{Probability}}{\text{Subsets}} \approx \frac{\text{Information}}{\text{Partitions}}\). Andrei Kolmogorov suggested that information should be defined independently of probability, so logical entropy is first defined in terms of the set of distinctions of a partition, and then a probability measure on that set defines the quantitative version of logical entropy. We give a history of the logical entropy formula, which goes back to Corrado Gini’s 1912 “index of mutability” and has been rediscovered many times.
For in the general we must note, That whatever is capable of a competent Difference, perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations.
(John Wilkins, 1641)
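The abstract’s definition can be made concrete with a short sketch (not from the book; the function names are my own). Logical entropy of a partition is \(h(\pi) = 1 - \sum_B p_B^2\), the probability that two independent draws from the underlying set land in different blocks, i.e. yield a distinction; Shannon entropy is the familiar \(H(\pi) = -\sum_B p_B \log_2 p_B\). Equiprobable points are assumed here.

```python
import math

def logical_entropy(blocks):
    """h(pi) = 1 - sum_B p_B^2: the two-draw probability of getting a
    distinction of the partition, assuming equiprobable points."""
    n = sum(len(B) for B in blocks)
    return 1 - sum((len(B) / n) ** 2 for B in blocks)

def shannon_entropy(blocks):
    """H(pi) = -sum_B p_B log2(p_B), the Shannon entropy of the same
    block probabilities."""
    n = sum(len(B) for B in blocks)
    return -sum((len(B) / n) * math.log2(len(B) / n) for B in blocks)

# Partition of U = {0, 1, 2, 3} into two equal blocks.
pi = [{0, 1}, {2, 3}]
print(logical_entropy(pi))  # 0.5: half of all ordered pairs are distinctions
print(shannon_entropy(pi))  # 1.0 bit

# The indiscrete partition makes no distinctions at all.
print(logical_entropy([{0, 1, 2, 3}]))  # 0.0
```

The two measures agree on which partitions are more refined but quantify them differently, which is the sense in which Shannon entropy is a re-quantification of the same underlying distinctions.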
Notes
- 1.
- 2.
Logical information theory is about what Adriaans and van Benthem call “Information B: Probabilistic, information-theoretic, measured quantitatively”, not about “Information A: knowledge, logic, what is conveyed in informative answers” where the connection to philosophy and logic is built-in from the beginning. Likewise, this book is not about Kolmogorov-style “Information C: Algorithmic, code compression, measured quantitatively.” [2, p. 11]
- 3.
The lattice of partitions on U is isomorphically represented by the lattice of partition relations or ditsets on U × U, so in that sense the size of a partition’s ditset serves as the ‘size’ of the partition.
- 4.
Note that \(S_{\lnot X}\) and \(S_{\lnot Y}\) intersect in the diagonal \(\Delta \subseteq \left( X \times Y \right)^{2}\).
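Note 3’s ditset representation can be sketched directly (an illustration of mine, not the book’s code; the helper name `ditset` is an assumption). The ditset of a partition is the set of ordered pairs of elements lying in different blocks, and with equiprobable points the logical entropy is just \(|\operatorname{dit}(\pi)| / |U|^2\).

```python
def ditset(blocks):
    """Return the set of ordered pairs (u, v) distinguished by the
    partition, i.e. with u and v in different blocks."""
    block_of = {u: i for i, B in enumerate(blocks) for u in B}
    U = list(block_of)
    return {(u, v) for u in U for v in U if block_of[u] != block_of[v]}

pi = [{0, 1}, {2, 3}]
d = ditset(pi)
print(len(d))       # 8 distinctions out of 16 ordered pairs
print(len(d) / 16)  # 0.5, the logical entropy for equiprobable points

# The indiscrete partition has an empty ditset.
print(ditset([{0, 1, 2, 3}]))  # set()
```

Refining a partition can only add pairs to its ditset, which is why the ditset lattice faithfully mirrors the partition lattice.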
References
Aczél, J., and Z. Daróczy. 1975. On Measures of Information and Their Characterizations. New York: Academic Press.
Adriaans, Pieter, and Johan van Benthem, eds. 2008. Philosophy of Information. Vol. 8. Handbook of the Philosophy of Science. Amsterdam: North-Holland.
Bennett, Charles H. 2003. Quantum Information: Qubits and Quantum Error Correction. International Journal of Theoretical Physics 42: 153–176. https://doi.org/10.1023/A:1024439131297.
Bhargava, T. N., and V. R. R. Uppuluri. 1975. On an axiomatic derivation of Gini diversity, with applications. Metron 33: 41–53.
Boole, George. 1854. An Investigation of the Laws of Thought on which are founded the Mathematical Theories of Logic and Probabilities. Cambridge: Macmillan and Co.
Ellerman, David. 2009. Counting Distinctions: On the Conceptual Foundations of Shannon’s Information Theory. Synthese 168: 119–149. https://doi.org/10.1007/s11229-008-9333-7.
Ellerman, David. 2014. An introduction to partition logic. Logic Journal of the IGPL 22: 94–125. https://doi.org/10.1093/jigpal/jzt036.
Ellerman, David. 2017. Logical Information Theory: New Foundations for Information Theory. Logic Journal of the IGPL 25 (5 Oct.): 806–35.
Friedman, William F. 1922. The Index of Coincidence and Its Applications in Cryptography. Geneva IL: Riverbank Laboratories.
Gini, Corrado. 1912. Variabilità e mutabilità. Bologna: Tipografia di Paolo Cuppini.
Gleick, James. 2011. The Information: A History, A Theory, A Flood. New York: Pantheon.
Good, I. J. 1979. A.M. Turing’s statistical work in World War II. Biometrika 66: 393–6.
Good, I. J. 1982. Comment (on Patil and Taillie: Diversity as a Concept and its Measurement). Journal of the American Statistical Association 77: 561–3.
Havrda, Jan, and František Charvát. 1967. Quantification Methods of Classification Processes: Concept of Structural α-Entropy. Kybernetika (Prague) 3: 30–35.
Kolmogorov, Andrei N. 1983. Combinatorial Foundations of Information Theory and the Calculus of Probabilities. Russian Math. Surveys 38: 29–40.
Kullback, Solomon. 1976. Statistical Methods in Cryptanalysis. Walnut Creek, CA: Aegean Park Press.
Kung, Joseph P. S., Gian-Carlo Rota, and Catherine H. Yan. 2009. Combinatorics: The Rota Way. New York: Cambridge University Press.
Laplace, Pierre-Simon. 1995. Philosophical Essay on Probabilities. Translated by A.I. Dale. New York: Springer Verlag.
Rao, C. R. 1982a. Gini-Simpson Index of Diversity: A Characterization, Generalization and Applications. Utilitas Mathematica B 21: 273–282.
Rao, C. R. 1982b. Diversity and Dissimilarity Coefficients: A Unified Approach. Theoretical Population Biology 21: 24–43.
Rejewski, M. 1981. How Polish Mathematicians Deciphered the Enigma. Annals of the History of Computing 3: 213–34.
Ricotta, Carlo, and Laszlo Szeidl. 2006. Towards a unifying approach to diversity measures: Bridging the gap between the Shannon entropy and Rao’s quadratic index. Theoretical Population Biology 70: 237–43. https://doi.org/10.1016/j.tpb.2006.06.003.
Rota, Gian-Carlo. 2001. Twelve problems in probability no one likes to bring up. In Algebraic Combinatorics and Computer Science: A Tribute to Gian-Carlo Rota, ed. Henry Crapo and Domenico Senato, 57–93. Milano: Springer.
Shannon, Claude E. 1948. A Mathematical Theory of Communication. Bell System Technical Journal 27: 379–423; 623–56.
Simpson, Edward Hugh. 1949. Measurement of Diversity. Nature 163: 688.
Tsallis, Constantino. 1988. Possible Generalization of Boltzmann-Gibbs Statistics. Journal of Statistical Physics 52: 479–87.
Wilkins, John. 1707 (1641). Mercury or the Secret and Swift Messenger. London.
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Ellerman, D. (2021). Logical Entropy. In: New Foundations for Information Theory. SpringerBriefs in Philosophy. Springer, Cham. https://doi.org/10.1007/978-3-030-86552-8_1
DOI: https://doi.org/10.1007/978-3-030-86552-8_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-86551-1
Online ISBN: 978-3-030-86552-8
eBook Packages: Mathematics and Statistics (R0)