Logical Entropy

Chapter
New Foundations for Information Theory

Part of the book series: SpringerBriefs in Philosophy (BRIEFSPHILOSOPH)

Abstract

This book presents a new foundation for information theory in which the notion of information is defined in terms of distinctions, differences, distinguishability, and diversity. The direct measure is logical entropy, the quantitative measure of the distinctions made by a partition. Shannon entropy is a transform or re-quantification of logical entropy for Claude Shannon’s “mathematical theory of communication.” The interpretation of the logical entropy of a partition is the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition), so it realizes a dictum of Gian-Carlo Rota: \(\frac{\text{Probability}}{\text{Subsets}} \approx \frac{\text{Information}}{\text{Partitions}}\). Andrei Kolmogorov suggested that information should be defined independently of probability, so logical entropy is first defined in terms of the set of distinctions of a partition, and then a probability measure on that set defines the quantitative version of logical entropy. We give a history of the logical entropy formula, which goes back to Corrado Gini’s 1912 “index of mutability” and has been rediscovered many times.
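
To make the two-draw interpretation concrete, here is a minimal Python sketch (an illustration with made-up toy data, not code from the book). The set U, the partition, and the use of the uniform distribution are assumptions for illustration only; the sketch computes logical entropy as \(1 - \sum_B p_B^2\), estimates the probability that two independent draws are distinguished by the partition, and computes the Shannon entropy of the same block probabilities for comparison.

```python
import random
from math import log2

# A toy partition pi of U = {0,...,5}; both U and the blocks are hypothetical examples.
U = list(range(6))
partition = [{0, 1, 2}, {3, 4}, {5}]

# Block probabilities under the (assumed) uniform distribution on U: p_B = |B| / |U|.
p = [len(block) / len(U) for block in partition]

# Logical entropy h(pi) = 1 - sum_B p_B^2  (Gini's 1912 "index of mutability").
h_logical = 1 - sum(pb ** 2 for pb in p)

# Shannon entropy H(pi) = -sum_B p_B * log2(p_B) of the same block probabilities.
H_shannon = -sum(pb * log2(pb) for pb in p if pb > 0)

# Two-draw interpretation: estimate the probability that two independent draws
# from U fall in different blocks, i.e., are distinguished by the partition.
def block_of(u):
    return next(i for i, block in enumerate(partition) if u in block)

trials = 100_000
hits = sum(block_of(random.choice(U)) != block_of(random.choice(U)) for _ in range(trials))

print(f"h(pi) = {h_logical:.4f}  two-draw estimate = {hits / trials:.4f}  H(pi) = {H_shannon:.4f} bits")
```

For this toy partition the formula gives \(h(\pi) = 22/36 \approx 0.611\), the two-draw estimate converges to the same value, and the Shannon entropy is about 1.459 bits, illustrating that the two measures re-quantify the same block probabilities differently.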

For in the general we must note, That whatever is capable of a competent Difference, perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations.
John Wilkins 1641

Notes

  1. Many of the results about logical entropy were developed in [6] and [8].

  2. Logical information theory is about what Adriaans and van Benthem call “Information B: Probabilistic, information-theoretic, measured quantitatively”, not about “Information A: knowledge, logic, what is conveyed in informative answers”, where the connection to philosophy and logic is built in from the beginning. Likewise, this book is not about Kolmogorov-style “Information C: Algorithmic, code compression, measured quantitatively.” [2, p. 11]

  3. The lattice of partitions on U is isomorphically represented by the lattice of partition relations or ditsets on U × U, so in that sense, the size of a partition’s ditset serves as the ‘size’ of the partition (see the sketch following these notes).

  4. Note that \(S_{\lnot X}\) and \(S_{\lnot Y}\) intersect in the diagonal \(\Delta \subseteq (X \times Y)^{2}\).
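
As a concrete illustration of Note 3, the following Python snippet (a minimal sketch with a made-up partition, not code from the book) builds the ditset \(\operatorname{dit}(\pi) \subseteq U \times U\) of a partition, the set of ordered pairs whose elements lie in distinct blocks, and checks that with equiprobable points its normalized size equals the logical entropy \(1 - \sum_B (|B|/|U|)^2\). This also mirrors the Kolmogorov point in the abstract: the set of distinctions is defined first, and a probability measure on it then gives the quantitative value.

```python
from itertools import product

# Hypothetical partition pi on a four-element set U, used only for illustration.
U = ['a', 'b', 'c', 'd']
partition = [{'a', 'b'}, {'c'}, {'d'}]

def block_of(u):
    """Return the block of the partition containing u."""
    return next(block for block in partition if u in block)

# The ditset dit(pi): all ordered pairs (u, v) in U x U distinguished by pi,
# i.e., whose elements lie in distinct blocks.
ditset = {(u, v) for u, v in product(U, U) if block_of(u) != block_of(v)}

# Quantitative logical entropy under the equiprobable (product) measure:
# the measure of the ditset as a subset of U x U.
h_from_ditset = len(ditset) / len(U) ** 2

# The same value from the block probabilities p_B = |B| / |U|.
h_from_blocks = 1 - sum((len(block) / len(U)) ** 2 for block in partition)

print(sorted(ditset))
print(h_from_ditset, h_from_blocks)  # both equal 0.625 for this partition
```

Here \(|\operatorname{dit}(\pi)| = 10\) of the 16 ordered pairs, so the logical entropy is \(10/16 = 0.625\) whether computed from the ditset or from the block probabilities.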

References

  1. Aczel, J., and Z. Daroczy. 1975. On Measures of Information and Their Characterization. New York: Academic Press.

  2. Adriaans, Pieter, and Johan van Benthem, ed. 2008. Philosophy of Information. Vol. 8. Handbook of the Philosophy of Science. Amsterdam: North-Holland.

  3. Bennett, Charles H. 2003. Quantum Information: Qubits and Quantum Error Correction. International Journal of Theoretical Physics 42: 153–176. https://doi.org/10.1023/A:1024439131297.

  4. Bhargava, T. N., and V. R. R. Uppuluri. 1975. On an axiomatic derivation of Gini diversity, with applications. Metron 33: 41–53.

  5. Boole, George. 1854. An Investigation of the Laws of Thought on which are founded the Mathematical Theories of Logic and Probabilities. Cambridge: Macmillan and Co.

  6. Ellerman, David. 2009. Counting Distinctions: On the Conceptual Foundations of Shannon’s Information Theory. Synthese 168: 119–149. https://doi.org/10.1007/s11229-008-9333-7.

  7. Ellerman, David. 2014. An introduction to partition logic. Logic Journal of the IGPL 22: 94–125. https://doi.org/10.1093/jigpal/jzt036.

  8. Ellerman, David. 2017. Logical Information Theory: New Foundations for Information Theory. Logic Journal of the IGPL 25 (5 Oct.): 806–835.

  9. Friedman, William F. 1922. The Index of Coincidence and Its Applications in Cryptography. Geneva, IL: Riverbank Laboratories.

  10. Gini, Corrado. 1912. Variabilità e mutabilità. Bologna: Tipografia di Paolo Cuppini.

  11. Gleick, James. 2011. The Information: A History, A Theory, A Flood. New York: Pantheon.

  12. Good, I. J. 1979. A. M. Turing’s statistical work in World War II. Biometrika 66: 393–396.

  13. Good, I. J. 1982. Comment (on Patil and Taillie: Diversity as a Concept and its Measurement). Journal of the American Statistical Association 77: 561–563.

  14. Havrda, Jan, and Frantisek Charvat. 1967. Quantification Methods of Classification Processes: Concept of Structural α-Entropy. Kybernetika (Prague) 3: 30–35.

  15. Kolmogorov, Andrei N. 1983. Combinatorial Foundations of Information Theory and the Calculus of Probabilities. Russian Mathematical Surveys 38: 29–40.

  16. Kullback, Solomon. 1976. Statistical Methods in Cryptanalysis. Walnut Creek, CA: Aegean Park Press.

  17. Kung, Joseph P. S., Gian-Carlo Rota, and Catherine H. Yan. 2009. Combinatorics: The Rota Way. New York: Cambridge University Press.

  18. Laplace, Pierre-Simon. 1995. Philosophical Essay on Probabilities. Translated by A. I. Dale. New York: Springer Verlag.

  19. Rao, C. R. 1982a. Gini-Simpson Index of Diversity: A Characterization, Generalization and Applications. Utilitas Mathematica B 21: 273–282.

  20. Rao, C. R. 1982b. Diversity and Dissimilarity Coefficients: A Unified Approach. Theoretical Population Biology 21: 24–43.

  21. Rejewski, M. 1981. How Polish Mathematicians Deciphered the Enigma. Annals of the History of Computing 3: 213–234.

  22. Ricotta, Carlo, and Laszlo Szeidl. 2006. Towards a unifying approach to diversity measures: Bridging the gap between the Shannon entropy and Rao’s quadratic index. Theoretical Population Biology 70: 237–243. https://doi.org/10.1016/j.tpb.2006.06.003.

  23. Rota, Gian-Carlo. 2001. Twelve problems in probability no one likes to bring up. In Algebraic Combinatorics and Computer Science: A Tribute to Gian-Carlo Rota, ed. Henry Crapo and Domenico Senato, 57–93. Milano: Springer.

  24. Shannon, Claude E. 1948. A Mathematical Theory of Communication. Bell System Technical Journal 27: 379–423; 623–656.

  25. Simpson, Edward Hugh. 1949. Measurement of Diversity. Nature 163: 688.

  26. Tsallis, Constantino. 1988. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics 52: 479–487.

  27. Wilkins, John. 1707 (1641). Mercury or the Secret and Swift Messenger. London.

Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Ellerman, D. (2021). Logical Entropy. In: New Foundations for Information Theory. SpringerBriefs in Philosophy. Springer, Cham. https://doi.org/10.1007/978-3-030-86552-8_1
