Journal of Statistical Physics, Volume 7, Issue 4, pp 301–310

A comparison of the Shannon and Kullback information measures

  • Arthur Hobson
  • Bin-Kang Cheng

Abstract

Two widely used information measures are compared. It is shown that the Kullback measure, unlike the Shannon measure, provides the basis for a consistent theory of information which extends to continuous sample spaces and to nonconstant prior distributions. It is shown that the Kullback measure is a generalization of the Shannon measure, and that the Kullback measure has more reasonable additivity properties than does the Shannon measure. The results lend support to Jaynes's entropy maximization procedure.
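The abstract's generalization claim can be sketched numerically: for a discrete distribution p and a uniform prior u over n outcomes, the Kullback measure K(p; u) reduces to log n minus the Shannon entropy H(p), so the Shannon measure is recovered as a special case. A minimal illustration (function names are ours, not taken from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon measure H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kullback_information(p, q):
    """Kullback measure K(p; q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
n = len(p)
uniform = [1.0 / n] * n  # constant (uniform) prior distribution

H = shannon_entropy(p)
K = kullback_information(p, uniform)

# With a uniform prior, K(p; u) = log n - H(p): the Kullback measure
# contains the Shannon measure as a special case.
print(H, K, math.log(n) - H)
```

Unlike the Shannon measure, K(p; q) remains well defined for a nonuniform prior q and carries over to continuous sample spaces as an integral of p(x) log[p(x)/q(x)], which is the consistency property the paper emphasizes.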

Key words

Information theory, entropy, Shannon, Kullback, Jaynes

References

  1. C. E. Shannon, Bell System Tech. J. 27:379, 623 (1948); reprinted in C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, Univ. of Illinois Press, Urbana, Ill. (1949).
  2. N. Wiener, Cybernetics, Wiley, New York (1948).
  3. F. M. Reza, An Introduction to Information Theory, McGraw-Hill, New York (1961), and references cited therein.
  4. R. C. Tolman, The Principles of Statistical Mechanics, Oxford Univ. Press, London (1938).
  5. S. Kullback, Annals of Math. Statistics 22:79 (1951).
  6. S. Kullback, Information Theory and Statistics, Wiley, New York (1959).
  7. F. Schlögl, Z. Physik 249:1 (1971), and references cited therein.
  8. E. T. Jaynes, Phys. Rev. 106:620 (1957); 108:171 (1957).
  9. E. T. Jaynes, in Statistical Physics (1962 Brandeis Lectures), ed. by K. W. Ford, Benjamin, New York (1963).
  10. A. Hobson, Concepts in Statistical Mechanics, Gordon and Breach, New York (1971).
  11. A. Katz, Principles of Statistical Mechanics, Freeman, San Francisco (1967).
  12. R. Baierlein, Atoms and Information Theory, Freeman, San Francisco (1971).
  13. A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York (1958); A. I. Khinchin, Mathematical Foundations of Information Theory, Dover, New York (1957).
  14. A. Hobson, J. Stat. Phys. 1:383 (1969).
  15. Pierre Simon de Laplace, A Philosophical Essay on Probabilities, Dover, New York (1951).

Copyright information

© Plenum Publishing Corporation 1973

Authors and Affiliations

  • Arthur Hobson (1)
  • Bin-Kang Cheng (1)
  1. Department of Physics, University of Arkansas, Fayetteville
