Journal of Statistical Physics, Volume 7, Issue 4, pp 301–310

A comparison of the Shannon and Kullback information measures

Authors

  • Arthur Hobson
    • Department of Physics, University of Arkansas
  • Bin-Kang Cheng
    • Department of Physics, University of Arkansas

DOI: 10.1007/BF01014906

Cite this article as:
Hobson, A. & Cheng, B. J Stat Phys (1973) 7: 301. doi:10.1007/BF01014906

Abstract

Two widely used information measures are compared. It is shown that the Kullback measure, unlike the Shannon measure, provides the basis for a consistent theory of information that extends to continuous sample spaces and to nonconstant prior distributions; that the Kullback measure is a generalization of the Shannon measure; and that it has more reasonable additivity properties than the Shannon measure. The results lend support to Jaynes's entropy maximization procedure.
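To make the sense of "generalization" concrete, the following is a minimal numerical sketch (not part of the original article; the function names and the example distributions are illustrative). On a finite sample space with a uniform prior q, the Kullback measure equals log n minus the Shannon entropy, so minimizing the Kullback information relative to a uniform prior is equivalent to maximizing the Shannon entropy.

```python
import math

def shannon_entropy(p):
    """Shannon measure: H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kullback_information(p, q):
    """Kullback measure: I(p; q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative posterior p and uniform prior q on a 4-point sample space.
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25] * 4
n = len(p)

# With a uniform prior, I(p; q) = log n - H(p); both prints agree (~0.1733).
print(kullback_information(p, q))
print(math.log(n) - shannon_entropy(p))
```

For nonuniform priors the Kullback measure remains well defined while the identity above no longer applies, which is the sense in which it extends the Shannon measure to nonconstant prior distributions.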

Key words

Information theory · entropy · Shannon · Kullback · Jaynes

Copyright information

© Plenum Publishing Corporation 1973