Journal of Statistical Physics, Volume 7, Issue 4, pp 301–310

A comparison of the Shannon and Kullback information measures

  • Arthur Hobson
  • Bin-Kang Cheng

DOI: 10.1007/BF01014906

Cite this article as:
Hobson, A. & Cheng, BK. J Stat Phys (1973) 7: 301. doi:10.1007/BF01014906


Abstract

Two widely used information measures are compared. It is shown that the Kullback measure, unlike the Shannon measure, provides the basis for a consistent theory of information which extends to continuous sample spaces and to nonconstant prior distributions. It is shown that the Kullback measure is a generalization of the Shannon measure, and that the Kullback measure has more reasonable additivity properties than does the Shannon measure. The results lend support to Jaynes's entropy maximization procedure.
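The sense in which the Kullback measure generalizes the Shannon measure can be illustrated numerically. The sketch below (not from the paper; the function names and the example distribution are illustrative) computes the Shannon entropy H(p) = -Σ pᵢ log pᵢ and the Kullback measure K(p, q) = Σ pᵢ log(pᵢ/qᵢ), and checks that for a constant (uniform) prior q over n outcomes, K(p, q) = log n − H(p), so that minimizing the Kullback measure and maximizing the Shannon entropy select the same distribution.

```python
import math

def shannon(p):
    # Shannon entropy H(p) = -sum p_i log p_i (natural log),
    # with the convention 0 log 0 = 0.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kullback(p, q):
    # Kullback measure (relative entropy) K(p, q) = sum p_i log(p_i / q_i).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative posterior distribution over n = 4 outcomes.
n = 4
p = [0.5, 0.25, 0.125, 0.125]
uniform = [1.0 / n] * n

# With a uniform prior, the Kullback measure reduces to log n - H(p):
# the two measures differ only by a constant, so they rank
# distributions identically in this special case.
assert abs(kullback(p, uniform) - (math.log(n) - shannon(p))) < 1e-12
```

For a nonuniform prior the constant log n is replaced by prior-dependent terms, which is where the two measures genuinely diverge, as the abstract indicates.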

Key words

Information theory · entropy · Shannon · Kullback · Jaynes

Copyright information

© Plenum Publishing Corporation 1973

Authors and Affiliations

  • Arthur Hobson (1)
  • Bin-Kang Cheng (1)

  1. Department of Physics, University of Arkansas, Fayetteville
