A comparison of the Shannon and Kullback information measures
- Cite this article as: Hobson, A. & Cheng, B. J. Stat. Phys. (1973) 7: 301. doi:10.1007/BF01014906
Two widely used information measures are compared. It is shown that the Kullback measure, unlike the Shannon measure, provides the basis for a consistent theory of information which extends to continuous sample spaces and to nonconstant prior distributions. It is shown that the Kullback measure is a generalization of the Shannon measure, and that the Kullback measure has more reasonable additivity properties than does the Shannon measure. The results lend support to Jaynes's entropy maximization procedure.
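The relationship between the two measures can be illustrated concretely. For a discrete distribution p over n outcomes and a uniform prior q, the Kullback measure reduces to log n minus the Shannon entropy, so minimizing Kullback information with a uniform prior is equivalent to maximizing Shannon entropy. The sketch below is illustrative only; the function names and the choice of natural logarithm are assumptions, not the paper's notation.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kullback_measure(p, q):
    """Kullback information I(p; q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Example distribution over n = 4 outcomes (values chosen for illustration).
p = [0.5, 0.25, 0.125, 0.125]
n = len(p)
uniform = [1.0 / n] * n

# With a uniform prior, I(p; uniform) = log n - H(p), so the two
# measures differ only by a constant and a sign in this special case.
lhs = kullback_measure(p, uniform)
rhs = math.log(n) - shannon_entropy(p)
```

This identity is the sense in which the Kullback measure generalizes the Shannon measure: the Shannon entropy corresponds to the special case of a constant prior on a finite sample space, while the Kullback form remains well defined for continuous spaces and nonuniform priors.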