Volume 7, Issue 4, pp. 301-310

A comparison of the Shannon and Kullback information measures


Two widely used information measures are compared. It is shown that the Kullback measure, unlike the Shannon measure, provides the basis for a consistent theory of information that extends to continuous sample spaces and to nonconstant prior distributions. The Kullback measure is shown to be a generalization of the Shannon measure and to have more reasonable additivity properties. The results lend support to Jaynes's entropy maximization procedure.
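The claimed relationship between the two measures can be illustrated numerically in the discrete case: with a uniform prior on n outcomes, the Kullback divergence reduces to log n minus the Shannon entropy, so maximizing entropy is equivalent to minimizing divergence from the uniform prior. The sketch below (a hypothetical illustration, not from the paper) verifies this identity for a small example, working in nats:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kullback_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A distribution over 4 outcomes and the uniform prior on the same space.
p = [0.5, 0.25, 0.125, 0.125]
u = [0.25] * 4

h = shannon_entropy(p)
d = kullback_divergence(p, u)

# Identity: D(p || u) = log(n) - H(p) when u is uniform on n outcomes,
# so entropy maximization and divergence minimization coincide here.
print(abs((math.log(4) - h) - d) < 1e-12)  # True
```

Note that the paper's point is precisely that this correspondence holds only for a constant (uniform) prior on a discrete space; the Kullback measure remains well defined when the prior is nonuniform or the sample space is continuous, while the Shannon measure does not extend consistently.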