
Journal of Statistical Physics, Volume 7, Issue 4, pp. 301–310

A comparison of the Shannon and Kullback information measures

  • Arthur Hobson, Department of Physics, University of Arkansas
  • Bin-Kang Cheng, Department of Physics, University of Arkansas

Abstract

Two widely used information measures are compared. It is shown that the Kullback measure, unlike the Shannon measure, provides the basis for a consistent theory of information which extends to continuous sample spaces and to nonconstant prior distributions. It is shown that the Kullback measure is a generalization of the Shannon measure, and that the Kullback measure has more reasonable additivity properties than does the Shannon measure. The results lend support to Jaynes's entropy maximization procedure.
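The claim that the Kullback measure generalizes the Shannon measure can be illustrated numerically: for a discrete distribution with a uniform prior, the Kullback measure equals log n minus the Shannon entropy, so the two measures differ only by an additive constant in that case. A minimal sketch (the function names and the example distribution are illustrative, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kullback_information(p, q):
    """Kullback measure (relative entropy) I(p; q) = sum p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# An arbitrary example distribution on a four-point sample space.
p = [0.5, 0.25, 0.125, 0.125]
n = len(p)
uniform = [1.0 / n] * n

# With a uniform prior the Kullback measure reduces to
# I(p; u) = log n - H(p), i.e. the Shannon entropy up to
# an additive constant.
lhs = kullback_information(p, uniform)
rhs = math.log(n) - shannon_entropy(p)
print(abs(lhs - rhs) < 1e-12)  # True
```

For a nonuniform prior q, I(p; q) remains well defined (and extends to continuous sample spaces via densities), whereas the Shannon entropy alone carries no reference to a prior; this is the sense in which the abstract's consistency claim is usually read.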

Key words

Information theory, entropy, Shannon, Kullback, Jaynes