Maximum Entropy and Minimum Cross-Entropy Principles: Need for a Broader Perspective

  • H. K. Kesavan
  • J. N. Kapur
Part of the Fundamental Theories of Physics book series (FTPH, volume 39)


Jaynes’ Maximum Entropy Principle (MaxEnt) has served as a unifying principle in the study of a wide variety of probabilistic systems, transcending all disciplinary boundaries. Here, we are concerned with the two inverse problems of determining the most unbiased moment constraints and the most unbiased entropy measure when the remaining two probabilistic entities are specified. The need for these inverse problems arises in application areas, and they have the same relevance as MaxEnt itself. The problem of determining the most unbiased measure of entropy, however, takes us outside the framework of MaxEnt, where considerations of generalized measures of entropy become essential. Departure from the well-established use of the Shannon measure has raised a variety of objections, ranging all the way from the meaning of entropy to the mutilation of its uniqueness properties. The Generalized Maximum Entropy Principle (GMEP), which is the main contribution of this paper, takes cognizance of these criticisms by giving justifications for the use of generalized measures. Acceptance of this new model will extend the scope of MaxEnt to tackle problems which are at present beyond its reach.
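The forward MaxEnt problem the paper builds on can be sketched numerically: among all distributions satisfying given moment constraints, pick the one maximizing Shannon entropy. The following minimal sketch (not from the paper; the die values and target mean are illustrative assumptions) uses SciPy to solve Jaynes' classic loaded-die example.

```python
import numpy as np
from scipy.optimize import minimize

# MaxEnt sketch (illustrative): among all distributions on {1,...,6}
# with a prescribed mean, find the one maximizing Shannon entropy
# H(p) = -sum_i p_i log p_i.
values = np.arange(1, 7)
target_mean = 4.5  # hypothetical moment constraint

def neg_entropy(p):
    # negative Shannon entropy (we minimize, so entropy is maximized)
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: p @ values - target_mean},  # mean constraint
]
p0 = np.full(6, 1.0 / 6)  # start from the uniform distribution
res = minimize(neg_entropy, p0, bounds=[(0, 1)] * 6, constraints=constraints)
p = res.x
```

The maximizer has the familiar exponential (Gibbs) form p_i proportional to exp(lambda * i); since the target mean exceeds 3.5, the solved probabilities increase with the face value.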

A very similar argument is advanced in favour of a Generalized Minimum Cross-Entropy Principle, where measures other than the Kullback-Leibler measure are considered.
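The minimum cross-entropy principle referred to here can likewise be sketched numerically: given a prior distribution and moment constraints, pick the posterior closest to the prior in Kullback-Leibler directed divergence. A minimal sketch follows (not from the paper; the prior, support, and target mean are illustrative assumptions).

```python
import numpy as np
from scipy.optimize import minimize

# Minimum cross-entropy sketch (illustrative): given a prior q on
# {1,...,6} and a mean constraint, find p minimizing the
# Kullback-Leibler measure D(p||q) = sum_i p_i log(p_i / q_i).
values = np.arange(1, 7)
q = np.array([0.3, 0.2, 0.2, 0.1, 0.1, 0.1])  # hypothetical prior
target_mean = 3.0

def kl_divergence(p):
    p = np.clip(p, 1e-12, 1.0)
    return np.sum(p * np.log(p / q))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: p @ values - target_mean},  # mean constraint
]
res = minimize(kl_divergence, q.copy(), bounds=[(0, 1)] * 6,
               constraints=constraints)
p = res.x
```

With a uniform prior this reduces exactly to MaxEnt, which is why the two principles admit the same style of generalization.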


Keywords: Inverse Problem · Maximum Entropy · Generalized Entropy · Shannon Entropy · Entropy Measure




References

  1. El-Affendi, “A Maximum Entropy Analysis of the M/G/1 and G/M/1 Queuing Systems at Equilibrium”, Acta Informatica, Vol. 19, pp. 339–355 (1983).
  2. Forte, B. and C. Sempi, “Maximizing Conditional Entropy: A derivation of quantal statistics”, Rendiconti di Matematica, Vol. 9, pp. 551–566 (1976).
  3. Jaynes, E.T., “Information Theory and Statistical Mechanics”, Physical Review, Vol. 106, pp. 620–630 (1957).
  4. Kapur, J.N., “Measures of uncertainty, mathematical programming and physics”, Jour. Ind. Soc. Agri. Stat., Vol. 24, pp. 47–66 (1972).
  5. Kapur, J.N., “Four families of measures of entropy”, Ind. Jour. Pure and App. Maths., Vol. 17, No. 4, pp. 429–449 (1986).
  6. Kapur, J.N. and H.K. Kesavan, The Generalized Maximum Entropy Principle, Sandford Educational Press, University of Waterloo, Waterloo, Canada (1987).
  7. Kullback, S., Information Theory and Statistics, John Wiley, New York (1959).
  8. Shore, J.E., “Derivation of Equilibrium and Time-dependent Solutions for Queuing Systems using Entropy Maximization”, Nat. Computer Conf., Vol. 47, pp. 483–487 (1978).
  9. Tribus, M., “Information Theory as the basis for Thermostatics and Thermodynamics”, J. Appl. Mechanics, Vol. 28, p. 106 (1961).
  10. Wilson, A.G., Entropy in Urban and Regional Modelling, Pion, London (1970).
  11. Kesavan, H.K. and J.N. Kapur, “The Generalized Maximum Entropy Principle”, IEEE Trans. Systems, Man and Cybernetics, Vol. 19, No. 5, Sept./Oct. 1989.
  12. Kesavan, H.K. and J.N. Kapur, “On the Families of Solutions to Generalized Maximum Entropy and Minimum Cross-Entropy Problems”, Int. J. General Systems, 1989.

Copyright information

© Kluwer Academic Publishers 1990

Authors and Affiliations

  • H. K. Kesavan (1)
  • J. N. Kapur (1)

  1. University of Waterloo, Canada
