Abstract
It is shown that (i) every probability density is the unique maximizer of relative entropy in an appropriate class, and (ii) in the class of all pdfs f that satisfy ∫ f h_i dμ = λ_i for i = 1, 2, …, k, the maximizer of entropy is an f_0 that is proportional to exp(Σ_i c_i h_i) for some choice of c_1, …, c_k. An extension of this to a continuum of constraints and many examples are presented.
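Result (ii) can be checked numerically in a small discrete case. The sketch below is illustrative only, not from the paper: with a single constraint h_1(x) = x (fixed mean) on four support points, the entropy maximizer should have the stated Gibbs form p_i ∝ exp(c · x_i). We solve for c by bisection so the mean constraint holds, then compare entropies against another hand-picked pmf with the same mean. The support points, target mean, and comparison pmf are arbitrary choices for the demonstration.

```python
import math

# Illustrative support points and mean constraint (not from the paper)
x = [0.0, 1.0, 2.0, 3.0]
target_mean = 1.2

def gibbs(c):
    """pmf proportional to exp(c * x_i) -- the exponential-family form
    that the theorem says maximizes entropy under a mean constraint."""
    w = [math.exp(c * xi) for xi in x]
    s = sum(w)
    return [wi / s for wi in w]

def mean(p):
    return sum(pi * xi for pi, xi in zip(p, x))

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# mean(gibbs(c)) is increasing in c, so bisection finds the c that
# satisfies the constraint mean = target_mean.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean(gibbs(mid)) < target_mean:
        lo = mid
    else:
        hi = mid
c_star = 0.5 * (lo + hi)
p_star = gibbs(c_star)

# Another pmf satisfying the same constraint (mean = 1.2), chosen by hand;
# by the theorem its entropy must be strictly smaller than entropy(p_star).
q = [0.4, 0.2, 0.2, 0.2]
```

Comparing `entropy(p_star)` with `entropy(q)` confirms that, among distributions meeting the constraint, the exponential-form density wins.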
Cite this article
Athreya, K.B. Entropy maximization. Proc Math Sci 119, 531–539 (2009). https://doi.org/10.1007/s12044-009-0049-5