
Part of the book series: Fundamental Theories of Physics ((FTPH,volume 31-32))

Abstract

Maximum entropy is presented as a universal method of finding a “best” positive distribution constrained by incomplete data. The generalised entropy ∑(f − m − f log(f/m)) is the only form which selects acceptable distributions f in particular cases. It holds even if f is not normalised, so that maximum entropy applies directly to physical distributions other than probabilities. Furthermore, maximum entropy should be used to select “best” parameters if the underlying model m has such freedom.
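As a concrete illustration of the abstract's claim, the generalised entropy S(f) = ∑(f − m − f log(f/m)) can be maximised subject to a single linear data constraint ∑ a·f = d: stationarity of the Lagrangian gives f_i = m_i exp(λ a_i), with the scalar multiplier λ fixed by the constraint. The sketch below (the model `m`, coefficients `a`, and datum `d` are invented for illustration, not taken from the chapter) solves for λ by bisection, which is valid here because ∑ a·m·exp(λa) is monotone increasing in λ:

```python
import numpy as np

def maxent_solution(m, a, d, lam_lo=-50.0, lam_hi=50.0):
    """Maximize S(f) = sum(f - m - f*log(f/m)) subject to sum(a*f) = d.

    Stationarity of the Lagrangian gives f_i = m_i * exp(lam * a_i);
    the scalar lam is found by bisection on the constraint residual,
    which is monotone increasing in lam (its derivative is
    sum(a**2 * m * exp(lam * a)) > 0)."""
    def residual(lam):
        return np.sum(a * m * np.exp(lam * a)) - d

    lo, hi = lam_lo, lam_hi
    for _ in range(200):          # bisect until lam is pinned down
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:
            hi = mid
        else:
            lo = mid
    lam = 0.5 * (lo + hi)
    return m * np.exp(lam * a)

# Hypothetical example: a flat prior model and one linear datum.
m = np.array([0.25, 0.25, 0.25, 0.25])   # prior model m
a = np.array([1.0, 2.0, 3.0, 4.0])       # constraint coefficients
d = 3.0                                   # datum: sum(a * f) = 3
f = maxent_solution(m, a, d)
```

Note that `f` satisfies the constraint and is everywhere positive, but its components need not sum to one, matching the abstract's point that the generalised form applies to unnormalised physical distributions, not only probabilities.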




Copyright information

© 1988 Kluwer Academic Publishers

About this chapter

Cite this chapter

Skilling, J. (1988). The Axioms of Maximum Entropy. In: Erickson, G.J., Smith, C.R. (eds) Maximum-Entropy and Bayesian Methods in Science and Engineering. Fundamental Theories of Physics, vol 31-32. Springer, Dordrecht. https://doi.org/10.1007/978-94-009-3049-0_8


  • DOI: https://doi.org/10.1007/978-94-009-3049-0_8

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-7871-9

  • Online ISBN: 978-94-009-3049-0

  • eBook Packages: Springer Book Archive
