Maximum information entropy principle and the interpretation of probabilities in statistical mechanics − a short review

  • Topical Review
  • Published in The European Physical Journal B

Abstract

In this paper an alternative approach to statistical mechanics based on the maximum information entropy principle (MaxEnt) is examined, specifically its close relation to the Gibbs method of ensembles. It is shown that the MaxEnt formalism is the logical extension of the Gibbs formalism of equilibrium statistical mechanics, one that is entirely independent of the frequentist interpretation of probabilities as nothing more than factual (i.e. experimentally verifiable) properties of the real world. Furthermore, we show that, consistent with the law of large numbers, the relative frequencies in an ensemble of systems prepared under identical conditions (i.e. identical constraints) actually correspond to the MaxEnt probabilities in the limit of a large number of systems in the ensemble. This result implies that the probabilities in statistical mechanics can be interpreted, independently of the frequency interpretation, on the basis of the maximum information entropy principle.
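
The abstract makes two connected claims that lend themselves to a small numerical illustration: maximizing the Shannon information entropy subject to given constraints yields a Gibbs-type distribution, and the relative frequencies observed in a large ensemble of identically prepared systems approach those MaxEnt probabilities. The sketch below is not taken from the paper; the four energy levels, the mean-energy constraint and all names in the code are assumptions made purely for illustration.

import numpy as np
from scipy.optimize import brentq

# Hypothetical discrete system: four energy levels and an assumed constraint
# on the ensemble-average energy <E>.
E = np.array([0.0, 1.0, 2.0, 3.0])
E_mean = 1.2

def maxent_distribution(beta):
    # Maximizing S = -sum_i p_i ln p_i subject to normalization and a fixed
    # <E> gives the canonical form p_i = exp(-beta * E_i) / Z.
    w = np.exp(-beta * E)
    return w / w.sum()

# Solve <E>(beta) = E_mean numerically for the Lagrange multiplier beta.
beta = brentq(lambda b: maxent_distribution(b) @ E - E_mean, -50.0, 50.0)
p_maxent = maxent_distribution(beta)

# Sample an ensemble of n independent systems, each distributed according to
# the MaxEnt probabilities, and compare relative frequencies with those
# probabilities as n grows.
rng = np.random.default_rng(0)
for n in (10**2, 10**4, 10**6):
    samples = rng.choice(len(E), size=n, p=p_maxent)
    freq = np.bincount(samples, minlength=len(E)) / n
    print(n, np.max(np.abs(freq - p_maxent)))

print("MaxEnt probabilities:", p_maxent)

With these assumed numbers the largest deviation between observed frequencies and the MaxEnt probabilities should shrink roughly as 1/sqrt(n), which is the law-of-large-numbers behaviour the abstract appeals to; the distribution itself is fixed entirely by the imposed constraint, not by any prior frequency data.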

Author information

Correspondence to Domagoj Kuić.

Cite this article

Kuić, D. Maximum information entropy principle and the interpretation of probabilities in statistical mechanics − a short review. Eur. Phys. J. B 89, 124 (2016). https://doi.org/10.1140/epjb/e2016-70175-6
