  • Book
  • © 1989

E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics

Part of the book series: Synthese Library (SYLI, volume 158)


Table of contents (14 chapters)

  1. Front Matter (Pages i-xxiv)
  2. Introductory Remarks, R. D. Rosenkrantz (Pages 1-3)
  3. Brandeis Lectures (1963), R. D. Rosenkrantz (Pages 39-76)
  4. Gibbs vs Boltzmann Entropies (1965), R. D. Rosenkrantz (Pages 77-86)
  5. Delaware Lecture (1967), R. D. Rosenkrantz (Pages 87-113)
  6. Prior Probabilities (1968), R. D. Rosenkrantz (Pages 114-130)
  7. The Well-Posed Problem (1973), R. D. Rosenkrantz (Pages 131-148)
  8. Confidence Intervals vs Bayesian Intervals (1976), R. D. Rosenkrantz (Pages 149-209)
  9. Where Do We Stand on Maximum Entropy? (1978), R. D. Rosenkrantz (Pages 210-314)
  10. Marginalization and Prior Probabilities (1980), R. D. Rosenkrantz (Pages 337-375)
  11. What is the Question? (1981), R. D. Rosenkrantz (Pages 376-400)
  12. The Minimum Entropy Production Principle (1980), R. D. Rosenkrantz (Pages 401-424)
  13. Back Matter (Pages 425-434)

About this book

The first six chapters of this volume present the author's 'predictive' or 'information theoretic' approach to statistical mechanics, in which the basic probability distributions over microstates are obtained as distributions of maximum entropy (i.e., as distributions that are most non-committal with regard to missing information among all those satisfying the macroscopically given constraints). There is then no need to make additional assumptions of ergodicity or metric transitivity; the theory proceeds entirely by inference from macroscopic measurements and the underlying dynamical assumptions. Moreover, the method of maximizing the entropy is completely general and applies, in particular, to irreversible processes as well as to reversible ones. The next three chapters provide a broader framework - at once Bayesian and objective - for maximum entropy inference. The basic principles of inference, including the usual axioms of probability, are seen to rest on nothing more than requirements of consistency, above all, the requirement that in two problems where we have the same information we must assign the same probabilities. Thus, statistical mechanics is viewed as a branch of a general theory of inference, and the latter as an extension of the ordinary logic of consistency. Those who are familiar with the literature of statistics and statistical mechanics will recognize in both of these steps a genuine 'scientific revolution' - a complete reversal of earlier conceptions - and one of no small significance.
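As a concrete illustration of the maximum-entropy method described above (a sketch added here, not part of the publisher's description): the Brandeis Lectures treat the famous dice problem, in which one seeks the probability distribution over the six faces of a die that maximizes the entropy subject to a constraint on the mean (e.g. an observed average of 4.5 rather than the fair value 3.5). The solution has the exponential form p_i ∝ exp(λ·i), with λ fixed by the mean constraint; a minimal numerical sketch, solving for λ by bisection:

```python
import math

def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
    """Maximum-entropy distribution over die faces with a fixed mean.

    The maxent solution is p_i proportional to exp(lam * i); we find
    lam by bisection, since the mean is monotone increasing in lam.
    """
    faces = list(faces)

    def mean_for(lam):
        weights = [math.exp(lam * f) for f in faces]
        z = sum(weights)  # partition function (normalizer)
        return sum(f * w for f, w in zip(faces, weights)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * f) for f in faces]
    z = sum(weights)
    return [w / z for w in weights]

# A mean of 4.5 tilts the distribution toward the high faces;
# Jaynes' published solution gives p(6) close to 0.347.
p = maxent_die(4.5)
```

This is the distribution that is maximally non-committal given only the mean: any other distribution matching the constraint would encode information we do not have.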

Editors and Affiliations

  • Department of Mathematics, Dartmouth College, USA

    R. D. Rosenkrantz
