Bayesian Inference

Part of the book series: UNITEXT for Physics ((UNITEXTPH))

Abstract

The goal of statistical inference is to extract information from experimental observations about the quantities (parameters, models, ...) we want to learn about, whether or not they are directly observable. Bayesian inference is based on the Bayes rule and treats probability as a measure of the degree of knowledge we have about the quantities of interest. Bayesian methods provide a framework flexible enough to analyze models of any required complexity, using in a natural and conceptually simple way all the information available from the experimental data, within a scheme that allows one to understand the different steps of the learning process.

... some rule could be found, according to which we ought to estimate the chance that the probability for the happening of an event perfectly unknown, should lie between any two named degrees of probability, antecedently to any experiments made about it; ...

An Essay towards solving a Problem in the Doctrine of Chances

By the late Rev. Mr. Bayes...
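
The Bayes-rule update described in the abstract can be sketched numerically; here is a minimal example for two exhaustive hypotheses (the function name and the numbers are my own illustrative choices, not from the chapter):

```python
# Bayes' rule for two exhaustive hypotheses H1 and H2 = not H1.
# All names and numbers here are illustrative, not from the chapter.
def posterior(prior_h1, like_h1, like_h2):
    """Return P(H1 | data) given P(H1) and the likelihoods under H1, H2."""
    evidence = prior_h1 * like_h1 + (1.0 - prior_h1) * like_h2
    return prior_h1 * like_h1 / evidence

# Equal priors; the data are 9 times more likely under H1 than under H2:
p = posterior(prior_h1=0.5, like_h1=0.9, like_h2=0.1)   # -> 0.9
```

Feeding the posterior back in as the prior for the next observation reproduces the sequential learning scheme the abstract alludes to.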


Notes

  1. For a gentle reading on the subject, see [1].

  2. It is easy to check, for instance, that if \(X_0\) is a non-trivial random quantity independent of the \(X_i\), the sequence \(\{X_0+X_1,X_0+X_2,\ldots ,X_0+X_n\}\) is exchangeable but not iid.
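
A quick numerical check of this claim; the standard normal components and the seed are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x0 = rng.normal(size=n)            # the shared component X0
x1 = rng.normal(size=n)            # X1, X2: iid and independent of X0
x2 = rng.normal(size=n)

y1, y2 = x0 + x1, x0 + x2          # two members of the exchangeable sequence
# Identical marginals, but corr(Y1, Y2) = Var(X0)/(Var(X0)+Var(X1)) = 1/2 != 0,
# so the sequence cannot be independent:
corr = np.corrcoef(y1, y2)[0, 1]   # close to 0.5
```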

  3. This is referred to as de Finetti's Theorem, after B. de Finetti (1930s); it was later generalized by E. Hewitt and L.J. Savage in the 1950s. See [4].

  4. For a comprehensive discussion, see [5].

  5. Apparently, “Insufficient Reason” was coined by Laplace in reference to Leibniz's Principle of Sufficient Reason, which states essentially that every fact has a sufficient reason for why it is the way it is and not otherwise.

  6. In this context, the use of Transformation Group arguments was pioneered by E.T. Jaynes [7].

  7. We can go a step upwards and assign a prior to the hyperparameters with hyper-hyperparameters,...

  8. The problems studied by Decision Theory can be addressed from the point of view of Game Theory. In this case, instead of Loss Functions one works with Utility Functions \(u({\varvec{\theta }},{\varvec{a}})\) that, in essence, are nothing else but \(u({\varvec{\theta }},{\varvec{a}})=K-l({\varvec{\theta }},{\varvec{a}}) \ge 0\); it is just a matter of personal optimism whether to work with “utilities” or “losses”. J. von Neumann and O. Morgenstern introduced in 1944 the idea of expected utility and the criterion of taking as the optimal action that which maximizes the expected utility.
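
The equivalence between minimizing expected loss and maximizing the expected utility \(u=K-l\) can be checked numerically; the loss table and posterior below are hypothetical, purely illustrative:

```python
import numpy as np

# Hypothetical 3-state, 2-action loss table (illustrative numbers only).
loss = np.array([[0.0, 2.0],
                 [1.0, 0.5],
                 [3.0, 1.0]])        # loss[theta, a]
K = loss.max()                       # any K >= max loss keeps u >= 0
utility = K - loss                   # u(theta, a) = K - l(theta, a)
p = np.array([0.5, 0.3, 0.2])        # posterior over the three states

exp_loss = p @ loss                  # expected loss of each action
exp_util = p @ utility               # expected utility of each action
best = int(exp_util.argmax())        # same action as exp_loss.argmin()
```

Since \(E[u]=K-E[l]\) for every action, the ranking of actions is reversed but the optimum coincides, regardless of the constant \(K\).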

  9. Essentially, the ratio of the predictive inferences for \({\varvec{x}}_2\) after \({\varvec{x}}_1\) has been observed.

  10. If \(P(H_1)=P(H_2)=1/2\), then \(P(H_1|\mathrm{data})=0.95\,{\longrightarrow }\,B_{12}=19\,{\longrightarrow }\, {\Delta }_{12} \simeq 6\).
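
With equal priors the posterior odds equal the Bayes factor, so the numbers in this footnote can be reproduced directly (I assume here that \({\Delta }_{12}\) denotes \(2\ln B_{12}\), the usual evidence scale):

```python
from math import log

# With P(H1) = P(H2), the posterior odds equal the Bayes factor B12.
# Assumption: Delta_12 = 2 * ln(B12), the usual evidence scale.
post = 0.95                          # P(H1 | data)
B12 = post / (1.0 - post)            # -> 19
Delta12 = 2.0 * log(B12)             # -> about 5.9, i.e. ~6
```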

  11. [AMS13]: M. Aguilar et al., Phys. Rev. Lett. 110, 141102 (2013).

  12. In most cases, a Monte Carlo simulation will simplify life.

  13. The rigidity (r) is defined as the momentum (p) divided by the electric charge (Z), so \(r=p\) for protons.

  14. [AMS15]: M. Aguilar et al., Phys. Rev. Lett. 114, 171103 (2015), and references therein.

  15. Essentially, \(a_{lm} = \frac{4\pi }{n}\sum _{i=1}^n Y_{lm}({\theta }_i,{\phi }_i)\) for a sample of size n.
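
A sketch of this estimator for a sample isotropic on the sphere, written out for the explicit low-order harmonics \(Y_{00}\) and \(Y_{10}\) (the uniform sample and the seed are my own illustrative choices):

```python
import numpy as np

# a_lm = (4*pi/n) * sum_i Y_lm(theta_i, phi_i), evaluated for l = 0, 1 and m = 0.
# phi is not needed here: Y_00 and Y_10 do not depend on the azimuth.
rng = np.random.default_rng(1)
n = 100_000
cos_theta = rng.uniform(-1.0, 1.0, n)   # isotropy: uniform in cos(theta)

y00 = np.full(n, 1.0 / (2.0 * np.sqrt(np.pi)))   # Y_00 = 1/(2*sqrt(pi))
y10 = np.sqrt(3.0 / (4.0 * np.pi)) * cos_theta   # Y_10 = sqrt(3/4pi)*cos(theta)

a00 = (4.0 * np.pi / n) * y00.sum()     # exactly 2*sqrt(pi) by construction
a10 = (4.0 * np.pi / n) * y10.sum()     # consistent with 0 for an isotropic sample
```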

References

  1. G. D'Agostini, Bayesian Reasoning in Data Analysis (World Scientific Publishing, Singapore, 2003)

  2. F. James, Statistical Methods in Experimental Physics (World Scientific Publishing, Singapore, 2006)

  3. J.M. Bernardo, The concept of exchangeability and its applications. Far East J. Math. Sci. 4, 111–121 (1996). www.uv.es/~bernardo/Exchangeability.pdf

  4. J.M. Bernardo, A.F.M. Smith, Bayesian Theory (Wiley, New York, 1994)

  5. R.E. Kass, L. Wasserman, The selection of prior distributions by formal rules. J. Am. Stat. Assoc. 91(453), 1343–1370 (1996)

  6. H. Jeffreys, Theory of Probability (Oxford University Press, Oxford, 1939)

  7. E.T. Jaynes, Prior Probabilities and Transformation Groups, NSF G23778 (1964)

  8. V.I. Bogachev, Measure Theory (Springer, Berlin, 2006)

  9. M. Stone, Right Haar measures for convergence in probability to invariant posterior distributions. Ann. Math. Stat. 36, 440–453 (1965)

  10. M. Stone, Necessary and sufficient conditions for convergence in probability to invariant posterior distributions. Ann. Math. Stat. 41, 1349–1353 (1970)

  11. H. Raiffa, R. Schlaifer, Applied Statistical Decision Theory (Harvard University Press, Cambridge, 1961)

  12. S.R. Dalal, W.J. Hall, J. R. Stat. Soc. Ser. B 45, 278–286 (1983)

  13. B. Welch, H. Peers, J. R. Stat. Soc. Ser. B 25, 318–329 (1963)

  14. M. Ghosh, R. Mukerjee, Biometrika 84, 970–975 (1984)

  15. G.S. Datta, M. Ghosh, Ann. Stat. 24(1), 141–159 (1996)

  16. G.S. Datta, R. Mukerjee, Probability Matching Priors and Higher Order Asymptotics (Springer, New York, 2004)

  17. J.M. Bernardo, J. R. Stat. Soc. Ser. B 41, 113–147 (1979)

  18. J.O. Berger, J.M. Bernardo, D. Sun, Ann. Stat. 37(2), 905–938 (2009)

  19. J.M. Bernardo, J.M. Ramón, The Statistician 47, 1–35 (1998)

  20. J.O. Berger, J.M. Bernardo, D. Sun, Objective priors for discrete parameter spaces. J. Am. Stat. Assoc. 107(498), 636–648 (2012)

  21. A. O'Hagan, J. R. Stat. Soc. Ser. B 57, 99–138 (1995)

  22. J.O. Berger, L.R. Pericchi, J. Am. Stat. Assoc. 91(433), 109–122 (1996)

  23. R.E. Kass, A.E. Raftery, J. Am. Stat. Assoc. 90(430), 773–795 (1995)

  24. G. Schwarz, Ann. Stat. 6, 461–464 (1978)

  25. G.J. Feldman, R.D. Cousins, arXiv:physics/9711021v2 (1997)

  26. J.O. Berger, L.R. Pericchi, Ann. Stat. 32(3), 841–869 (2004)
Author information

Correspondence to Carlos Maña.

Copyright information

© 2017 Springer International Publishing AG

Cite this chapter

Maña, C. (2017). Bayesian Inference. In: Probability and Statistics for Particle Physics. UNITEXT for Physics. Springer, Cham. https://doi.org/10.1007/978-3-319-55738-0_2
