Abstract
The goal of statistical inference is to extract information from experimental observations about the quantities of interest (parameters, models, ...), whether or not they are directly observable. Bayesian inference is based on Bayes' rule and regards probability as a measure of the degree of knowledge we have about the quantities of interest. Bayesian methods provide a framework flexible enough to analyze models of arbitrary complexity, using in a natural and conceptually simple way all the information available from the experimental data, within a scheme that makes the different steps of the learning process easy to follow.
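As a minimal sketch of the Bayes rule in action (an illustration, not taken from the chapter), consider a binomial success probability with a flat prior, updated on a grid; the numbers (`n = 10`, `k = 7`) are hypothetical:

```python
from math import comb

# Hypothetical example: posterior for a binomial success probability theta.
# Prior: uniform on (0, 1); data: k successes in n trials.
n, k = 10, 7
grid = [i / 1000 for i in range(1, 1000)]            # theta values in (0, 1)
prior = [1.0 for _ in grid]                          # flat prior
like = [comb(n, k) * t**k * (1 - t)**(n - k) for t in grid]
unnorm = [p * l for p, l in zip(prior, like)]        # prior x likelihood
norm = sum(unnorm)                                   # normalization (evidence, up to grid spacing)
post = [u / norm for u in unnorm]                    # posterior on the grid

# The grid posterior mean should approach the analytic Beta(k+1, n-k+1) mean (k+1)/(n+2).
mean = sum(t * p for t, p in zip(grid, post))
print(round(mean, 3))
```

The grid approximation makes the three ingredients of the learning scheme (prior, likelihood, normalization) explicit; for this conjugate case the exact answer, Beta(k+1, n−k+1), is available in closed form.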
... some rule could be found, according to which we ought to estimate the chance that the probability for the happening of an event perfectly unknown, should lie between any two named degrees of probability, antecedently to any experiments made about it; ...
An Essay towards solving a Problem in the Doctrine of Chances
By the late Rev. Mr. Bayes...
Notes
- 1.
For a gentle reading on the subject see [1].
- 2.
It is easy to check, for instance, that if \(X_0\) is a non-trivial random quantity independent of the \(X_i\), the sequence \(\{X_0+X_1,X_0+X_2,\ldots ,X_0+X_n\}\) is exchangeable but not iid.
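The construction in this note can be checked numerically; a minimal sketch with standard normal \(X_0, X_i\) (the distributions are an assumption for illustration) shows that the terms share the common component \(X_0\) and are therefore correlated, hence not independent:

```python
import random

# Sketch: Y_i = X0 + X_i with a shared X0 is exchangeable but not iid.
# All variables standard normal (an illustrative choice).
random.seed(0)
N = 200_000
y1, y2 = [], []
for _ in range(N):
    x0 = random.gauss(0, 1)            # shared component, drawn once per trial
    y1.append(x0 + random.gauss(0, 1))
    y2.append(x0 + random.gauss(0, 1))

m1 = sum(y1) / N
m2 = sum(y2) / N
# Sample covariance of Y1 and Y2: should be close to Var(X0) = 1, not 0.
cov = sum((a - m1) * (b - m2) for a, b in zip(y1, y2)) / N
print(round(cov, 2))
```

The two sequences have identical marginal distributions (consistent with exchangeability), yet the nonzero covariance rules out independence.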
- 3.
This is referred to as De Finetti’s Theorem, after B. de Finetti (1930s); it was generalized by E. Hewitt and L.J. Savage in the 1950s. See [4].
- 4.
For a comprehensive discussion see [5].
- 5.
Apparently, “Insufficient Reason” was coined by Laplace in reference to Leibniz’s Principle of Sufficient Reason, which states essentially that every fact has a sufficient reason for why it is the way it is and not otherwise.
- 6.
In this context, the use of Transformation Groups arguments was pioneered by E.T. Jaynes [7].
- 7.
We can go a step further and assign a prior to the hyperparameters, with hyper-hyperparameters,...
- 8.
The problems studied by Decision Theory can be addressed from the point of view of Game Theory. In this case, instead of Loss Functions one works with Utility Functions \(u({\varvec{\theta }},{\varvec{a}})\) that, in essence, are nothing else but \(u({\varvec{\theta }},{\varvec{a}})=K-l({\varvec{\theta }},{\varvec{a}}) \ge 0\); it is just a matter of personal optimism whether one works with “utilities” or “losses”. J. von Neumann and O. Morgenstern introduced in 1944 the idea of expected utility and the criterion of taking as optimal the action that maximizes the expected utility.
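The equivalence \(u = K - l\) can be made concrete with a small sketch (the two-state, two-action problem and all numbers below are hypothetical): maximizing expected utility and minimizing expected loss select the same action.

```python
# Hypothetical decision problem: two states t1, t2 and two actions a1, a2.
# Utilities are u = K - loss, so the optimal action is the same either way.
K = 10.0
loss = {("a1", "t1"): 0.0, ("a1", "t2"): 4.0,
        ("a2", "t1"): 3.0, ("a2", "t2"): 1.0}
posterior = {"t1": 0.3, "t2": 0.7}          # posterior probabilities of the states

def expected_loss(a):
    return sum(posterior[t] * loss[(a, t)] for t in posterior)

def expected_utility(a):
    return sum(posterior[t] * (K - loss[(a, t)]) for t in posterior)

best_by_loss = min(["a1", "a2"], key=expected_loss)       # minimize expected loss
best_by_util = max(["a1", "a2"], key=expected_utility)    # maximize expected utility
print(best_by_loss, best_by_util)
```

Since \(E[u] = K - E[l]\) for any action, the ranking of actions is simply reversed between the two formulations, and the optimum coincides.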
- 9.
Essentially, the ratio of the predictive inferences for \({\varvec{x}}_2\) after \({\varvec{x}}_1\) has been observed.
- 10.
If \(P(H_1)=P(H_2)=1/2\), then \(P(H_1|\mathrm{data})=0.95\,{\longrightarrow }\,B_{12}=19\,{\longrightarrow }\, {\Delta }_{12} \simeq 6\).
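The arithmetic of this note is easy to reproduce; a sketch, assuming the conventional scale \({\Delta }_{12} = 2\ln B_{12}\) for the quantity quoted (an interpretation on my part, consistent with the value \(\simeq 6\)):

```python
from math import log

# With equal prior probabilities, the Bayes factor equals the posterior odds.
p1 = 0.95                      # P(H1 | data)
B12 = p1 / (1 - p1)            # posterior odds: 0.95 / 0.05 = 19
delta12 = 2 * log(B12)         # assumed scale: 2 ln B12, which gives ~5.9
print(round(B12, 1), round(delta12, 1))
```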
- 11.
[AMS13]: Aguilar M. et al. (2013); Phys. Rev. Lett. 110, 141102.
- 12.
In most cases, a Monte Carlo simulation will simplify life.
- 13.
The rigidity (r) is defined as the momentum (p) divided by the electric charge (Z), so \(r=p\) for protons.
- 14.
[AMS15]: Aguilar M. et al. (2015); Phys. Rev. Lett. 114, 171103 and references therein.
- 15.
Essentially, \(a_{lm} = \frac{4\pi }{n}\sum _{i=1}^n Y_{lm}({\theta }_i,{\phi }_i)\) for a sample of size n.
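The estimator in this note can be sketched for a single coefficient; here I take the real harmonic \(Y_{10}({\theta }) = \sqrt{3/(4\pi )}\,\cos {\theta }\) and an isotropic sample (both choices are illustrative assumptions), for which \(a_{10}\) should be compatible with zero:

```python
from math import pi, sqrt
import random

# Sketch of a_lm = (4*pi/n) * sum_i Y_lm(theta_i, phi_i) for (l, m) = (1, 0),
# using the real harmonic Y_10(theta) = sqrt(3/(4*pi)) * cos(theta).
random.seed(1)
n = 100_000
# Isotropic directions on the sphere: cos(theta) uniform in [-1, 1].
costheta = [random.uniform(-1, 1) for _ in range(n)]
a10 = (4 * pi / n) * sum(sqrt(3 / (4 * pi)) * c for c in costheta)
print(round(a10, 3))    # compatible with zero for an isotropic sample
```

A genuine dipole anisotropy in the sample would shift the mean of \(\cos {\theta }\) away from zero and hence give a nonzero \(a_{10}\).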
References
G. D’Agostini, Bayesian Reasoning in Data Analysis (World Scientific Publishing, Singapore, 2003)
F. James, Statistical Methods in Experimental Physics (World Scientific Publishing Co, Singapore, 2006)
J.M. Bernardo, The concept of exchangeability and its applications. Far East J. Math. Sci. 4, 111–121 (1996). www.uv.es/~bernardo/Exchangeability.pdf
J.M. Bernardo, A.F.M. Smith, Bayesian Theory (Wiley, New York, 1994)
R.E. Kass, L. Wasserman, The selection of prior distributions by formal rules. J. Am. Stat. Assoc. 91(453), 1343–1370 (1996)
H. Jeffreys, Theory of Probability (Oxford University Press, Oxford, 1939)
E.T. Jaynes, Prior Probabilities and Transformation Groups, NSF G23778 (1964)
V.I. Bogachev, Measure Theory (Springer, Berlin, 2006)
M. Stone, Right Haar measures for convergence in probability to invariant posterior distributions. Ann. Math. Stat. 36, 440–453 (1965)
M. Stone, Necessary and sufficient conditions for convergence in probability to invariant posterior distributions. Ann. Math. Stat. 41, 1349–1353 (1970)
H. Raiffa, R. Schlaifer, Applied Statistical Decision Theory (Harvard University Press, Cambridge, 1961)
S.R. Dalal, W.J. Hall, J. R. Stat. Soc. Ser. B 45, 278–286 (1983)
B. Welch, H. Peers, J. R. Stat. Soc. Ser. B 25, 318–329 (1963)
M. Ghosh, R. Mukerjee, Biometrika 84, 970–975 (1984)
G.S. Datta, M. Ghosh, Ann. Stat. 24(1), 141–159 (1996)
G.S. Datta, R. Mukerjee, Probability Matching Priors and Higher Order Asymptotics (Springer, New York, 2004)
J.M. Bernardo, J. R. Stat. Soc. Ser. B 41, 113–147 (1979)
J.O. Berger, J.M. Bernardo, D. Sun, Ann. Stat. 37(2), 905–938 (2009)
J.M. Bernardo, J.M. Ramón, The Statistician 47, 1–35 (1998)
J.O. Berger, J.M. Bernardo, D. Sun, Objective priors for discrete parameter spaces. J. Am. Stat. Assoc. 107(498), 636–648 (2012)
A. O’Hagan, J. R. Stat. Soc. Ser. B 57, 99–138 (1995)
J.O. Berger, L.R. Pericchi, J. Am. Stat. Assoc. 91(433), 109–122 (1996)
R.E. Kass, A.E. Raftery, J. Am. Stat. Assoc. 90(430), 773–795 (1995)
G. Schwarz, Ann. Stat. 6, 461–464 (1978)
G.J. Feldman, R.D. Cousins (1997), arXiv:physics/9711021v2
J.O. Berger, L.R. Pericchi, Ann. Stat. 32(3), 841–869 (2004)
Copyright information
© 2017 Springer International Publishing AG
About this chapter
Cite this chapter
Maña, C. (2017). Bayesian Inference. In: Probability and Statistics for Particle Physics. UNITEXT for Physics. Springer, Cham. https://doi.org/10.1007/978-3-319-55738-0_2
Print ISBN: 978-3-319-55737-3
Online ISBN: 978-3-319-55738-0