Objective Bayesianism and Geometry
We suggest in this paper that the concepts of utility, prior probability and entropy are not independent but must be related through the following principle: "The expected utility of a theory is an increasing function of its entropy." It follows that associated to each regular class of theories (i.e., parametric statistical model) there is a unique one-parameter family of densities able to act as prior distributions. These entropic priors form the exponential family generated by the invariant measure on the class, with the entropy of each theory as sufficient statistic.
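The construction described above can be made concrete numerically. The sketch below assumes the entropic prior takes the exponential-family form stated in the abstract, π_α(θ) ∝ exp(α S(θ)) dμ(θ), where S(θ) is the entropy of the theory indexed by θ and μ is the invariant (Jeffreys) measure; the Bernoulli model and the value α = 1 are purely illustrative choices, not taken from the paper.

```python
import numpy as np

def shannon_entropy(theta):
    """Entropy S(theta) of the Bernoulli(theta) theory (in nats)."""
    return -(theta * np.log(theta) + (1.0 - theta) * np.log(1.0 - theta))

def jeffreys_measure(theta):
    """Invariant (Jeffreys) measure on the Bernoulli model,
    proportional to the square root of the Fisher information."""
    return 1.0 / np.sqrt(theta * (1.0 - theta))

def entropic_prior(theta, alpha):
    """Unnormalized entropic prior: exp(alpha * S(theta)) times
    the invariant measure, with S as sufficient statistic."""
    return np.exp(alpha * shannon_entropy(theta)) * jeffreys_measure(theta)

# Normalize numerically on a grid, avoiding the endpoints where the
# Jeffreys measure diverges (the singularity is integrable).
theta = np.linspace(1e-6, 1.0 - 1e-6, 200001)
dtheta = theta[1] - theta[0]
alpha = 1.0  # illustrative member of the one-parameter family
weights = entropic_prior(theta, alpha)
density = weights / (weights.sum() * dtheta)
```

For α = 0 this reduces to the (normalized) Jeffreys prior, while increasing α concentrates mass on high-entropy theories near θ = 1/2, illustrating how the single parameter interpolates along the family.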
Keywords: Invariant Measure, Prior Distribution, Prior Probability, Maximum Entropy, Exponential Family