Making Probabilities

  • J. A. Hartigan
Part of the Springer Series in Statistics book series (SSS)

Abstract

The essence of Bayes theory is the assignment of probability values to bets; the methods used to generate such probabilities are what separate the various theories.

Keywords

Entropy · Covariance · Candy
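The chapter text itself is not reproduced here, but one classical method of generating probabilities from partial information, suggested by the Entropy keyword, is the maximum-entropy principle: among all distributions satisfying known constraints, choose the one of greatest entropy. A minimal sketch, assuming the textbook example of a die constrained only by its mean (the function name and bisection tolerance are illustrative choices, not from the source):

```python
import math

def max_entropy_die(target_mean, faces=range(1, 7), tol=1e-10):
    """Maximum-entropy distribution on `faces` with the given mean.

    The max-entropy solution under a mean constraint has the exponential
    form p_i proportional to exp(lam * i); we find lam by bisection,
    using the fact that the mean is increasing in lam.
    """
    def mean_for(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0  # bracket wide enough for any interior mean
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

# A mean of 3.5 carries no information beyond the support, so the
# max-entropy answer is the uniform distribution; a mean of 4.5 tilts
# probability toward the high faces.
p = max_entropy_die(4.5)
```

With no constraint beyond the mean, the recovered distribution is the least committal one consistent with the data, which is the sense in which maximum entropy "generates" a probability assignment rather than assuming one.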



Copyright information

© Springer-Verlag New York Inc. 1983

Authors and Affiliations

  • J. A. Hartigan, Department of Statistics, Yale University, New Haven, USA
