Bayesian Method of Moments (BMOM) Analysis of Mean and Regression Models

  • Arnold Zellner


A Bayesian method of moments/instrumental variable (BMOM/IV) approach is developed and applied to the analysis of the mean and multiple regression models. Given a single set of data, it is shown how to obtain posterior and predictive moments without the use of likelihood functions, prior densities, and Bayes’ Theorem. The posterior and predictive moments, based on a few relatively weak assumptions, are then used to obtain maximum entropy (maxent) densities for parameters, realized error terms, and future values of variables. Posterior means for parameters and realized error terms are shown to be equal to certain well-known estimates and are rationalized in terms of quadratic loss functions. Conditional maxent posterior densities for means and regression coefficients given scale parameters are in the normal form, while the scale parameters’ maxent densities are in the exponential form. Marginal densities for individual regression coefficients, realized error terms, and future values are in the Laplace or double-exponential form, with heavier tails than normal densities having the same means and variances. It is concluded that these results will be very useful, particularly when there is difficulty in formulating the likelihood functions and prior densities needed in traditional maximum likelihood and Bayesian approaches.
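Two of the abstract’s concrete claims can be illustrated numerically: in the simple mean model the BMOM posterior mean coincides with a well-known estimate (the sample mean), and the marginal maxent densities are Laplace (double-exponential), which have heavier tails than a normal density with the same mean and variance. The sketch below, in plain Python, is illustrative only — the data and names are invented here and do not follow the paper’s notation:

```python
import math

# Illustrative data (not from the paper): in the BMOM mean model, the
# posterior mean of the location parameter is the sample mean -- one of
# the "well known estimates" the abstract refers to.
y = [1.2, 0.8, 1.5, 1.1, 0.9, 1.4]
posterior_mean = sum(y) / len(y)

def laplace_pdf(x, mu, s2):
    """Laplace (double-exponential) density with mean mu and variance s2.
    For variance s2, the Laplace scale parameter is b = sqrt(s2 / 2)."""
    b = math.sqrt(s2 / 2.0)
    return math.exp(-abs(x - mu) / b) / (2.0 * b)

def normal_pdf(x, mu, s2):
    """Normal density with mean mu and variance s2."""
    return math.exp(-(x - mu) ** 2 / (2.0 * s2)) / math.sqrt(2.0 * math.pi * s2)

# Heavier tails: three standard deviations from the mean, the Laplace
# density exceeds the matched normal density (both curves share the same
# mean and variance, so the extra tail mass is the Laplace form itself).
x_tail = 3.0
print(laplace_pdf(x_tail, 0.0, 1.0) > normal_pdf(x_tail, 0.0, 1.0))   # heavier tail
print(laplace_pdf(1.0, 0.0, 1.0) < normal_pdf(1.0, 0.0, 1.0))         # thinner shoulder
```

The two comparisons show the characteristic Laplace shape: a sharper peak and heavier tails than the matched normal, with the normal dominating in between (around one standard deviation out).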


Keywords: Likelihood Function · Posterior Density · Prior Density · Predictive Density · Posterior Odds





Copyright information

© Springer Science+Business Media New York 1996

Authors and Affiliations

  • Arnold Zellner
    University of Chicago, USA
