Bayesian Method of Moments (BMOM) Analysis of Mean and Regression Models
A Bayesian method of moments/instrumental variable (BMOM/IV) approach is developed and applied to the analysis of the mean and multiple regression models. Given a single set of data, it is shown how to obtain posterior and predictive moments without the use of likelihood functions, prior densities, and Bayes' Theorem. The posterior and predictive moments, derived from a few relatively weak assumptions, are then used to obtain maximum entropy (maxent) densities for parameters, realized error terms, and future values of variables. Posterior means for parameters and realized error terms are shown to equal certain well-known estimates and are rationalized in terms of quadratic loss functions. Conditional maxent posterior densities for means and regression coefficients given scale parameters are in the normal form, while the maxent densities of scale parameters are in the exponential form. Marginal densities for individual regression coefficients, realized error terms, and future values are in the Laplace or double-exponential form, with heavier tails than normal densities having the same means and variances. These results should prove especially useful when it is difficult to formulate the likelihood functions and prior densities required by traditional maximum likelihood and Bayesian approaches.
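Two of the abstract's claims can be illustrated numerically: the BMOM posterior mean for regression coefficients coincides with the familiar least-squares estimate, and a Laplace (double-exponential) density matched to a normal density in mean and variance has heavier tails. The sketch below uses simulated data; all variable names are illustrative assumptions, not notation from the paper.

```python
import numpy as np

# Illustrative simulated regression data (assumed, not from the paper).
rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# Under quadratic loss, the BMOM posterior mean of the coefficient vector
# equals the least-squares estimate b = (X'X)^{-1} X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# A Laplace(mu, b) density has variance 2*b^2, so matching a normal with
# variance v requires b = sqrt(v/2).  With mean and variance equalized,
# the Laplace density dominates the normal far from the mean.
v = 1.0
b = np.sqrt(v / 2)

def laplace_pdf(x, mu=0.0):
    return np.exp(-abs(x - mu) / b) / (2 * b)

def normal_pdf(x, mu=0.0):
    return np.exp(-(x - mu) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

# At three standard deviations the Laplace density exceeds the normal one,
# illustrating the heavier tails noted in the abstract.
print(laplace_pdf(3.0) > normal_pdf(3.0))  # prints True
```

This is only a numerical illustration of the stated distributional forms, not an implementation of the full BMOM/IV derivation, which produces the posterior moments themselves from the weak assumptions described in the paper.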
Keywords: Likelihood Function, Posterior Density, Prior Density, Predictive Density, Posterior Odds