Psychometrika, Volume 63, Issue 3, pp 271–300

A Bayesian approach to nonlinear latent variable models using the Gibbs sampler and the Metropolis-Hastings algorithm

  • Gerhard Arminger
  • Bengt O. Muthén

Abstract

Nonlinear latent variable models are specified that include quadratic forms and interactions of latent regressor variables as special cases. To estimate the parameters, the models are put in a Bayesian framework with conjugate priors for the parameters. The posterior distributions of the parameters and the latent variables are estimated using Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis-Hastings algorithm. The proposed estimation methods are illustrated by two simulation studies and by the estimation of a nonlinear model for the dependence of performance on task complexity and goal specificity using empirical data.
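
To make the estimation idea concrete, the sketch below implements a toy Metropolis-within-Gibbs sampler in Python for a single-indicator quadratic latent regression. It is an illustration of the general strategy only, not the authors' model or code: the one-indicator measurement equation, the fixed variances, the vague normal prior on the regression coefficients, and all tuning constants are assumptions chosen to keep the example short.

    # Minimal Metropolis-within-Gibbs sketch for a toy nonlinear latent variable model:
    #   measurement:  x_i = xi_i + delta_i,              delta_i ~ N(0, 0.3)
    #   structural:   y_i = g1*xi_i + g2*xi_i^2 + zeta_i, zeta_i ~ N(0, 0.5)
    # All variances are fixed and there is a single indicator; these are
    # simplifying assumptions for illustration, not the model of the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    # --- simulate toy data (true values are illustrative only)
    n = 200
    xi_true = rng.normal(0.0, 1.0, n)
    x = xi_true + rng.normal(0.0, np.sqrt(0.3), n)            # observed indicator
    y = 0.7 * xi_true + 0.4 * xi_true**2 + rng.normal(0.0, np.sqrt(0.5), n)

    sig_d, sig_z, sig_xi = 0.3, 0.5, 1.0                      # fixed variances
    prior_var_g = 10.0                                        # vague normal prior on (g1, g2)

    def log_post_xi(xi, g):
        """Log full conditional of each latent score, up to an additive constant."""
        mean_y = g[0] * xi + g[1] * xi**2
        return (-0.5 * (x - xi)**2 / sig_d
                - 0.5 * (y - mean_y)**2 / sig_z
                - 0.5 * xi**2 / sig_xi)

    g = np.zeros(2)
    xi = x.copy()                                             # start latents at their indicators
    draws = []

    for it in range(3000):
        # Metropolis-Hastings update of the latent scores (random-walk proposal)
        prop = xi + rng.normal(0.0, 0.4, n)
        log_acc = log_post_xi(prop, g) - log_post_xi(xi, g)
        accept = np.log(rng.uniform(size=n)) < log_acc
        xi = np.where(accept, prop, xi)

        # Gibbs update of (g1, g2): conjugate normal regression of y on (xi, xi^2)
        Z = np.column_stack([xi, xi**2])
        prec = Z.T @ Z / sig_z + np.eye(2) / prior_var_g
        cov = np.linalg.inv(prec)
        mean = cov @ (Z.T @ y / sig_z)
        g = rng.multivariate_normal(mean, cov)

        if it >= 1000:                                        # discard burn-in
            draws.append(g)

    print("posterior means of (g1, g2):", np.mean(draws, axis=0))

Because of the quadratic term, the latent scores have no conjugate full conditional and are therefore updated by random-walk Metropolis-Hastings, while the regression coefficients retain a conjugate normal full conditional and are drawn directly; this mirrors the hybrid Gibbs sampler/Metropolis-Hastings strategy described in the abstract.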

Key words

Gibbs sampler; LISREL model; Metropolis-Hastings algorithm; nonlinear functions of latent regressors



Copyright information

© The Psychometric Society 1998

Authors and Affiliations

  • Gerhard Arminger, Department of Economics FB6, Bergische Universität-GH Wuppertal, Wuppertal, Germany
  • Bengt O. Muthén, Graduate School of Education & Information Studies, University of California, Los Angeles, USA
