Psychometrika, Volume 58, Issue 2, pp 211–232

A simple Gauss-Newton procedure for covariance structure analysis with high-level computer languages

  • Robert Cudeck
  • Kelli J. Klebe
  • Susan J. Henly

Abstract

An implementation of the Gauss-Newton algorithm for the analysis of covariance structures that is specifically adapted for high-level computer languages is reviewed. With this procedure one need only describe the structural form of the population covariance matrix and provide a sample covariance matrix and initial values for the parameters. The gradient and approximate Hessian, which vary from model to model, are computed numerically. Using this approach, the entire method can be operationalized in a comparatively small program. A large class of models can be estimated, including many involving functional relationships among the parameters that cannot be specified in most available computer programs. Some examples are provided to illustrate how the algorithm can be used.
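To make the idea concrete, the following is a minimal NumPy sketch of the kind of procedure the abstract describes: the user supplies only a function returning Sigma(theta) for the structural form of the population covariance matrix, a sample covariance matrix S, and starting values for theta; the model derivatives, and hence the gradient and approximate Hessian, are obtained by finite differences. The unweighted least squares discrepancy, the function names (vech, gauss_newton_fit, one_factor_sigma), and the simulated one-factor example are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np


def vech(A):
    """Stack the lower-triangular elements of a symmetric matrix into a vector."""
    return A[np.tril_indices_from(A)]


def numerical_jacobian(sigma, theta, h=1e-6):
    """Forward-difference approximation to d vech(Sigma(theta)) / d theta."""
    base = vech(sigma(theta))
    J = np.empty((base.size, theta.size))
    for k in range(theta.size):
        t = theta.copy()
        t[k] += h
        J[:, k] = (vech(sigma(t)) - base) / h
    return J


def gauss_newton_fit(sigma, S, theta0, tol=1e-8, max_iter=100):
    """Fit Sigma(theta) to a sample covariance S by unweighted least squares,
    taking Gauss-Newton steps built from the numerical Jacobian."""
    theta = np.asarray(theta0, dtype=float).copy()
    s = vech(S)
    for _ in range(max_iter):
        r = s - vech(sigma(theta))                # residuals in vech space
        J = numerical_jacobian(sigma, theta)      # numerical model derivatives
        step = np.linalg.solve(J.T @ J, J.T @ r)  # Gauss-Newton step (J'J)^{-1} J'r
        theta += step
        if np.max(np.abs(step)) < tol:
            break
    return theta


# Illustrative model: one-factor structure Sigma = lambda lambda' + diag(psi),
# with theta = (lambda_1, ..., lambda_p, psi_1, ..., psi_p).
def one_factor_sigma(theta):
    p = theta.size // 2
    lam, psi = theta[:p], theta[p:]
    return np.outer(lam, lam) + np.diag(psi)


if __name__ == "__main__":
    # Simulated four-variable example (hypothetical data, not from the article).
    rng = np.random.default_rng(0)
    lam_true, psi_true = np.array([0.8, 0.7, 0.6, 0.5]), np.full(4, 0.4)
    X = rng.normal(size=(500, 1)) @ lam_true[None, :] \
        + rng.normal(size=(500, 4)) * np.sqrt(psi_true)
    S = np.cov(X, rowvar=False)
    theta0 = np.concatenate([np.full(4, 0.5), np.full(4, 0.5)])
    print(gauss_newton_fit(one_factor_sigma, S, theta0))
```

Because only `one_factor_sigma` encodes the model, a different covariance structure (a simplex, a second-order factor model, or a model with functional constraints among parameters) can be fitted by swapping in another Sigma(theta) function, with no change to the fitting routine itself.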

Key words

covariance structures; Gauss-Newton method; simplex models; second order factor analysis; dichotomized variables



Copyright information

© The Psychometric Society 1993

Authors and Affiliations

  • Robert Cudeck, Department of Psychology, University of Minnesota, Minneapolis
  • Kelli J. Klebe, University of Colorado, Colorado Springs
  • Susan J. Henly, College of Nursing, University of North Dakota, USA