
On the Precision Matrix in Semi-High-Dimensional Settings

  • Conference paper

Part of the book series: Springer Proceedings in Mathematics & Statistics ((PROMS,volume 322))

Abstract

Many aspects of multivariate analysis require the precision matrix, i.e., the inverse of the covariance matrix. When the dimension exceeds the sample size, the sample covariance matrix is no longer positive definite and its inverse does not exist. Under a sparsity assumption on the elements of the precision matrix, the problem can be solved by fitting a Gaussian graphical model with a lasso penalty. In high-dimensional settings in the behavioral sciences, however, the sparsity assumption does not necessarily hold: the dimension is often greater than the sample size, yet the two are likely to be comparable in magnitude. Under such circumstances, imposing a covariance structure can resolve the problem of estimating the precision matrix. Here, factor analysis is employed to model the covariance structure, and the Woodbury identity is used to obtain the precision matrix. Several estimation methods are compared, including unweighted least squares, factor analysis with equal unique variances (i.e., probabilistic principal component analysis), and ridge factor analysis with small ridge parameters. Results indicate that all of them yield relatively small mean squared errors even when the dimension is larger than the sample size.
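Under a factor-analytic structure Σ = ΛΛ′ + Ψ with Ψ diagonal, the Woodbury identity yields the precision matrix by solving only a k × k system (k = number of factors), so the inverse exists and is cheap even when the dimension p exceeds the sample size. The following is a minimal numerical sketch of this computation; the function and variable names are illustrative, not taken from the authors' code:

```python
import numpy as np

def precision_via_woodbury(Lam, psi):
    """Invert Sigma = Lam @ Lam.T + diag(psi) via the Woodbury identity:
    Sigma^{-1} = Psi^{-1} - Psi^{-1} Lam (I_k + Lam' Psi^{-1} Lam)^{-1} Lam' Psi^{-1}.
    Only a k x k linear system is solved, never a p x p one.
    """
    p, k = Lam.shape
    psi_inv = 1.0 / psi                   # inverse of the diagonal Psi, elementwise
    A = Lam * psi_inv[:, None]            # Psi^{-1} Lam, shape (p, k)
    core = np.eye(k) + Lam.T @ A          # I_k + Lam' Psi^{-1} Lam, shape (k, k)
    return np.diag(psi_inv) - A @ np.linalg.solve(core, A.T)

# Check against direct inversion on a small example where p x p inversion is feasible
rng = np.random.default_rng(0)
p, k = 8, 2
Lam = rng.normal(size=(p, k))
psi = rng.uniform(0.5, 1.5, size=p)
Sigma = Lam @ Lam.T + np.diag(psi)
Omega = precision_via_woodbury(Lam, psi)
print(np.allclose(Omega, np.linalg.inv(Sigma)))  # True
```

Because Ψ is diagonal and the core matrix is only k × k, this route remains well defined when p > n, which is exactly the semi-high-dimensional regime the paper addresses.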



Acknowledgments

The authors would like to thank Dr. Dylan Molenaar for his careful review of the manuscript and also thank Rachelle Podhorzer for clerical assistance. This work was supported by Grant 31971029 from the National Natural Science Foundation of China.

Author information

Correspondence to Kentaro Hayashi.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Hayashi, K., Yuan, KH., Jiang, G. (2020). On the Precision Matrix in Semi-High-Dimensional Settings. In: Wiberg, M., Molenaar, D., González, J., Böckenholt, U., Kim, JS. (eds) Quantitative Psychology. IMPS 2019. Springer Proceedings in Mathematics & Statistics, vol 322. Springer, Cham. https://doi.org/10.1007/978-3-030-43469-4_15
