Abstract
Improving the prediction performance of multiple-response regression relative to fitting separate linear regressions is a challenging problem. On the one hand, it is desirable to seek model parsimony when facing a large number of parameters. On the other hand, for certain applications it is necessary to take into account the general covariance structure of the regression errors. We assume a reduced-rank regression model and work with the likelihood function with a general error covariance to achieve both objectives. In addition, we propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty, and to estimate the error covariance matrix simultaneously by using a similar penalty on the precision matrix. We develop a numerical algorithm to solve the penalized regression problem. In a simulation study and a real data analysis, the new method is compared with two recent methods for multivariate regression and exhibits competitive performance in prediction and variable selection.
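In our own notation (a sketch of the type of criterion the abstract describes, not the paper's exact formulation): with \(\mathbf{Y}\) the \(n \times q\) response matrix, \(\mathbf{X}\) the \(n \times p\) predictor matrix, \(\mathbf{C}\) the coefficient matrix with rows \(\mathbf{c}_i\), and \(\boldsymbol{\varOmega}\) the error precision matrix, a penalized negative log-likelihood of this form combines a rank constraint, a group-lasso penalty for variable selection, and an \(\ell_1\) penalty on the off-diagonal entries of the precision matrix:

```latex
\min_{\operatorname{rank}(\mathbf{C}) \le r,\; \boldsymbol{\varOmega} \succ 0}
\;\; \frac{1}{n}\operatorname{tr}\!\left[(\mathbf{Y}-\mathbf{X}\mathbf{C})\,
\boldsymbol{\varOmega}\,(\mathbf{Y}-\mathbf{X}\mathbf{C})^{\top}\right]
\;-\; \log\det\boldsymbol{\varOmega}
\;+\; \lambda \sum_{i=1}^{p} \lVert \mathbf{c}_{i} \rVert_{2}
\;+\; \rho \sum_{i \ne j} \lvert \omega_{ij} \rvert
```

Here \(\lambda\) and \(\rho\) are tuning parameters; setting \(\lVert \mathbf{c}_i \rVert_2 = 0\) removes predictor \(i\) from all responses at once, which is what makes the row-wise group penalty a variable-selection device.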

Notes
When we applied the exact MRCE algorithm for \(p \ge n\), the R function mrce often produced a precision matrix estimate with an extremely large determinant (\(>10^{15}\)), although the sparsity pattern of \(\hat{\varOmega}\) was estimated reasonably well.
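The large-determinant symptom described in this note can be checked without numerical overflow by working on the log scale. The sketch below is our own illustrative code, not part of the MRCE package; it uses NumPy's slogdet, which returns the sign and log-determinant and so avoids the overflow a direct determinant call can hit:

```python
import numpy as np

def determinant_blowup(omega_hat, log_det_threshold=np.log(1e15)):
    """Flag a precision-matrix estimate whose determinant exceeds
    the threshold (default 1e15, compared on the log scale)."""
    sign, logdet = np.linalg.slogdet(omega_hat)
    return bool(sign > 0 and logdet > log_det_threshold)

# A well-conditioned precision matrix does not trigger the flag...
ok = np.eye(4)
# ...while one with huge diagonal entries does (det = (1e5)^4 = 1e20).
bad = np.diag(np.full(4, 1e5))

print(determinant_blowup(ok))   # False
print(determinant_blowup(bad))  # True
```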
References
Breiman, L., Friedman, J.: Predicting multivariate responses in multiple linear regression. J. R. Stat. Soc. B 59, 3–54 (1997)
Buchen, T., Wohlrabe, K.: Forecasting with many predictors: is boosting a viable alternative? Econ. Lett. 113(1), 16–18 (2011)
Bunea, F., She, Y., Wegkamp, M.H.: Optimal selection of reduced rank estimators of high-dimensional matrices. Ann. Stat. 39(2), 1282–1309 (2011)
Bunea, F., She, Y., Wegkamp, M.H.: Joint variable and rank selection for parsimonious estimation of high-dimensional matrices. Ann. Stat. 40(5), 2359–2388 (2012)
Chen, K., Chan, K.S., Stenseth, N.C.: Reduced rank stochastic regression with a sparse singular value decomposition. J. R. Stat. Soc. B 74(2), 203–221 (2012)
Chen, K., Dong, H., Chan, K.S.: Reduced rank regression via adaptive nuclear norm penalization. Biometrika 100(4), 901–920 (2013)
Chen, L., Huang, J.Z.: Sparse reduced-rank regression for simultaneous dimension reduction and variable selection. J. Am. Stat. Assoc. 107(500), 1533–1545 (2012)
Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9(3), 432–441 (2008)
Izenman, A.J.: Reduced-rank regression for the multivariate linear model. J. Multivar. Anal. 5, 248–264 (1975)
Izenman, A.J.: Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning. Springer, New York (2008)
Mazumder, R., Hastie, T.: The graphical lasso: new insights and alternatives. Electron. J. Stat. 6, 2125–2149 (2012)
Negahban, S., Wainwright, M.J.: Estimation of (near) low-rank matrices with noise and high-dimensional scaling. Ann. Stat. 39(2), 1069–1097 (2011)
Obozinski, G., Wainwright, M.J., Jordan, M.I.: Support union recovery in high-dimensional multivariate regression. Ann. Stat. 39(1), 1–47 (2011)
Peng, J., Zhu, J., Bergamaschi, A., Han, W., Noh, D., Pollack, J.R., Wang, P.: Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer. Ann. Appl. Stat. 4, 53–77 (2010)
Reinsel, G.C., Velu, R.P.: Multivariate Reduced-Rank Regression, Theory and Applications. Springer, New York (1998)
Rothman, A., Levina, E., Zhu, J.: Sparse multivariate regression with covariance estimation. J. Comput. Gr. Stat. 19(4), 947–962 (2010)
Similä, T., Tikka, J.: Input selection and shrinkage in multiresponse linear regression. Comput. Stat. Data Anal. 52, 406–422 (2007)
Stock, J.H., Watson, M.W.: An empirical comparison of methods for forecasting using many predictors. Manuscript, Princeton University (2005)
Stock, J.H., Watson, M.W.: Forecasting with many predictors. Handbook of Economic Forecasting 1, 515–554 (2006)
Turlach, B., Venables, W., Wright, S.: Simultaneous variable selection. Technometrics 47, 350–363 (2005)
Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. B 68, 49–67 (2006)
Yuan, M., Lin, Y.: Model selection and estimation in the Gaussian graphical model. Biometrika 94(1), 19–35 (2007)
Yuan, M., Ekici, A., Lu, Z., Monteiro, R.: Dimension reduction and coefficient estimation in multivariate linear regression. J. R. Stat. Soc. B 69, 329–346 (2007)
Acknowledgments
Huang’s work was partially supported by NSF grant DMS-1208952 and by Award Numbers KUS-CI-016-04 and GRP-CF-2011-19-P-Gao-Huang, made by King Abdullah University of Science and Technology (KAUST).
Cite this article
Chen, L., Huang, J.Z. Sparse reduced-rank regression with covariance estimation. Stat Comput 26, 461–470 (2016). https://doi.org/10.1007/s11222-014-9517-6
Keywords
- Covariance estimation
- Group lasso
- Reduced-rank regression
- Variable selection