Statistics and Computing, Volume 27, Issue 3, pp 789–804

Structured regularization for conditional Gaussian graphical models

  • Julien Chiquet
  • Tristan Mary-Huard
  • Stéphane Robin


Abstract

Conditional Gaussian graphical models are a reparametrization of the multivariate linear regression model that explicitly exhibits (i) the partial covariances between the predictors and the responses, and (ii) the partial covariances between the responses themselves. Such models are particularly well suited to interpretation, since partial covariances describe direct relationships between variables. In this framework, we propose a regularization scheme that enhances the learning of the model by driving the selection of the relevant input features with prior structural information. It comes with an efficient alternating optimization procedure that is guaranteed to converge to the global minimum. Besides showing competitive performance on artificial and real datasets, our method demonstrates capabilities for fine interpretation, as illustrated on three high-dimensional datasets from spectroscopy, genetics, and genomics.
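For readers unfamiliar with the reparametrization mentioned in the abstract, the standard way to write it (the notation below is ours, not taken from the paper) starts from a jointly Gaussian pair \((X, Y)\) whose precision matrix is partitioned into blocks \(\Omega_{xx}\), \(\Omega_{xy}\), \(\Omega_{yy}\):

```latex
% Conditioning a joint Gaussian on X = x gives a Gaussian whose mean is
% linear in x and whose covariance is the inverse of the Y-block of the
% precision matrix:
\begin{aligned}
Y \mid X = x \;&\sim\; \mathcal{N}\!\left(-\Omega_{yy}^{-1}\,\Omega_{xy}^\top x,\;
  \Omega_{yy}^{-1}\right),\\
% so the multivariate regression coefficients B and the residual
% covariance R of the model Y = B^T X + eps, eps ~ N(0, R), read
B \;&=\; -\,\Omega_{xy}\,\Omega_{yy}^{-1},
\qquad
R \;=\; \Omega_{yy}^{-1}.
\end{aligned}
```

In this parametrization the entries of \(\Omega_{xy}\) are (up to scaling) the partial covariances between predictors and responses, and \(\Omega_{yy}\) encodes the partial covariances among the responses, which is why sparsity penalties placed directly on these blocks yield directly interpretable selections.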


Keywords

Multivariate regression · Regularization · Sparsity · Conditional Gaussian graphical model · Structured elastic net · Regulatory motif · QTL study · Spectroscopy



Acknowledgements

We would like to thank Mathieu Lajoie and Laurent Bréhélin for kindly sharing the dataset from Gasch et al. (2000). We also thank the reviewers for their questions and remarks, which helped us improve the manuscript. This project was conducted within the framework of the AMAIZING project, funded by the French ANR. This work was partially supported by the grant Reg4Sel from the French INRA-SelGen metaprogram.


References

  1. Bach, F., Jenatton, R., Mairal, J., Obozinski, G.: Optimization with sparsity-inducing penalties. Found. Trends Mach. Learn. 4(1), 1–106 (2012)
  2. Brown, P., Vannucci, M., Fearn, T.: Multivariate Bayesian variable selection and prediction. J. R. Stat. Soc. B 60(3), 627–641 (1998)
  3. Brown, P., Fearn, T., Vannucci, M.: Bayesian wavelet regression on curves with applications to a spectroscopic calibration problem. J. Am. Stat. Assoc. 96, 398–408 (2001)
  4. Chiquet, J., Grandvalet, Y., Ambroise, C.: Inferring multiple graphical structures. Stat. Comput. 21(4), 537–553 (2011)
  5. de los Campos, G., Hickey, J., Pong-Wong, R., Daetwyler, H., Calus, M.: Whole genome regression and prediction methods applied to plant and animal breeding. Genetics 193(2), 327–345 (2012)
  6. Efron, B.: The estimation of prediction error: covariance penalties and cross-validation (with discussion). J. Am. Stat. Assoc. 99, 619–642 (2004)
  7. Ferreira, M., Satagopan, J., Yandell, B., Williams, P., Osborn, T.: Mapping loci controlling vernalization requirement and flowering time in Brassica napus. Theor. Appl. Genet. 90, 727–732 (1995)
  8. Friedman, J., Hastie, T., Tibshirani, R.: Regularization paths for generalized linear models via coordinate descent. J. Stat. Softw. 33, 1–22 (2010)
  9. Gasch, A., Spellman, P., Kao, C., Carmel-Harel, O., Eisen, M.B., Storz, G., Botstein, D., Brown, P.: Genomic expression programs in the response of yeast cells to environmental changes. Mol. Biol. Cell 11(12), 4241–4257 (2000)
  10. Hans, C.: Elastic net regression modeling with the orthant normal prior. J. Am. Stat. Assoc. 106, 1383–1393 (2011)
  11. Harville, D.: Matrix Algebra from a Statistician's Perspective. Springer, New York (1997)
  12. Hebiri, M., van de Geer, S.: The smooth-lasso and other \(\ell_1 + \ell_2\)-penalized methods. Electron. J. Stat. 5, 1184–1226 (2011)
  13. Hesterberg, T., Choi, N.M., Meier, L., Fraley, C.: Least angle and \(\ell_1\) penalized regression: a review. Stat. Surv. 2, 61–93 (2008)
  14. Hoefling, H.: A path algorithm for the fused lasso signal approximator. J. Comput. Graph. Stat. 19(4), 984–1006 (2010)
  15. Kim, S., Xing, E.: Statistical estimation of correlated genome associations to a quantitative trait network. PLoS Genet. 5(8), e1000587 (2009)
  16. Kim, S., Xing, E.: Tree-guided group lasso for multi-task regression with structured sparsity. In: Proceedings of the 27th International Conference on Machine Learning, pp. 543–550 (2010)
  17. Kim, S.J., Koh, K., Boyd, S., Gorinevsky, D.: \(\ell_1\) trend filtering. SIAM Rev. 51(2), 339–360 (2009)
  18. Kole, C., Thorman, C., Karlsson, B., Palta, J., Gaffney, P., Yandell, B., Osborn, T.: Comparative mapping of loci controlling winter survival and related traits in oilseed Brassica rapa and B. napus. Mol. Breed. 1, 201–210 (2002)
  19. Krishna, A., Bondell, H., Ghosh, S.: Bayesian variable selection using an adaptive powered correlation prior. J. Stat. Plan. Inference 139(8), 2665–2674 (2009)
  20. Lajoie, M., Gascuel, O., Lefort, V., Brehelin, L.: Computational discovery of regulatory elements in a continuous expression space. Genome Biol. 13(11), R109 (2012). doi:10.1186/gb-2012-13-11-r109
  21. Li, C., Li, H.: Variable selection and regression analysis for graph-structured covariates with an application to genomics. Ann. Appl. Stat. 4(3), 1498–1516 (2010)
  22. Li, X., Panea, C., Wiggins, C., Reinke, V., Leslie, C.: Learning "graph-mer" motifs that predict gene expression trajectories in development. PLoS Comput. Biol. 6(4), e1000761 (2010)
  23. Lorbert, A., Eis, D., Kostina, V., Blei, D., Ramadge, P.: Exploiting covariate similarity in sparse regression via the pairwise elastic net. In: Teh, Y.W., Titterington, D.M. (eds.) Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS-10), vol. 9, pp. 477–484 (2010)
  24. Mardia, K., Kent, J., Bibby, J.: Multivariate Analysis. Academic Press, London (1979)
  25. Marin, J.M., Robert, C.P.: Bayesian Core: A Practical Approach to Computational Bayesian Statistics. Springer, New York (2007)
  26. Obozinski, G., Wainwright, M., Jordan, M.: Support union recovery in high-dimensional multivariate regression. Ann. Stat. 39(1), 1–47 (2011)
  27. Osborne, B., Fearn, T., Miller, A., Douglas, S.: Application of near infrared reflectance spectroscopy to compositional analysis of biscuits and biscuit doughs. J. Sci. Food Agric. 35, 99–105 (1984)
  28. Osborne, M.R., Presnell, B., Turlach, B.A.: On the lasso and its dual. J. Comput. Graph. Stat. 9(2), 319–337 (2000)
  29. Park, T., Casella, G.: The Bayesian lasso. J. Am. Stat. Assoc. 103, 681–686 (2008)
  30. Rapaport, F., Zinovyev, A., Dutreix, M., Barillot, E., Vert, J.P.: Classification of microarray data using gene networks. BMC Bioinform. 8, 35 (2007)
  31. Rothman, A., Levina, E., Zhu, J.: Sparse multivariate regression with covariance estimation. J. Comput. Graph. Stat. 19(4), 947–962 (2010)
  32. Shannon, P.: MotifDb: An Annotated Collection of Protein-DNA Binding Sequence Motifs. R package version 1.4.0 (2013)
  33. Slawski, M., zu Castell, W., Tutz, G.: Feature selection guided by structural information. Ann. Appl. Stat. 4, 1056–1080 (2010)
  34. Sohn, K., Kim, S.: Joint estimation of structured sparsity and output structure in multiple-output regression via inverse-covariance regularization. JMLR W&CP 22, 1081–1089 (2012)
  35. Städler, N., Bühlmann, P., van de Geer, S.: \(\ell_1\)-penalization for mixture regression models. Test 19(2), 209–256 (2010). doi:10.1007/s11749-010-0197-z
  36. Stein, C.: Estimation of the mean of a multivariate normal distribution. Ann. Stat. 9, 1135–1151 (1981)
  37. Tibshirani, R.: Regression shrinkage and selection via the lasso. J. R. Stat. Soc. B 58, 267–288 (1996)
  38. Tibshirani, R., Taylor, J.: The solution path of the generalized lasso. Ann. Stat. 39(3), 1335–1371 (2011). doi:10.1214/11-AOS878
  39. Tibshirani, R., Taylor, J.: Degrees of freedom in lasso problems. Ann. Stat. 40, 1198–1232 (2012)
  40. Tibshirani, R., Saunders, M., Rosset, S., Zhu, J., Knight, K.: Sparsity and smoothness via the fused lasso. J. R. Stat. Soc. B 67, 91–108 (2005)
  41. Tseng, P.: Convergence of a block coordinate descent method for nondifferentiable minimization. J. Optim. Theory Appl. 109(3), 475–494 (2001)
  42. Tseng, P., Yun, S.: A coordinate gradient descent method for nonsmooth separable minimization. Math. Program. 117, 387–423 (2009)
  43. Yin, J., Li, H.: A sparse conditional Gaussian graphical model for analysis of genetical genomics data. Ann. Appl. Stat. 5, 2630–2650 (2011)
  44. Yuan, X.T., Zhang, T.: Partial Gaussian graphical model estimation. IEEE Trans. Inform. Theory 60(3), 1673–1687 (2014)
  45. Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. J. R. Stat. Soc. B 67, 301–320 (2005)

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Julien Chiquet (1)
  • Tristan Mary-Huard (2)
  • Stéphane Robin (3)
  1. LaMME, UMR 8071 CNRS/Université d'Évry-Val-d'Essonne, Boulevard de France, France
  2. UMR de Génétique Végétale du Moulon, INRA/Univ. Paris Sud/CNRS, Ferme du Moulon, France
  3. UMR 518 AgroParisTech/INRA, Paris, France
