Selecting the tuning parameter in penalized Gaussian graphical models
Penalized inference of Gaussian graphical models is a way to assess the conditional independence structure in multivariate problems. In this setting, the conditional independence structure, which corresponds to a graph, is governed by the choice of the tuning parameter, which in turn determines the model complexity, or degrees of freedom. There has been little research on the degrees of freedom of penalized Gaussian graphical models. In this paper, we propose an estimator of the degrees of freedom in \(\ell _1\)-penalized Gaussian graphical models. Specifically, we derive an estimator inspired by the generalized information criterion and propose to use it as the bias term of two information criteria, which we call GAIC and GBIC. These selectors can be used to choose the tuning parameter: the optimal tuning parameter is the minimizer of GAIC or GBIC. A simulation study shows that GAIC tends to outperform both AIC-type and CV-type model selectors in terms of estimation quality (entropy loss function) in high-dimensional settings. Moreover, GBIC outperforms both BIC-type and CV-type model selectors in terms of support recovery (F-score). A data analysis shows that GBIC selects a tuning parameter that produces a sparser graph than BIC and a CV-type model selector (KLCV).
Keywords: Penalized likelihood · Kullback–Leibler divergence · Model complexity · Model selection · Generalized information criterion
The project was partially supported by the “European Cooperation in Science & Technology” (COST) funding: action number CA15109.
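The selection procedure described in the abstract can be sketched generically: fit an \(\ell _1\)-penalized Gaussian graphical model (graphical lasso) over a grid of tuning parameters, evaluate an information criterion whose bias term depends on a degrees-of-freedom estimate, and take the minimizer. The sketch below, a minimal illustration and not the paper's method, uses a common BIC-type criterion in which the degrees of freedom are approximated by the number of nonzero off-diagonal entries of the estimated precision matrix; the simulated data and the use of scikit-learn's `GraphicalLasso` are assumptions for illustration only.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Simulated data: n observations of p variables (illustrative only).
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
S = np.cov(X, rowvar=False)  # sample covariance matrix

def bic_type(theta, S, n):
    """BIC-type criterion for a precision-matrix estimate theta.

    The degrees of freedom are approximated by the number of nonzero
    off-diagonal entries (a standard surrogate, not the GAIC/GBIC
    estimator derived in the paper).
    """
    df = np.count_nonzero(theta[np.triu_indices_from(theta, k=1)])
    _, logdet = np.linalg.slogdet(theta)
    # Gaussian log-likelihood up to an additive constant.
    loglik = (n / 2) * (logdet - np.trace(S @ theta))
    return -2 * loglik + np.log(n) * df

# Evaluate the criterion over a grid of tuning parameters and
# select the minimizer.
alphas = np.logspace(-1.5, 0, 8)
scores = []
for a in alphas:
    model = GraphicalLasso(alpha=a, max_iter=200).fit(X)
    scores.append(bic_type(model.precision_, S, n))
best_alpha = float(alphas[int(np.argmin(scores))])
```

The same loop accommodates any criterion of the form "goodness of fit plus complexity penalty": replacing the `np.log(n) * df` term with `2 * df` gives an AIC-type selector, and substituting a different degrees-of-freedom estimator changes only the `df` line.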