Journal of the European Mathematical Society, Volume 3, Issue 3, pp 203–268

Gaussian model selection

  • Lucien Birgé
  • Pascal Massart

DOI: 10.1007/s100970100031

Cite this article as:
Birgé, L. & Massart, P. J. Eur. Math. Soc. (2001) 3: 203. doi:10.1007/s100970100031

Abstract.

Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view on this subject. The advantage and importance of model selection come from the fact that it offers a suitable approach to many different types of problems, ranging from model selection per se (deciding which of a family of parametric models is most suitable for the data at hand), which includes for instance variable selection in regression models, to nonparametric estimation, for which it provides a very powerful tool allowing adaptation under quite general circumstances. Our approach to model selection also provides a natural connection between the parametric and nonparametric points of view and copes naturally with the fact that a model is not necessarily true. The method is based on penalizing a least squares criterion, which can be viewed as a generalization of Mallows’ Cp. A large part of our effort is devoted to choosing properly the list of models and the penalty function for various estimation problems such as classical variable selection or adaptive estimation over various types of ℓp-bodies.
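To illustrate the kind of procedure the abstract describes, here is a minimal sketch of penalized least-squares variable selection over a nested family of linear models, with the penalty 2σ²D (D the model dimension) that recovers Mallows' Cp when σ is known. The simulated data, the nested model family, and the constant 2 are illustrative assumptions, not taken from the paper, which studies far more general model lists and penalties.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated Gaussian regression: only the first 3 of 10 candidate
# variables enter the true mean (an illustrative setup).
n, p_max, sigma = 200, 10, 1.0
X = rng.standard_normal((n, p_max))
beta_true = np.array([3.0, -2.0, 1.5] + [0.0] * (p_max - 3))
y = X @ beta_true + sigma * rng.standard_normal(n)

def penalized_criterion(y, X_m, sigma, pen_const=2.0):
    """Residual sum of squares plus a dimension penalty.

    With pen_const = 2 and known sigma this is Mallows' Cp up to an
    additive constant; the paper analyzes more general penalties.
    """
    beta_hat, *_ = np.linalg.lstsq(X_m, y, rcond=None)
    rss = np.sum((y - X_m @ beta_hat) ** 2)
    D = X_m.shape[1]  # model dimension
    return rss + pen_const * sigma**2 * D

# Compare the nested models spanned by the first D columns
# and keep the minimizer of the penalized criterion.
crit = [penalized_criterion(y, X[:, :D], sigma) for D in range(1, p_max + 1)]
best_D = 1 + int(np.argmin(crit))
print("selected model dimension:", best_D)
```

With a strong signal on the first three coordinates, the criterion drops sharply up to D = 3 and the penalty then discourages the spurious extra variables, which is the trade-off the penalized criterion is designed to strike.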

Mathematics Subject Classification (1991): 62G07, 62C20, 41A46 

Copyright information

© Springer-Verlag Berlin Heidelberg & EMS 2001

Authors and Affiliations

  • Lucien Birgé
    • 1
  • Pascal Massart
    • 2
  1. UMR 7599 “Probabilités et modèles aléatoires”, Laboratoire de Probabilités, boîte 188, Université Paris VI, 4 Place Jussieu, 75252 Paris Cedex 05, France, e-mail: LB@CCR.JUSSIEU.FR
  2. UMR 8628 “Laboratoire de Mathématiques”, Bât. 425, Université Paris Sud, Campus d’Orsay, 91405 Orsay Cedex, France, e-mail: PASCAL.MASSART@MATH.U-PSUD.FR
