Probability Theory and Related Fields, Volume 97, Issue 1, pp 113–150

Rates of convergence for minimum contrast estimators

  • Lucien Birgé
  • Pascal Massart

DOI: 10.1007/BF01199316

Cite this article as:
Birgé, L. & Massart, P. Probab. Th. Rel. Fields (1993) 97: 113. doi:10.1007/BF01199316

Summary

We shall present here a general study of minimum contrast estimators in a nonparametric setting (although our results are also valid in the classical parametric case) for independent observations. These estimators include many of the most popular estimators in various situations, such as maximum likelihood estimators, least squares and other estimators of the regression function, and estimators for mixture models or deconvolution, among others. The main theorem relates the rate of convergence of those estimators to the entropy structure of the space of parameters. Optimal rates depending on entropy conditions are already known, at least for some of the models involved, and they agree with what we get for minimum contrast estimators as long as the entropies are not too large. But, under some circumstances (“large” entropies or changes in the entropy structure due to local perturbations), the resulting rates are only suboptimal. Counterexamples are constructed which show that this phenomenon is real for nonparametric maximum likelihood or regression estimation. This proves that, under purely metric assumptions, our theorem is optimal and that minimum contrast estimators can indeed be suboptimal.
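For readers unfamiliar with the terminology, the following is a schematic version of the objects involved, as standardly formulated in this literature; it is not necessarily the paper's exact statement, whose conditions are more refined. Given independent observations $X_1, \dots, X_n$ and a contrast function $\gamma$, a minimum contrast estimator over a parameter set $S$ is

\[
\hat{s}_n \in \operatorname*{argmin}_{t \in S} \; \frac{1}{n} \sum_{i=1}^{n} \gamma(t, X_i),
\]

where, for instance, $\gamma(t, x) = -\log t(x)$ recovers the maximum likelihood estimator and $\gamma(t, (x, y)) = (y - t(x))^2$ the least squares regression estimator. The entropy-driven rate $\varepsilon_n$ alluded to above is typically obtained by balancing the metric entropy of the parameter set against the sample size, schematically

\[
H(\varepsilon_n, S) \asymp n \, \varepsilon_n^2,
\]

with $H(\varepsilon, S)$ denoting the logarithm of the number of $\varepsilon$-balls needed to cover $S$.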

Mathematics Subject Classification (1980)

62G05, 62J02

Copyright information

© Springer-Verlag 1993

Authors and Affiliations

  • Lucien Birgé (1)
  • Pascal Massart (2)
  1. URA 1321 “Statistique et modèles aléatoires”, Université Paris VI, 45-55, 3e étage, boîte 158, Paris Cedex 05, France
  2. URA 743 “Modélisation stochastique et Statistique”, Université Paris Sud, Orsay Cedex, France