Constrained M-Estimation for Regression

  • Beatriz Mendes
  • David E. Tyler
Part of the Lecture Notes in Statistics book series (LNS, volume 109)

Abstract

When using redescending M-estimates of regression, one must not only choose an estimate of scale; because the redescending M-estimating equations may admit multiple solutions, not all of which are desirable, one must also have a method for selecting an appropriate solution to the estimating equations. We introduce here a new approach for properly scaling redescending M-estimating equations and for obtaining high breakdown point solutions to these equations: the constrained M-estimates of regression, or CM-estimates of regression for short. Unlike the S-estimates of regression, the CM-estimates of regression can be tuned to obtain good local robustness properties while maintaining a breakdown point of 1/2.
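The chapter's formal definition of the CM-estimates is not reproduced in this abstract. As a rough illustration of the idea only, the sketch below minimizes a scale-penalized redescending M-objective subject to a constraint that prevents the scale estimate from collapsing, which is the mechanism that ties such estimates to a high breakdown point. The particular objective and constraint, the constraint level eps, the rescaled Tukey biweight rho_biweight, and the helper cm_fit are illustrative assumptions, not the authors' definitions.

import numpy as np
from scipy.optimize import minimize

def rho_biweight(u, k=4.685):
    # Tukey's biweight rho, rescaled so that 0 <= rho <= 1 (bounded rho,
    # hence a redescending psi = rho').
    v = np.clip(np.abs(u) / k, 0.0, 1.0)
    return 1.0 - (1.0 - v**2) ** 3

def cm_fit(X, y, eps=0.5, k=4.685):
    # Hypothetical CM-type fit: minimize mean(rho(r/sigma)) + eps*log(sigma)
    # subject to mean(rho(r/sigma)) <= eps, so that sigma cannot shrink onto
    # a small subset of the data.  This mirrors the "constrained M" idea only
    # in spirit; the chapter's exact criterion may differ.
    n, p = X.shape
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]            # crude starting fit
    sigma0 = np.median(np.abs(y - X @ beta0)) / 0.6745 + 1e-8

    def unpack(theta):
        return theta[:p], np.exp(theta[p])                  # log-scale keeps sigma > 0

    def objective(theta):
        beta, sigma = unpack(theta)
        r = (y - X @ beta) / sigma
        return np.mean(rho_biweight(r, k)) + eps * np.log(sigma)

    def constraint(theta):                                  # SLSQP requires g(theta) >= 0
        beta, sigma = unpack(theta)
        r = (y - X @ beta) / sigma
        return eps - np.mean(rho_biweight(r, k))

    theta0 = np.concatenate([beta0, [np.log(sigma0)]])
    res = minimize(objective, theta0, method="SLSQP",
                   constraints=[{"type": "ineq", "fun": constraint}])
    beta_hat, sigma_hat = unpack(res.x)
    return beta_hat, sigma_hat

Because a bounded rho yields multiple local minima, in practice the optimizer would be started from a robust initial fit rather than least squares; the least-squares start above is for brevity only.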

Keywords and phrases

Breakdown point; M-estimates; robust estimation; S-estimates

AMS 1991 subject classifications

Primary 62F35; secondary 62J05

Copyright information

© Springer-Verlag New York, Inc. 1996

Authors and Affiliations

  • Beatriz Mendes 1
  • David E. Tyler 1

  1. Rutgers University, New Brunswick, NJ, USA
