Computation of the Minimum Covariance Determinant Estimator

  • Christoph Pesch
Conference paper
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)


Robust estimation of location and scale in the presence of outliers is an important task in classification, since outlier-sensitive estimation leads to a large number of misclassifications. Rousseeuw introduced two estimators with a high breakdown point: the minimum volume ellipsoid estimator (MVE) and the minimum covariance determinant estimator (MCD). Although the MCD has better theoretical properties than the MVE, the latter appears to be more widely used. This may be due to the lack, until now, of fast algorithms for computing the MCD.
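To make the objective concrete: the MCD estimator selects, among all h-point subsets of an n-point sample (with h > n/2 for a high breakdown point), the subset whose sample covariance matrix has minimum determinant; its mean and covariance are the robust location and scatter estimates. A brute-force sketch of this definition (feasible only for tiny n, and not one of the paper's algorithms) might look like:

```python
# Sketch of the MCD objective, not the paper's branch-and-bound method:
# among all h-point subsets, find the one whose covariance determinant
# is minimal.  Exhaustive search is only feasible for very small n.
from itertools import combinations
import numpy as np

def mcd_exact(X, h):
    """Exhaustive MCD: return (location, scatter, det) of the best h-subset."""
    best = None
    for idx in combinations(range(len(X)), h):
        sub = X[list(idx)]
        S = np.cov(sub, rowvar=False)
        d = np.linalg.det(S)
        if best is None or d < best[2]:
            best = (sub.mean(axis=0), S, d)
    return best

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 2))
X[:2] += 10.0                        # plant two gross outliers
loc, scat, det = mcd_exact(X, h=7)   # h > n/2 gives high breakdown point
```

Because the determinant-minimizing subset clusters tightly, the planted outliers are excluded and the location estimate stays near the bulk of the data; the combinatorial cost of this search is exactly what the paper's branch-and-bound algorithms are designed to prune.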

In this paper, two branch-and-bound algorithms for the exact computation of the MCD are presented. The results of applying them to simulated samples are compared with those of a new heuristic algorithm, "multistart iterative trimming", and of the steepest descent method suggested by Hawkins. The results show that multistart iterative trimming is a good and very fast heuristic for the MCD that can be applied to samples of large size.
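The multistart iterative trimming routine itself is defined in the paper; a plausible minimal sketch, assuming it follows the familiar concentration-step idea (start from a random h-subset, then repeatedly refit the mean and covariance and keep the h points with smallest Mahalanobis distance, restarting several times and keeping the best result):

```python
# Hedged sketch of a multistart iterative-trimming heuristic for the MCD.
# The exact routine is given in the paper; this version iterates the
# common "refit and keep the h closest points" step from random starts.
import numpy as np

def trim_once(X, h, idx):
    """One trimming step: fit on idx, return the h closest points."""
    sub = X[idx]
    mu = sub.mean(axis=0)
    Sinv = np.linalg.pinv(np.cov(sub, rowvar=False))
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, Sinv, diff)  # squared Mahalanobis
    return np.argsort(d2)[:h]

def multistart_iterative_trimming(X, h, starts=20, iters=30, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    best_idx, best_det = None, np.inf
    for _ in range(starts):                     # multistart: random subsets
        idx = rng.choice(n, size=h, replace=False)
        for _ in range(iters):                  # trim until the subset is stable
            new = trim_once(X, h, idx)
            if set(new) == set(idx):
                break
            idx = new
        det = np.linalg.det(np.cov(X[idx], rowvar=False))
        if det < best_det:
            best_idx, best_det = idx, det
    return X[best_idx].mean(axis=0), np.cov(X[best_idx], rowvar=False)

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
X[:5] += 8.0                                    # contaminate 5 of 40 points
loc, scat = multistart_iterative_trimming(X, h=25)
```

Each trimming step can only decrease (or keep) the covariance determinant of the selected subset, so each start converges quickly; the multistart wrapper guards against a single start settling on a poor local optimum, which matches the abstract's claim that the heuristic is very fast and scales to large samples.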






  1. BUTLER, R. W., DAVIES, P. L., and JHUN, M. (1993): Asymptotics for the Minimum Covariance Determinant Estimator. The Annals of Statistics, 21, No. 3, 1385–1400.
  2. DAVIES, L. (1992): The Asymptotics of Rousseeuw's Minimum Volume Ellipsoid Estimator. The Annals of Statistics, 20, No. 4, 1828–1843.
  3. GNANADESIKAN, R. and KETTENRING, J. R. (1972): Robust Estimates, Residuals, and Outlier Detection with Multiresponse Data. Biometrics, 28, 81–124.
  4. HAWKINS, D. M. (1994): The feasible solution algorithm for the minimum covariance determinant estimator in multivariate data. Computational Statistics & Data Analysis, 17, 197–210.
  5. PESCH, C. (1998): Fast Computation of the Minimum Covariance Determinant Estimator. MIP-9806, Technical Report, Fakultät für Mathematik und Informatik, Universität Passau, 94030 Passau, Germany.
  6. PREPARATA, F. P. and SHAMOS, M. I. (1988): Computational Geometry. Springer-Verlag, New York.
  7. ROUSSEEUW, P. J. (1983): Multivariate Estimation with High Breakdown Point. In Mathematical Statistics and Applications: Proc. 4th Pannonian Symp. Math. Stat., Bad Tatzmannsdorf, Austria, 283–297.
  8. ROUSSEEUW, P. J. and VAN DRIESSEN, K. (1997): A Fast Algorithm for the Minimum Covariance Determinant Estimator. Technical Report, University of Antwerp. Submitted for publication.
  9. ROUSSEEUW, P. J. and LEROY, A. M. (1987): Robust Regression and Outlier Detection. John Wiley & Sons, Inc.

Copyright information

© Springer-Verlag Berlin · Heidelberg 1999

Authors and Affiliations

  • Christoph Pesch
    Fakultät für Mathematik und Informatik, Universität Passau, Germany
