Pattern Analysis & Applications

Volume 5, Issue 2, pp 121–135

Bagging, Boosting and the Random Subspace Method for Linear Classifiers

  • Marina Skurichina
  • Robert P. W. Duin

DOI: 10.1007/s100440200011

Cite this article as:
Skurichina, M. & Duin, R. Pattern Anal Appl (2002) 5: 121. doi:10.1007/s100440200011


Abstract

Bagging, boosting and the random subspace method have recently become popular combining techniques for improving weak classifiers. These techniques are designed for, and usually applied to, decision trees. In this paper, contrary to common opinion, we demonstrate that they may also be useful in linear discriminant analysis. Simulation studies, carried out on several artificial and real data sets, show that the performance of the combining techniques is strongly affected by the small-sample-size properties of the base classifier: boosting is useful for large training sample sizes, while bagging and the random subspace method are useful for critical training sample sizes. Finally, a table summarising the possible usefulness of the combining techniques for linear classifiers is presented.
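To make the two combining techniques that the abstract highlights for critical sample sizes concrete, the following is a minimal sketch of bagging and the random subspace method around a simple linear base classifier. It is not the authors' experimental setup: the nearest-mean base classifier, the Gaussian toy data, and all parameter values (number of base classifiers, subspace fraction) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_mean_fit(X, y):
    # A simple linear base classifier: store the mean of each class.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_mean_predict(model, X):
    classes = sorted(model)
    # Distance of each test point to each class mean; pick the nearest.
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

def majority_vote(votes):
    # votes: (n_estimators, n_samples) integer labels -> per-sample majority.
    return np.array([np.bincount(col).argmax() for col in votes.T])

def bagging(X, y, X_test, n_estimators=25):
    # Bagging: each base classifier is trained on a bootstrap sample.
    n = len(X)
    votes = []
    for _ in range(n_estimators):
        idx = rng.integers(0, n, n)  # sample n points with replacement
        model = nearest_mean_fit(X[idx], y[idx])
        votes.append(nearest_mean_predict(model, X_test))
    return majority_vote(np.stack(votes))

def random_subspace(X, y, X_test, n_estimators=25, subspace=0.5):
    # RSM: each base classifier sees only a random subset of the features.
    p = X.shape[1]
    k = max(1, int(subspace * p))
    votes = []
    for _ in range(n_estimators):
        feats = rng.choice(p, k, replace=False)
        model = nearest_mean_fit(X[:, feats], y)
        votes.append(nearest_mean_predict(model, X_test[:, feats]))
    return majority_vote(np.stack(votes))

# Illustrative data: two Gaussian classes in 20 dimensions, a small
# training set (the "critical sample size" regime the paper discusses).
d, n = 20, 30
X = np.vstack([rng.normal(0.0, 1.0, (n, d)), rng.normal(0.6, 1.0, (n, d))])
y = np.array([0] * n + [1] * n)
Xt = np.vstack([rng.normal(0.0, 1.0, (200, d)), rng.normal(0.6, 1.0, (200, d))])
yt = np.array([0] * 200 + [1] * 200)

print("bagging accuracy:", (bagging(X, y, Xt) == yt).mean())
print("RSM accuracy:   ", (random_subspace(X, y, Xt) == yt).mean())
```

Boosting is omitted from the sketch because it additionally requires the base classifier to accept per-sample weights, which the toy nearest-mean classifier above does not.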

Key words: Bagging; Boosting; Combining classifiers; Linear classifiers; Random subspaces; Training sample size 

Copyright information

© Springer-Verlag London Limited 2002

Authors and Affiliations

  • Marina Skurichina
    • 1
  • Robert P. W. Duin
    • 1
  1. Pattern Recognition Group, Department of Applied Physics, Faculty of Applied Sciences, Delft University of Technology, Delft, The Netherlands
