Machine Learning

Volume 24, Issue 2, pp 123–140

Bagging predictors

  • Leo Breiman

DOI: 10.1007/BF00058655

Cite this article as:
Breiman, L. Mach Learn (1996) 24: 123. doi:10.1007/BF00058655

Abstract

Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor. The aggregation averages over the versions when predicting a numerical outcome and does a plurality vote when predicting a class. The multiple versions are formed by making bootstrap replicates of the learning set and using these as new learning sets. Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy. The vital element is the instability of the prediction method. If perturbing the learning set can cause significant changes in the predictor constructed, then bagging can improve accuracy.
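Below is a minimal Python sketch of the procedure the abstract describes: draw bootstrap replicates of the learning set, fit one predictor per replicate, and aggregate by plurality vote for classification (averaging would replace the vote for a numerical outcome). The function names (bagged_fit, bagged_predict), the use of scikit-learn's DecisionTreeClassifier as the unstable base predictor, and the assumption of integer class labels are illustrative choices for this sketch, not details from the paper.

```python
# Sketch of bagging: bootstrap the learning set, build one predictor per
# replicate, then aggregate the versions by plurality vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagged_fit(X, y, n_replicates=50, seed=0):
    """Fit one classification tree on each bootstrap replicate of (X, y)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_replicates):
        idx = rng.integers(0, n, size=n)  # bootstrap sample, drawn with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagged_predict(models, X):
    """Aggregate the multiple versions by plurality vote (classification case)."""
    votes = np.stack([m.predict(X) for m in models])  # shape: (n_replicates, n_samples)
    # Assumes class labels are small non-negative integers so bincount applies.
    return np.array([np.bincount(col).argmax() for col in votes.T])
```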

Keywords

Aggregation · Bootstrap · Averaging · Combining

Copyright information

© Kluwer Academic Publishers 1996

Authors and Affiliations

  • Leo Breiman
    1. Statistics Department, University of California, Berkeley