Artificial Intelligence Review, Volume 35, Issue 3, pp 223–240

Combining bagging, boosting, rotation forest and random subspace methods

DOI: 10.1007/s10462-010-9192-8

Cite this article as:
Kotsiantis, S. Artif Intell Rev (2011) 35: 223. doi:10.1007/s10462-010-9192-8

Abstract

Bagging, boosting, rotation forest and random subspace methods are well-known re-sampling ensemble methods that generate and combine a diverse set of learners using the same learning algorithm for the base-classifiers. Boosting and rotation forest algorithms are considered stronger than bagging and random subspace methods on noise-free data. However, there are strong empirical indications that bagging and random subspace methods are much more robust than boosting and rotation forest in noisy settings. For this reason, in this work we build an ensemble that combines bagging, boosting, rotation forest and random subspace sub-ensembles, each containing six sub-classifiers, and uses a voting methodology for the final prediction. We compared the proposed technique with plain bagging, boosting, rotation forest and random subspace ensembles of 25 sub-classifiers each, as well as with other well-known combining methods, on standard benchmark datasets; the proposed technique achieved better accuracy in most cases.
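
The combining scheme described above can be sketched in Python with scikit-learn; this is a minimal illustration, not the paper's implementation. Rotation forest has no scikit-learn implementation, so a plain bagging ensemble stands in for that component here, and the choice of decision trees as the base classifier, the 0.5 feature fraction for the random subspace method, and all variable names are assumptions made for the sketch.

from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.tree import DecisionTreeClassifier

# Four sub-ensembles with six sub-classifiers each, as in the proposed method.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=6)
boosting = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1), n_estimators=6)
# Random subspace method: each learner sees a random subset of the features,
# with no bootstrap re-sampling of the training instances.
subspace = BaggingClassifier(DecisionTreeClassifier(), n_estimators=6,
                             bootstrap=False, max_features=0.5)
# Stand-in for rotation forest, which scikit-learn does not provide.
rotation = BaggingClassifier(DecisionTreeClassifier(), n_estimators=6)

# Majority vote over the four sub-ensembles gives the final prediction.
combined = VotingClassifier(
    estimators=[('bag', bagging), ('boost', boosting),
                ('rsm', subspace), ('rot', rotation)],
    voting='hard')
# Usage: combined.fit(X_train, y_train); combined.predict(X_test)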

Keywords

Data mining · Machine learning · Pattern recognition · Ensembles of classifiers

Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  1. Department of Mathematics, University of Patras, Patras, Greece