Pruning Adaptive Boosting Ensembles by Means of a Genetic Algorithm
This work analyzes whether, given a classification ensemble built by AdaBoost, it is possible to find a subensemble with lower generalization error. To solve this task, a genetic algorithm is proposed and compared with other heuristics such as Kappa pruning and reduce-error pruning with backfitting. Experiments carried out over a wide variety of classification problems show that the genetic algorithm performs better than, or at least as well as, the best of those heuristics, and that subensembles with similar and sometimes better prediction accuracy can be obtained.
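The sketch below illustrates, under stated assumptions, how a genetic algorithm can search for a subensemble of an AdaBoost classifier: individuals are binary inclusion masks over the base learners, and fitness is the accuracy of the weighted-vote subensemble on a held-out pruning set. This is not the authors' implementation; the use of scikit-learn's `AdaBoostClassifier` (rather than AdaBoost over C4.5 trees), the synthetic dataset, the pruning-set fitness criterion, and all GA parameters are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of GA-based pruning of an
# AdaBoost ensemble. Assumptions: scikit-learn base learners instead of C4.5,
# synthetic data, fitness = accuracy of the weighted vote on a pruning set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
preds = np.array([est.predict(X_val) for est in ensemble.estimators_])  # (T, n)
weights = ensemble.estimator_weights_                                   # (T,)

def fitness(mask):
    """Pruning-set accuracy of the weighted vote of the selected members."""
    if not mask.any():
        return 0.0
    votes = np.zeros(len(y_val))
    for t in np.flatnonzero(mask):
        votes += weights[t] * np.where(preds[t] == 1, 1.0, -1.0)
    return float(np.mean((votes > 0).astype(int) == y_val))

# Simple generational GA: tournament selection, uniform crossover,
# bit-flip mutation, and elitism. Parameters are illustrative.
pop_size, generations, p_mut = 40, 50, 0.02
population = rng.integers(0, 2, size=(pop_size, len(weights))).astype(bool)

for _ in range(generations):
    scores = np.array([fitness(ind) for ind in population])
    new_pop = [population[scores.argmax()].copy()]           # keep the best mask
    while len(new_pop) < pop_size:
        a, b = rng.integers(0, pop_size, size=2)              # tournament 1
        p1 = population[a] if scores[a] >= scores[b] else population[b]
        a, b = rng.integers(0, pop_size, size=2)              # tournament 2
        p2 = population[a] if scores[a] >= scores[b] else population[b]
        cross = rng.random(len(weights)) < 0.5                # uniform crossover
        child = np.where(cross, p1, p2)
        child ^= rng.random(len(weights)) < p_mut             # bit-flip mutation
        new_pop.append(child)
    population = np.array(new_pop)

best = population[np.array([fitness(ind) for ind in population]).argmax()]
print(f"selected {int(best.sum())} of {len(weights)} members, "
      f"pruning-set accuracy {fitness(best):.3f}")
```

The selected subensemble would then be evaluated on an independent test set to check whether its generalization error improves on the full ensemble, which is the comparison the paper carries out against Kappa pruning and reduce-error pruning with backfitting.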