Pruning Adaptive Boosting Ensembles by Means of a Genetic Algorithm

  • Conference paper
Intelligent Data Engineering and Automated Learning – IDEAL 2006 (IDEAL 2006)

Abstract

This work analyzes whether, given a classification ensemble built by AdaBoost, it is possible to find a subensemble with a lower generalization error. To solve this task, a genetic algorithm is proposed and compared with other pruning heuristics such as Kappa pruning and Reduce-error pruning with backfitting. Experiments carried out over a wide variety of classification problems show that the genetic algorithm behaves better than, or at least as well as, the best of those heuristics, and that subensembles with similar, and sometimes better, prediction accuracy can be obtained.
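
The abstract does not spell out the genetic algorithm's encoding, operators, or fitness function, so the following is only a minimal sketch of the general idea: each candidate subensemble is encoded as a binary inclusion mask over the boosted base learners, and a simple elitist genetic algorithm (one-point crossover, bit-flip mutation) searches for the mask whose weighted-vote subensemble is most accurate on a held-out validation set. All parameter values (population size, number of generations, mutation rate) are illustrative assumptions, and scikit-learn's AdaBoostClassifier is used only as a stand-in ensemble.

    # Minimal sketch (assumption, not the paper's exact procedure): GA-based
    # pruning of an AdaBoost ensemble via binary inclusion masks.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-in data and ensemble (illustrative; the paper uses UCI datasets).
    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
    ensemble = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

    # Precompute each base learner's validation predictions and its boosting weight.
    preds = np.array([est.predict(X_val) for est in ensemble.estimators_])  # (T, n_val)
    alphas = ensemble.estimator_weights_                                    # (T,)
    T, n_val = preds.shape

    def fitness(mask):
        """Validation accuracy of the weighted-vote subensemble selected by mask."""
        if mask.sum() == 0:
            return 0.0
        votes = np.zeros((2, n_val))                      # two-class problem
        for t in np.flatnonzero(mask):
            votes[preds[t].astype(int), np.arange(n_val)] += alphas[t]
        return float(np.mean(votes.argmax(axis=0) == y_val))

    # Elitist GA over binary masks: one-point crossover and bit-flip mutation.
    pop = (rng.random((20, T)) < 0.5).astype(int)
    for _ in range(50):
        pop = pop[np.argsort([fitness(ind) for ind in pop])[::-1]]  # best first
        children = []
        while len(children) < len(pop) - 2:               # keep the top two unchanged
            p1, p2 = pop[rng.integers(0, 10, size=2)]     # parents from the best half
            cut = rng.integers(1, T)                      # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            flip = rng.random(T) < 1.0 / T                # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([pop[:2], np.array(children)])

    best = max(pop, key=fitness)
    print(f"kept {best.sum()}/{T} learners, validation accuracy {fitness(best):.3f}")

Scoring candidates on a held-out set mirrors the way Reduce-error pruning evaluates subensembles; the paper's genetic algorithm may use a different fitness signal, which the abstract does not specify.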


References

  • Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. In: Proc. 2nd European Conference on Computational Learning Theory, pp. 23–37 (1995)

  • Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)

  • Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Chapman & Hall, New York (1984)

  • Quinlan, J.R.: Bagging, boosting, and C4.5. In: Proc. 13th National Conference on Artificial Intelligence, Cambridge, MA, pp. 725–730 (1996)

  • Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: International Conference on Machine Learning, pp. 148–156 (1996)

  • Dietterich, T.G.: An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning 40(2), 139–157 (2000)

  • Breiman, L.: Arcing classifiers. The Annals of Statistics 26(3), 801–849 (1998)

  • Zhou, Z.H., Wu, J., Tang, W.: Ensembling neural networks: Many could be better than all. Artificial Intelligence 137(1-2), 239–263 (2002)

  • Martínez-Muñoz, G., Suárez, A.: Pruning in ordered bagging ensembles. In: Proc. International Conference on Machine Learning, pp. 609–616 (2006)

  • Breiman, L.: Bagging predictors. Machine Learning 24(2), 123–140 (1996)

  • Margineantu, D.D., Dietterich, T.G.: Pruning adaptive boosting. In: Proc. 14th International Conference on Machine Learning, pp. 211–218. Morgan Kaufmann, San Francisco (1997)

  • Tamon, C., Xiang, J.: On the boosting pruning problem. In: Lopez de Mantaras, R., Plaza, E. (eds.) ECML 2000. LNCS (LNAI), vol. 1810, pp. 404–412. Springer, Heidelberg (2000)

  • Garey, M.R., Johnson, D.S.: Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman & Co., New York (1990)

  • Goldberg, D.E.: Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley Longman Publishing Co., Inc., Boston (1989)

  • Blake, C.L., Merz, C.J.: UCI repository of machine learning databases (1998)

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hernández-Lobato, D., Hernández-Lobato, J.M., Ruiz-Torrubiano, R., Valle, Á. (2006). Pruning Adaptive Boosting Ensembles by Means of a Genetic Algorithm. In: Corchado, E., Yin, H., Botti, V., Fyfe, C. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2006. IDEAL 2006. Lecture Notes in Computer Science, vol 4224. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11875581_39

  • DOI: https://doi.org/10.1007/11875581_39

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-45485-4

  • Online ISBN: 978-3-540-45487-8

  • eBook Packages: Computer Science (R0)
