Real Adaboost Ensembles with Emphasized Subsampling

  • Sergio Muñoz-Romero
  • Jerónimo Arenas-García
  • Vanessa Gómez-Verdejo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5517)


Multi-Net systems in general, and the Real Adaboost algorithm in particular, offer a very attractive way of designing powerful classifiers. However, one drawback of these schemes is the large computational burden required for their construction. In this paper, we propose a new Boosting scheme that incorporates subsampling mechanisms to speed up the training of the base learners and, therefore, the setup of the ensemble network. Furthermore, subsampling the training data provides additional diversity among the constituent learners, following the same principles exploited by Bagging approaches. Experimental results show that our method is in fact able to improve both Boosting and Bagging schemes in terms of recognition rates, while allowing significant training time reductions.
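The idea described in the abstract — Real AdaBoost in which each base learner is trained only on a subset of examples drawn according to the current emphasis weights — can be sketched as follows. This is an illustrative sketch, not the authors' exact algorithm: the choice of a confidence-rated decision stump as base learner, the sampling fraction `frac`, and sampling with replacement are all assumptions made for the example.

```python
import numpy as np

EPS = 1e-6  # smoothing constant for the log-odds outputs


def _region_output(y, w, mask):
    """Half log-odds output and Schapire-Singer Z-contribution of a region."""
    wp = w[mask & (y == 1)].sum()   # weighted positives in the region
    wn = w[mask & (y == -1)].sum()  # weighted negatives in the region
    return 0.5 * np.log((wp + EPS) / (wn + EPS)), 2.0 * np.sqrt(wp * wn)


def fit_stump(X, y, w):
    """Confidence-rated decision stump minimizing the Z criterion."""
    best_z, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            f_left, z_left = _region_output(y, w, left)
            f_right, z_right = _region_output(y, w, ~left)
            if z_left + z_right < best_z:
                best_z, best = z_left + z_right, (j, t, f_left, f_right)
    return best


def stump_predict(stump, X):
    j, t, f_left, f_right = stump
    return np.where(X[:, j] <= t, f_left, f_right)


def real_adaboost_subsample(X, y, n_rounds=10, frac=0.5, seed=0):
    """Real AdaBoost where each round trains on an emphasized subsample."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)  # emphasis (boosting) weights
    stumps = []
    for _ in range(n_rounds):
        # Emphasized subsampling: draw frac*n examples with probability
        # proportional to the current weights (with replacement -- an
        # assumption). Training the stump on the subset is what cuts
        # the base-learner training cost.
        idx = rng.choice(n, size=max(1, int(frac * n)), replace=True, p=w)
        sub_w = np.full(len(idx), 1.0 / len(idx))
        stump = fit_stump(X[idx], y[idx], sub_w)
        # Standard Real AdaBoost weight update on the FULL training set.
        f = stump_predict(stump, X)
        w *= np.exp(-y * f)
        w /= w.sum()
        stumps.append(stump)
    return stumps


def predict(stumps, X):
    """Sign of the summed confidence-rated outputs."""
    F = sum(stump_predict(s, X) for s in stumps)
    return np.where(F >= 0, 1, -1)
```

Smaller values of `frac` trade some accuracy for faster base-learner training, while the weight-proportional draw keeps each round focused on the currently difficult examples.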





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Sergio Muñoz-Romero 1
  • Jerónimo Arenas-García 1
  • Vanessa Gómez-Verdejo 1

  1. Department of Signal Theory and Communications, Universidad Carlos III de Madrid, Leganés-Madrid, Spain
