Real Adaboost Ensembles with Emphasized Subsampling
Multi-Net systems in general, and the Real AdaBoost algorithm in particular, offer a very interesting way of designing very powerful classifiers. However, one inconvenience of these schemes is the large computational burden required for their construction. In this paper, we propose a new Boosting scheme which incorporates subsampling mechanisms to speed up the training of the base learners and, therefore, the setup of the ensemble network. Furthermore, subsampling the training data provides additional diversity among the constituent learners, according to the same principles exploited by Bagging approaches. Experimental results show that our method is in fact able to improve both Boosting and Bagging schemes in terms of recognition rates, while allowing significant training time reductions.
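The core idea described above can be sketched in code: at each boosting round, instead of fitting the base learner on the full (weighted) training set, a smaller subsample is drawn with probabilities proportional to the current emphasis weights, and the learner's class-probability outputs drive the usual Real AdaBoost update. The sketch below is illustrative only, under assumed details (decision stumps as base learners, Laplace-smoothed leaf probabilities, a `sub` fraction parameter); it is not the paper's actual implementation.

```python
import numpy as np

def fit_stump(X, y):
    """Fit a decision stump that stores smoothed class-probability
    estimates in each leaf, as Real AdaBoost base learners require."""
    n, d = X.shape
    best_err, best = np.inf, None
    for j in range(d):
        # Exclude the largest value so both leaves are always non-empty.
        for thr in np.unique(X[:, j])[:-1]:
            left = X[:, j] <= thr
            nl = int(left.sum()); nr = n - nl
            pl1 = int((y[left] == 1).sum())
            pr1 = int((y[~left] == 1).sum())
            # Misclassification error of majority-vote leaves.
            err = (min(pl1, nl - pl1) + min(pr1, nr - pr1)) / n
            if err < best_err:
                best_err = err
                best = {"j": j, "thr": thr,
                        # Laplace-smoothed P(y = +1 | leaf)
                        "pl": (pl1 + 1) / (nl + 2),
                        "pr": (pr1 + 1) / (nr + 2)}
    return best

def real_adaboost_sub(X, y, T=20, sub=0.5, rng=None):
    """Real AdaBoost with emphasized subsampling: each base learner is
    trained on a subsample drawn according to the emphasis weights."""
    rng = rng or np.random.default_rng()
    n = len(y)
    D = np.full(n, 1.0 / n)           # emphasis weights
    m = int(sub * n)                  # subsample size
    stumps, eps = [], 1e-3
    for _ in range(T):
        # Emphasized subsampling replaces weighted training.
        idx = rng.choice(n, size=m, replace=True, p=D)
        stump = fit_stump(X[idx], y[idx])
        p = np.where(X[:, stump["j"]] <= stump["thr"],
                     stump["pl"], stump["pr"])
        # Real-valued base prediction: half log-odds of P(y = +1 | x).
        f = 0.5 * np.log((p + eps) / (1 - p + eps))
        D = D * np.exp(-y * f)        # standard Real AdaBoost update
        D /= D.sum()
        stumps.append(stump)
    return stumps

def ensemble_score(stumps, X):
    """Sum of real-valued base predictions; classify with its sign."""
    F, eps = np.zeros(len(X)), 1e-3
    for s in stumps:
        p = np.where(X[:, s["j"]] <= s["thr"], s["pl"], s["pr"])
        F += 0.5 * np.log((p + eps) / (1 - p + eps))
    return F
```

Because each round fits its stump on only a `sub` fraction of the data, training cost per round drops accordingly, and the randomness of the draws injects Bagging-style diversity on top of the boosting emphasis.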