Improving Adaptive Boosting with a Relaxed Equation to Update the Sampling Distribution
Adaptive Boosting (Adaboost) is one of the best-known methods for building an ensemble of neural networks. In this paper we briefly analyze and combine two of its most important variants, Averaged Boosting (Aveboost) and Conservative Boosting (Conserboost), in order to build a more robust ensemble of neural networks. The combined method, called Averaged Conservative Boosting (ACB), applies the conservative equation used in Conserboost along with the averaging procedure used in Aveboost to update the sampling distribution. We have tested the methods on seven databases from the UCI repository. The results show that Averaged Conservative Boosting is the best-performing method.
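The update the abstract describes can be sketched in code. The following is a minimal illustration, not the paper's exact equations: it assumes a conservative step in the spirit of Conserboost (only misclassified samples are up-weighted, correctly classified ones keep their weight) followed by an Aveboost-style averaging of the new weights with the previous distribution. The function name `acb_update` and the precise form of both steps are assumptions for illustration.

```python
import numpy as np

def acb_update(D_t, errors, epsilon, t):
    """Hypothetical sketch of one ACB-style sampling-distribution update.

    D_t     -- current sampling distribution over the N training samples
    errors  -- boolean array, True where hypothesis t misclassifies sample i
    epsilon -- weighted error of hypothesis t under D_t (0 < epsilon < 0.5)
    t       -- 1-based index of the current boosting round
    """
    beta = epsilon / (1.0 - epsilon)           # standard AdaBoost factor
    # Conservative step (assumed form): up-weight only the misclassified
    # samples; correctly classified samples are left unchanged.
    w = D_t * np.where(errors, 1.0 / beta, 1.0)
    w /= w.sum()                               # renormalize to a distribution
    # Averaged step (Aveboost-style): blend the new weights with the
    # previous distribution, giving round t a 1/(t+1) share.
    D_next = (t * D_t + w) / (t + 1)
    return D_next / D_next.sum()
```

On a uniform distribution over four samples where only the first is misclassified with weighted error 0.25, the misclassified sample's probability rises while the others fall, and the averaging with the previous round damps the change relative to plain AdaBoost.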
Keywords: Sampling Distribution · Single Network · Relaxed Equation · Multilayer Feedforward Network · Wisconsin Breast Cancer
1. Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: International Conference on Machine Learning, pp. 148–156 (1996)
5. Newman, D.J., Hettich, S., Blake, C.L., Merz, C.J.: UCI repository of machine learning databases (1998), http://www.ics.uci.edu/~mlearn/MLRepository.html