Averaged Conservative Boosting: Introducing a New Method to Build Ensembles of Neural Networks
In this paper, a new algorithm called Averaged Conservative Boosting (ACB) is presented to build ensembles of neural networks. ACB combines the improvements that Averaged Boosting (Aveboost) and Conservative Boosting (Conserboost) made to Adaptive Boosting (Adaboost): the conservative weight-update equation from Conserboost is applied together with the averaging procedure from Aveboost in order to update the sampling distribution used to train each network. We have tested the method on seven databases from the UCI repository. The results show that Averaged Conservative Boosting provides the best overall performance.
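The distribution update described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes the conservative rule raises the weight only of misclassified examples (leaving correctly classified ones unchanged), and the Aveboost-style step then blends the resulting distribution with the previous one, weighting the past by the round number.

```python
import numpy as np

def acb_distribution_update(D, errors, t):
    """One sampling-distribution update in the spirit of Averaged
    Conservative Boosting (illustrative sketch, not the authors' code).

    D      -- current sampling distribution over the training examples
    errors -- boolean array, True where network t misclassified example i
    t      -- 1-based index of the current boosting round
    """
    eps = float(np.dot(D, errors))            # weighted error of network t
    eps = min(max(eps, 1e-10), 0.5 - 1e-10)   # keep the update finite
    beta = eps / (1.0 - eps)

    # Conservative step (Conserboost-style, assumed form): only the
    # misclassified examples have their weight raised.
    C = D * np.where(errors, 1.0 / beta, 1.0)
    C = C / C.sum()                           # renormalise to a distribution

    # Averaged step (Aveboost-style): blend the new distribution with the
    # previous one, weighting the past by the number of rounds so far.
    D_next = (t * D + C) / (t + 1.0)
    return D_next / D_next.sum()
```

Starting from a uniform distribution, a misclassified example ends up with more sampling mass after the update, but less than a plain Adaboost step would give it, which is the moderating effect the averaging is meant to provide.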
Keywords: Sampling Distribution · Single Network · Multilayer Feedforward Network · Wisconsin Breast Cancer · Previous Network