Comparison of Bagging and Boosting Algorithms on Sample and Feature Weighting
We compared boosting with bagging using learning algorithms of different strengths to improve the performance of the set of classifiers to be fused. Our experimental results showed that boosting worked well with weak algorithms, while bagging, especially feature-based bagging, worked well with strong algorithms. Based on these observations, we developed a mixed fusion method in which randomly chosen feature subsets are used with a standard boosting method. The proposed fusion method was confirmed to work well regardless of the learning algorithm.
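The mixed fusion idea described above can be illustrated with a minimal sketch: an AdaBoost-style loop in which each round's weak learner is trained on a randomly chosen feature subset. This is an assumed reconstruction for illustration only (the paper's exact procedure and parameters may differ); the weak learner here is a decision stump, and all function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def train_stump(X, y, w):
    """Exhaustively search for the weighted-error-minimizing threshold stump.

    Returns (error, feature_index, threshold, sign), where the stump predicts
    `sign` when X[:, feature_index] <= threshold and `-sign` otherwise.
    """
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

def boost_with_random_features(X, y, rounds=20, subset_size=2):
    """AdaBoost with labels y in {-1, +1}, where each weak learner only
    sees a random feature subset (the 'mixed fusion' idea, sketched)."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)       # uniform initial sample weights
    ensemble = []
    for _ in range(rounds):
        feats = rng.choice(d, size=subset_size, replace=False)
        err, j, thr, sign = train_stump(X[:, feats], y, w)
        if err >= 0.5:            # no better than chance on this subset; skip
            continue
        err = max(err, 1e-10)     # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(X[:, feats[j]] <= thr, sign, -sign)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, feats[j], thr, sign))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all stumps in the ensemble."""
    score = np.zeros(X.shape[0])
    for alpha, j, thr, sign in ensemble:
        score += alpha * np.where(X[:, j] <= thr, sign, -sign)
    return np.sign(score)
```

Because each round resamples the feature subset, the ensemble combines boosting's sample reweighting with the feature diversity of the random subspace method, which matches the paper's motivation of getting good behavior with both weak and strong base learners.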
Keywords: Training Sample · Feature Subset · Fusion Method · Testing Error · Training Error