Bagging Ensemble Selection for Regression
Bagging ensemble selection (BES) is a relatively new ensemble learning strategy. It can be seen as a bagged version of the ensemble selection from libraries of models (ES) strategy. Previous experimental results on binary classification problems have shown that, with random trees as base classifiers, BES-OOB (the most successful BES variant) is competitive with, and in many cases superior to, other ensemble learning strategies such as the original ES algorithm, stacking with linear regression, random forests, and boosting. Motivated by these promising classification results, this paper examines the predictive performance of BES-OOB on regression problems. Our results show that BES-OOB outperforms stochastic gradient boosting and bagging when regression trees are used as the base learners. They also suggest that the advantage of a diverse model library becomes clear when the library is relatively large. Finally, we present encouraging results indicating that the non-negative least squares algorithm is a viable approach to pruning an ensemble of ensembles.
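The core building block named in the abstract, ensemble selection, greedily adds models (with replacement) to a growing ensemble, each time picking the model whose inclusion most reduces error on held-out predictions. The sketch below is not the paper's exact BES-OOB procedure; it is a minimal illustration of greedy forward selection for regression, assuming each model's hold-out (e.g. out-of-bag) predictions are already available as rows of an array.

```python
import numpy as np

def greedy_ensemble_selection(preds, y, n_steps=5):
    """Greedy forward selection with replacement: at each step, add the
    model whose inclusion minimizes the RMSE of the averaged ensemble
    on held-out targets y. preds has shape (n_models, n_samples)."""
    selected = []
    ens_sum = np.zeros_like(y, dtype=float)
    for _ in range(n_steps):
        best_i, best_rmse = None, np.inf
        for i in range(preds.shape[0]):
            candidate = (ens_sum + preds[i]) / (len(selected) + 1)
            rmse = np.sqrt(np.mean((candidate - y) ** 2))
            if rmse < best_rmse:
                best_i, best_rmse = i, rmse
        selected.append(best_i)
        ens_sum += preds[best_i]
    return selected, ens_sum / len(selected)

# Toy demo (hypothetical data): one accurate model and two poor ones.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = np.stack([
    y + 0.1 * rng.normal(size=200),  # accurate model
    y + 1.0 * rng.normal(size=200),  # noisy model
    -y,                              # anti-correlated model
])
sel, ens = greedy_ensemble_selection(preds, y, n_steps=5)
```

Selection with replacement lets a strong model be chosen repeatedly, which acts as an implicit weighting; in the toy example above, the accurate model dominates the selection.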
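The abstract also mentions pruning an ensemble of ensembles with non-negative least squares. As a hedged illustration (not the paper's exact setup), one can stack each member's hold-out predictions as columns of a matrix A and solve min_w ||Aw - y|| subject to w >= 0; members assigned zero weight are pruned and the rest are combined with the fitted weights. The data here is synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical hold-out predictions of three ensemble members,
# stacked as columns of A; y is the held-out target vector.
rng = np.random.default_rng(1)
y = rng.normal(size=300)
A = np.column_stack([
    y + 0.05 * rng.normal(size=300),  # strong member
    y + 0.80 * rng.normal(size=300),  # weak member
    rng.normal(size=300),             # uninformative member
])

# Non-negative least squares: w >= 0 elementwise.
w, residual = nnls(A, y)

# Members with (near-)zero weight are pruned from the ensemble.
kept = np.flatnonzero(w > 1e-8)
```

The non-negativity constraint induces sparsity in practice, so NNLS performs selection and weighting in a single convex fit; here the strong member should receive nearly all of the weight.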