Abstract
Ensemble pruning seeks a subset of a pool of trained classifiers that yields better predictive performance on the test set than the full ensemble. Ensemble pruning techniques have recently attracted significant attention in the machine learning and data mining communities. Unlike previous heuristic approaches, in this paper we formalize ensemble pruning as a function maximization problem that strikes a balance between the quality of the individual classifiers and the diversity within the selected subset. We first propose a framework combining quality with pairwise diversity and prove that the resulting objective is submodular. We then propose a submodular and monotone objective that combines quality with an entropy-based diversity measure. Although this maximization problem remains NP-hard, the greedy search algorithm carries an approximation guarantee of factor \(1 - \frac{1}{e}\) and thus yields a near-optimal solution. Extensive experiments on 36 real datasets demonstrate that the proposed approaches achieve superior performance and better efficiency.
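To make the greedy \(1 - \frac{1}{e}\) scheme concrete, the following is a minimal sketch in Python/NumPy. The function name greedy_prune, the weight alpha, and the particular quality-plus-vote-entropy objective are illustrative assumptions, not the paper's exact formulation; the guarantee of Nemhauser et al. applies whenever the maximized set function is monotone and submodular, which is not verified here for this toy surrogate.

```python
import numpy as np

def greedy_prune(preds, labels, k, alpha=0.5):
    # preds:  (n_classifiers, n_samples) array of predicted class labels
    # labels: (n_samples,) array of ground-truth labels
    # Returns the indices of the k greedily selected classifiers.
    quality = (preds == labels).mean(axis=1)  # per-classifier accuracy (quality term)

    def objective(subset):
        # Illustrative quality + entropy-diversity objective (assumed form).
        if not subset:
            return 0.0
        votes = preds[subset]                 # (|S|, n_samples) votes of the subset
        entropy = 0.0
        for col in votes.T:                   # per-sample distribution of votes
            _, counts = np.unique(col, return_counts=True)
            p = counts / counts.sum()
            entropy -= (p * np.log(p)).sum()
        diversity = entropy / votes.shape[1]  # average per-sample vote entropy
        return quality[subset].sum() + alpha * diversity

    selected, remaining = [], list(range(preds.shape[0]))
    for _ in range(k):
        # Greedy step: add the classifier with the largest marginal gain.
        gains = [objective(selected + [i]) for i in remaining]
        best = remaining[int(np.argmax(gains))]
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    preds = rng.integers(0, 2, size=(20, 100))  # 20 toy classifiers, 100 samples
    labels = rng.integers(0, 2, size=100)
    print(greedy_prune(preds, labels, k=5))
```

Each iteration adds the element with the largest marginal gain, which is exactly the step the \(1 - \frac{1}{e}\) analysis of greedy submodular maximization requires.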
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Sha, C., Wang, K., Wang, X., Zhou, A. (2014). Ensemble Pruning: A Submodular Function Maximization Perspective. In: Bhowmick, S.S., Dyreson, C.E., Jensen, C.S., Lee, M.L., Muliantara, A., Thalheim, B. (eds) Database Systems for Advanced Applications. DASFAA 2014. Lecture Notes in Computer Science, vol 8422. Springer, Cham. https://doi.org/10.1007/978-3-319-05813-9_1
DOI: https://doi.org/10.1007/978-3-319-05813-9_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-05812-2
Online ISBN: 978-3-319-05813-9