Ensemble Pruning: A Submodular Function Maximization Perspective

  • Conference paper
Database Systems for Advanced Applications (DASFAA 2014)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8422)

Abstract

Ensemble pruning seeks a subset of classifiers from a pool of trained classifiers so as to achieve better predictive performance on the test set. Ensemble pruning techniques have recently attracted significant attention in the machine learning and data mining communities. Unlike previous heuristic approaches, in this paper we formalize ensemble pruning as a function maximization problem that strikes an optimal balance between the quality of the individual classifiers and the diversity within the selected subset. First, we propose a framework combining quality with pairwise diversity and prove that the resulting function is submodular. Furthermore, we propose a submodular and monotone function that composes quality with entropy-based diversity. Although this maximization problem remains NP-hard, our theoretical analysis shows that a greedy search algorithm with an approximation guarantee of 1 - \(\frac{1}{e}\) yields a near-optimal solution. Extensive experiments on 36 real datasets demonstrate that the proposed approaches achieve superior performance and better efficiency.
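The greedy strategy the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's own objective: the coverage-style score below (the number of test examples that at least one selected classifier gets right) is a standard example of a monotone submodular function, chosen here only to make the greedy marginal-gain loop concrete; the paper's quality/diversity functions are defined differently.

```python
# Greedy maximization of a monotone submodular objective for
# ensemble pruning -- a minimal sketch. The coverage objective is
# illustrative, not the quality/diversity function from the paper.

def coverage(subset, preds, labels):
    """Number of test examples correctly classified by at least one
    classifier in `subset` (a monotone submodular set function)."""
    return sum(
        any(preds[i][n] == labels[n] for i in subset)
        for n in range(len(labels))
    )

def greedy_prune(preds, labels, k):
    """Repeatedly add the classifier with the largest marginal gain.
    For monotone submodular objectives this greedy rule achieves a
    (1 - 1/e) approximation (Nemhauser, Wolsey, and Fisher, 1978)."""
    selected, remaining = [], set(range(len(preds)))
    for _ in range(min(k, len(preds))):
        best = max(
            remaining,
            key=lambda c: coverage(selected + [c], preds, labels)
                          - coverage(selected, preds, labels),
        )
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    # Toy data: 3 classifiers' predictions on 5 test examples.
    labels = [0, 1, 1, 0, 1]
    preds = [
        [0, 1, 0, 0, 0],  # correct on examples 0, 1, 3
        [1, 1, 1, 1, 1],  # correct on examples 1, 2, 4
        [0, 0, 0, 0, 0],  # correct on examples 0, 3
    ]
    print(greedy_prune(preds, labels, 2))  # prints [0, 1]
```

Note the diminishing-returns property at work: classifiers 0 and 1 together cover all five examples, so the second greedy step prefers classifier 1 (marginal gain 2) over the redundant classifier 2 (marginal gain 0).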

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Sha, C., Wang, K., Wang, X., Zhou, A. (2014). Ensemble Pruning: A Submodular Function Maximization Perspective. In: Bhowmick, S.S., Dyreson, C.E., Jensen, C.S., Lee, M.L., Muliantara, A., Thalheim, B. (eds) Database Systems for Advanced Applications. DASFAA 2014. Lecture Notes in Computer Science, vol 8422. Springer, Cham. https://doi.org/10.1007/978-3-319-05813-9_1

  • DOI: https://doi.org/10.1007/978-3-319-05813-9_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-05812-2

  • Online ISBN: 978-3-319-05813-9

  • eBook Packages: Computer Science, Computer Science (R0)
