
A hierarchical and parallel branch-and-bound ensemble selection algorithm

Published in: Applied Intelligence

Abstract

This paper presents an effective and efficient Hierarchical and Parallel Branch-and-Bound Ensemble Selection (H&PB&BEnS) algorithm, which performs ensemble selection in a divisional, parallel, and hierarchical manner. H&PB&BEnS exploits the strong performance of the Branch-and-Bound (B&B) algorithm on small-scale combinatorial optimization problems while avoiding the "curse of dimensionality" that arises when B&B is applied directly to ensemble selection. B&B is used to select each partitioned subensemble, which enhances the predictive accuracy of each pruned subsolution, while the working mechanism of H&PB&BEnS improves the diversity of the selection results. H&PB&BEnS refines the selected ensemble solutions layer by layer, so that the classification performance of the selected ensembles improves in a layer-by-layer manner. Empirical investigations on five benchmark classification datasets verify the effectiveness and efficiency of the proposed H&PB&BEnS algorithm.
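The core idea described in the abstract (partition the classifier pool, solve each partition exactly with B&B, merge the winners, and refine layer by layer) can be sketched in code. The sketch below is an illustrative reading, not the authors' implementation: it assumes binary classification so each base classifier is summarized by a per-example correctness vector, counts majority-vote ties as errors, and uses a simple optimistic bound (every undecided classifier is assumed to join and vote correctly) to make pruning safe. All function and variable names are hypothetical.

```python
# Illustrative sketch of hierarchical branch-and-bound ensemble selection.
# Assumptions (not from the paper): binary classification, majority voting
# with ties counted as errors, correctness matrices as the only input.
from itertools import combinations

import numpy as np


def vote_accuracy(correct, subset):
    """Majority-vote accuracy of the classifiers in `subset`.

    correct: (n_classifiers, n_examples) boolean matrix, True where a
    classifier predicts an example correctly.
    """
    votes = correct[list(subset)].sum(axis=0)
    return float(np.mean(2 * votes > len(subset)))  # ties count as errors


def bnb_select(correct, group):
    """Exact branch-and-bound search over the subsets of `group`."""
    n_ex = correct.shape[1]
    best = {"acc": -1.0, "subset": ()}

    def recurse(idx, chosen):
        if idx == len(group):
            if chosen:
                acc = vote_accuracy(correct, chosen)
                if acc > best["acc"]:
                    best["acc"], best["subset"] = acc, tuple(chosen)
            return
        # Optimistic bound: pretend every undecided classifier joins the
        # ensemble and is correct on every example.  Each such member raises
        # the correct-vote count by 1 but the majority threshold by only 1/2,
        # so this relaxation can only overestimate the best reachable
        # accuracy, which makes pruning on it safe.
        r = len(group) - idx
        sel = correct[chosen].sum(axis=0) if chosen else np.zeros(n_ex)
        bound = float(np.mean(2 * (sel + r) > len(chosen) + r))
        if bound <= best["acc"]:
            return  # no completion of `chosen` can beat the incumbent
        recurse(idx + 1, chosen + [group[idx]])  # branch: include group[idx]
        recurse(idx + 1, chosen)                 # branch: exclude group[idx]

    recurse(0, [])
    return list(best["subset"]), best["acc"]


def hierarchical_select(correct, group_size=5):
    """Layer-wise refinement: partition the pool, solve each partition with
    B&B (the per-group calls are independent, hence parallelizable), merge
    the winners into the next layer's pool, and repeat."""
    pool = list(range(correct.shape[0]))
    while len(pool) > group_size:
        groups = [pool[i:i + group_size]
                  for i in range(0, len(pool), group_size)]
        next_pool = []
        for g in groups:  # embarrassingly parallel across groups
            sel, _ = bnb_select(correct, g)
            next_pool.extend(sel)
        if len(next_pool) >= len(pool):  # no shrinkage: stop refining
            pool = next_pool
            break
        pool = next_pool
    return bnb_select(correct, pool)


# Synthetic correctness data: 12 classifiers with varying validation accuracy.
rng = np.random.default_rng(0)
correct = rng.random((12, 200)) < rng.uniform(0.55, 0.8, size=(12, 1))
subset, acc = hierarchical_select(correct, group_size=4)
```

Because each group is small, the per-group B&B stays cheap even though subset selection over the full pool is NP-hard; the hierarchy then recombines the per-group optima, trading global optimality for tractability, which matches the divide-and-refine strategy the abstract describes.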



Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grant no. 61473150.

Author information

Correspondence to Qun Dai.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Research involving Human Participants and/or Animals

All procedures performed in studies involving human participants were carried out in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

For this type of study, formal consent was not required.

All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.

Informed consent

Informed consent was obtained from all individual participants involved in the study.

Additional informed consent was obtained from all individual participants for whom identifying information is included in this article.


About this article


Cite this article

Dai, Q., Yao, C. A hierarchical and parallel branch-and-bound ensemble selection algorithm. Appl Intell 46, 45–61 (2017). https://doi.org/10.1007/s10489-016-0817-8
