Ensemble Selection Based on Discriminant Functions in Binary Classification Task

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9375)


This paper describes a dynamic ensemble selection method. The proposed algorithm uses the values of the discriminant functions and is dedicated to the binary classification task. The selection algorithm operates on decision profiles, and a normalization of the discriminant functions is carried out. Additionally, the difference between the discriminant functions is used as one of the selection conditions. The reported results, based on ten data sets from the UCI repository, show that the proposed dynamic ensemble selection is a promising method for the development of multiple classifier systems.
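The idea outlined above can be illustrated with a minimal Python sketch: each base classifier's discriminant values are normalized, classifiers whose discriminant-function difference exceeds a threshold are selected, and the selected supports are fused. The threshold `tau`, the sum-to-one normalization, and mean fusion are illustrative assumptions for this sketch, not the paper's exact scheme.

```python
def select_ensemble(supports, tau=0.1):
    """Sketch of discriminant-based dynamic ensemble selection (binary task).

    supports : list of (d0, d1) pairs -- one pair of discriminant values
               (supports for class 0 and class 1) per base classifier,
               forming one row of the decision profile for a test object.
    tau      : illustrative confidence threshold (an assumption here).
    """
    # Normalize each classifier's discriminant values to sum to one
    # (one common normalization; the paper's exact scheme may differ).
    norm = []
    for d0, d1 in supports:
        s = d0 + d1
        norm.append((d0 / s, d1 / s))

    # Selection condition: keep classifiers whose discriminant-function
    # difference |d(0) - d(1)| exceeds tau, i.e. the confident ones.
    selected = [abs(d0 - d1) > tau for d0, d1 in norm]
    if not any(selected):
        selected = [True] * len(norm)  # fall back to the full ensemble

    # Fuse the selected classifiers by averaging their supports and
    # predicting the class with the larger mean support.
    chosen = [nd for nd, keep in zip(norm, selected) if keep]
    fused0 = sum(d0 for d0, _ in chosen) / len(chosen)
    fused1 = sum(d1 for _, d1 in chosen) / len(chosen)
    return (0 if fused0 >= fused1 else 1), selected
```

For example, with supports `[(0.9, 0.1), (0.4, 0.6), (0.2, 0.8)]` and `tau=0.5`, only the first and third classifiers pass the difference condition, and their averaged supports decide the class.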


Keywords: Ensemble selection · Multiple classifier system · Binary classification task



This work was supported by the Polish National Science Center under grant no. DEC-2013/09/B/ST6/02264 and by the statutory funds of the Department of Systems and Computer Networks, Wroclaw University of Technology.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Department of Systems and Computer Networks, Wroclaw University of Technology, Wroclaw, Poland
