An approach of classifiers fusion based on hierarchical modifications

Published in: Applied Intelligence

Abstract

Classifier fusion is considered an effective way to improve the accuracy of pattern recognition. In practice, its performance is mainly limited by the potentials and reliabilities of the base classifiers, which are learned from different attribute spaces. To overcome these problems, we present a new approach to classifier fusion based on hierarchical modifications within the framework of belief function theory. First, an intra-attribute modification is proposed to take into account the potentials and reliabilities of the base classifiers. Instead of merely discounting a classifier with a weight, we employ a piece of evidence derived from the nearest labeled neighbor to modify the weighted output of each base classifier in its own attribute space. Then, the modified output is combined with the modified results from the other attribute spaces; this procedure can be seen as an inter-attribute modification. Both modifications aim to make the classification result as close to the truth as possible, so we take both into account when constructing a new objective function for optimizing the weights. Finally, several real data sets are used in experimental applications to demonstrate that the proposed method is superior to other related belief-based fusion methods.
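The hierarchical procedure described above can be made concrete with a small sketch. The following is a minimal, hedged illustration, not the authors' implementation: each base classifier's probabilistic output is treated as a basic belief assignment (BBA), discounted by an optimized weight, corrected within its own attribute space by a simple-support BBA built from the nearest labeled neighbor (in the spirit of the evidential k-NN rule), and the corrected BBAs are then combined across attribute spaces with Dempster's rule. Names such as fuse_hierarchical, alpha0 and gamma are assumptions made for illustration.

    # Minimal sketch of belief-based fusion with intra- and inter-attribute
    # modifications. BBAs are dicts whose keys are class labels (singletons)
    # plus the special key 'Omega' for full ignorance. Helper names assumed.
    import numpy as np

    def discount(bba, weight):
        """Classical discounting: scale focal masses by `weight` and move
        the remaining mass onto full ignorance (Omega)."""
        out = {A: weight * m for A, m in bba.items() if A != 'Omega'}
        out['Omega'] = 1.0 - weight + weight * bba.get('Omega', 0.0)
        return out

    def dempster(m1, m2, classes):
        """Dempster's rule for BBAs whose focal sets are singletons or Omega."""
        combined = {'Omega': m1.get('Omega', 0.0) * m2.get('Omega', 0.0)}
        conflict = 0.0
        for c in classes:
            combined[c] = (m1.get(c, 0.0) * m2.get(c, 0.0)
                           + m1.get(c, 0.0) * m2.get('Omega', 0.0)
                           + m1.get('Omega', 0.0) * m2.get(c, 0.0))
            conflict += sum(m1.get(c, 0.0) * m2.get(d, 0.0)
                            for d in classes if d != c)
        return {A: m / (1.0 - conflict) for A, m in combined.items()}

    def neighbor_bba(x, X_lab, y_lab, classes, alpha0=0.95, gamma=1.0):
        """Simple-support evidence from the nearest labeled neighbor
        in one attribute space (evidential k-NN with k = 1)."""
        d2 = np.sum((X_lab - x) ** 2, axis=1)
        i = int(np.argmin(d2))
        s = alpha0 * np.exp(-gamma * d2[i])
        bba = {c: 0.0 for c in classes}
        bba[y_lab[i]] = s
        bba['Omega'] = 1.0 - s
        return bba

    def fuse_hierarchical(clf_probs, xs, X_labs, y_labs, classes, weights):
        """clf_probs[q], xs[q], X_labs[q], y_labs[q]: output and data of the
        q-th attribute space; weights[q]: optimized reliability weight."""
        fused = None
        for q, probs in enumerate(clf_probs):
            bba = dict(zip(classes, probs))
            bba['Omega'] = 0.0
            # intra-attribute modification: discount the classifier output,
            # then correct it with the nearest labeled neighbor's evidence
            modified = dempster(discount(bba, weights[q]),
                                neighbor_bba(xs[q], X_labs[q], y_labs[q], classes),
                                classes)
            # inter-attribute modification: combine across attribute spaces
            fused = modified if fused is None else dempster(fused, modified, classes)
        return fused

A pignistic or plausibility transformation of the fused BBA would then yield the final class decision; the weights themselves would be obtained by minimizing the objective function mentioned above on the training data.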

Notes

  1. The KEEL data set repository: https://sci2s.ugr.es/keel/datasets.php
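For reference, the experiments use data sets from the KEEL repository linked above. The sketch below shows one plausible way to load a KEEL ".dat" file (header lines start with '@'; samples follow the '@data' marker) and to partition its attributes into disjoint subspaces, one per base classifier. The file path, the number of subspaces and the helper names are illustrative assumptions, not part of the paper.

    # Hedged sketch: load a KEEL .dat file and split its (numeric) attributes
    # into disjoint attribute spaces for the base classifiers.
    import numpy as np

    def load_keel_dat(path):
        """Return (X, y); the last column of each data row is the class label.
        Assumes all input attributes are numeric."""
        rows, in_data = [], False
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line:
                    continue
                if line.lower().startswith('@data'):
                    in_data = True
                elif in_data and not line.startswith('@'):
                    rows.append([v.strip() for v in line.split(',')])
        X = np.array([r[:-1] for r in rows], dtype=float)
        y = np.array([r[-1] for r in rows])
        return X, y

    def split_attribute_spaces(X, n_spaces=3):
        """Partition the columns of X into disjoint groups, one per classifier."""
        groups = np.array_split(np.arange(X.shape[1]), n_spaces)
        return [X[:, g] for g in groups]

    # Example usage (hypothetical file name):
    # X, y = load_keel_dat('iris.dat')
    # attribute_spaces = split_attribute_spaces(X, n_spaces=2)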


Acknowledgements

This work has been partially supported by the Key Research and Development Project of Shaanxi Province (No. 2020SF-367) and the Soft Science Project in the Shaanxi Innovation Pillar Program (No. 2020KRM079).

Author information

Corresponding author

Correspondence to Lin Song.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Song, L., Sun, Yx. An approach of classifiers fusion based on hierarchical modifications. Appl Intell 52, 6464–6476 (2022). https://doi.org/10.1007/s10489-021-02777-6

