To improve overall performance on a classification problem, one line of research uses several classifiers and fuses their outputs. To perform this fusion, some approaches merge the classifier outputs using a combination rule. This requires that the outputs first be made comparable, which is usually achieved through a probabilistic calibration of each classifier. The fusion can also be performed by concatenating the classifier outputs into a vector and applying a joint probabilistic calibration to this vector. Recently, probabilistic calibration techniques for an individual classifier have been extended using evidence theory, in order to better represent the uncertainties inherent to the calibration process. In this paper, we adapt this latter idea to joint probabilistic calibration techniques, leading to evidential versions of joint calibration. Our proposal was tested on generated and real datasets, and the results show that it either outperforms or is comparable to state-of-the-art approaches.
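The joint probabilistic calibration that the paper builds on can be sketched as follows: the decision scores of several binary SVMs are concatenated into one vector per sample, and a single logistic regression maps that vector to a calibrated posterior probability. This is a minimal illustration of the probabilistic (non-evidential) baseline only; the dataset, classifier split, and all names below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Synthetic binary problem; half the data trains the SVMs, half calibrates.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_train, X_cal, y_train, y_cal = train_test_split(
    X, y, test_size=0.5, random_state=0)

# Two SVMs trained on disjoint feature subsets emulate distinct classifiers.
svm_a = LinearSVC(random_state=0).fit(X_train[:, :5], y_train)
svm_b = LinearSVC(random_state=0).fit(X_train[:, 5:], y_train)

def scores(X):
    # Concatenate the classifiers' decision scores into one vector per sample.
    return np.column_stack([svm_a.decision_function(X[:, :5]),
                            svm_b.decision_function(X[:, 5:])])

# Joint calibration: one logistic regression fitted on the concatenated
# scores, instead of calibrating each classifier separately.
calibrator = LogisticRegression().fit(scores(X_cal), y_cal)
proba = calibrator.predict_proba(scores(X_cal))[:, 1]
```

The evidential version proposed in the paper replaces the single calibrated probability with a belief-function output that also quantifies the uncertainty of the calibration step itself.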
We used the software for the evidential \(k\)-NN classifier with parameter optimization available at: https://www.hds.utc.fr/~tdenoeux/dokuwiki/en/software/k-nn
Bache K, Lichman M (2013) UCI machine learning repository. University of California, School of Information and Computer Sciences. http://archive.ics.uci.edu/ml
Bagley SC, White H, Golomb BA (2001) Logistic regression in the medical literature: standards for use and reporting, with particular attention to one medical domain. J Clin Epidemiol 54(10):979–985
Chang CC, Lin CJ (2011) LIBSVM: a library for support vector machines. ACM Trans Intell Syst Technol 2:27:1–27:27. http://www.csie.ntu.edu.tw/~cjlin/libsvm
Dempster AP (1966) New methods for reasoning towards posterior distributions based on sample data. Ann Math Stat 37(2):355–374
Dempster AP (1968) Upper and lower probabilities generated by a random closed interval. Ann Math Stat 39(3):957–966
Denœux T (1995) A k-nearest neighbor classification rule based on Dempster–Shafer theory. IEEE Trans Syst Man Cybern 25(5):804–813
Denœux T (1997) Analysis of evidence-theoretic decision rules for pattern classification. Pattern Recognit 30(7):1095–1107
Denœux T (2014) Likelihood-based belief function: justification and some extensions to low-quality data. Int J Approx Reason 55(7):1535–1547
Denœux T, Smets P (2006) Classification using belief functions: relationship between case-based and model-based approaches. IEEE Trans Syst Man Cybern B 36(6):1395–1406
Duin RPW (2002) The combining classifier: to train or not to train? In: Proceedings of the 16th International Conference on Pattern Recognition, Quebec City, Quebec, Canada, August, 2002, IEEE, vol 2, pp 765–770
Hosmer DW, Lemeshow S, Sturdivant RX (2013) Applied logistic regression, vol 398. Wiley, Hoboken
Kanjanatarakul O, Sriboonchitta S, Denœux T (2014) Forecasting using belief functions: an application to marketing econometrics. Int J Approx Reason 55(5):1113–1128
Kanjanatarakul O, Denœux T, Sriboonchitta S (2016) Prediction of future observations using belief functions: a likelihood-based approach. Int J Approx Reason 72:71–94
Kuncheva LI (2004) Combining pattern classifiers: methods and algorithms. Wiley, Hoboken
Minary P, Pichon F, Mercier D, Lefevre E, Droit B (2016) An evidential pixel-based face blurring approach. In: Vejnarová J, Kratochvíl V (eds) Proceedings of the 4th International Conference on Belief Functions, Prague, Czech Republic, September 21–23, Springer, Lecture Notes in Computer Science, vol 9861, pp 222–230
Minary P, Pichon F, Mercier D, Lefevre E, Droit B (2017) Evidential joint calibration of binary SVM classifiers using logistic regression. In: Proceedings of the 11th international conference on scalable uncertainty management, Granada, Spain, October 4–6, 2017, Lecture Notes in Artificial Intelligence, Springer, p 7
Minka TP (2003) Algorithms for maximum-likelihood logistic regression. Technical Report 758, Carnegie Mellon University
Nguyen HT (2006) An Introduction to Random Sets. Chapman and Hall/CRC Press, Boca Raton
Platt JC (1999) Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. Adv Large Margin Classif 10(3):61–74
Shafer G (1976) A mathematical theory of evidence, vol 1. Princeton University Press, Princeton
Smets P (1993) Belief functions: the disjunctive rule of combination and the generalized Bayesian theorem. Int J Approx Reason 9(1):1–35
Smets P, Kennes R (1994) The transferable belief model. Artif Intell 66:191–243
Tulyakov S, Jaeger S, Govindaraju V, Doermann D (2008) Review of classifier combination methods. In: Marinai S, Fujisawa H (eds) Machine learning in document analysis and recognition. Springer, Berlin, pp 361–386
Xu P, Davoine F, Denœux T (2015) Evidential multinomial logistic regression for multiclass classifier calibration. In: Proceedings of the 18th international conference on information fusion, Washington, DC, USA, July 6–9, 2015, IEEE, pp 1106–1112
Xu P, Davoine F, Zha H, Denœux T (2016) Evidential calibration of binary SVM classifiers. Int J Approx Reason 72:55–70
Zadrozny B, Elkan C (2001) Obtaining calibrated probability estimates from decision trees and naive Bayesian classifiers. In: Proceedings of the 18th international conference on machine learning, Morgan Kaufmann, pp 609–616
Zadrozny B, Elkan C (2002) Transforming classifier scores into accurate multiclass probability estimates. In: Proceedings of the 8th international conference on knowledge discovery and data mining, New York, NY, USA, 2002, ACM, pp 694–699
Zhong W, Kwok JT (2013) Accurate probability calibration for multiple classifiers. In: Proceedings of the 23rd international joint conference on artificial intelligence, Beijing, China, August, 2013, pp 1939–1945
Zouhal LM, Denœux T (1998) An evidence-theoretic k-NN rule with parameter optimization. IEEE Trans Syst Man Cybern C 28(2):263–271
Conflict of interest
All authors declare that they have no conflict of interest.
This article does not contain any studies with human participants or animals performed by any of the authors.
This paper is an extended and revised version of (Minary et al. 2017).
Communicated by A. Di Nola.
Minary, P., Pichon, F., Mercier, D. et al. Evidential joint calibration of binary SVM classifiers. Soft Comput 23, 4655–4671 (2019). https://doi.org/10.1007/s00500-018-3429-x
- Belief functions
- Information fusion
- Evidential calibration