Building Locally Discriminative Classifier Ensemble Through Classifier Fusion Among Nearest Neighbors
Many studies on ensemble learning, which combines multiple classifiers, have shown that it is an effective technique for improving the accuracy and stability of a single classifier. In this paper, we propose a novel discriminative classifier fusion method that uses the local classification results of base classifiers among a test sample's nearest neighbors to build a local classifier ensemble. Through this dynamic selection process, locally discriminative classifiers receive larger weights, yielding a locally discriminative ensemble. Experimental results on several UCI datasets show that our proposed method achieves the best classification performance compared with the individual classifiers, majority voting, and the AdaBoost algorithm.
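The fusion scheme described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the function name, the neighborhood size `k`, the Euclidean distance, and the use of local accuracy as the discriminative weight are all assumptions for the sketch.

```python
import numpy as np

def local_ensemble_predict(x, X_train, y_train, classifiers, k=5):
    """Weight each base classifier by its accuracy on the k nearest
    training neighbors of x, then take a weighted vote.

    A sketch of locally discriminative fusion; parameter names and
    the choice of Euclidean distance are assumptions, not taken from
    the paper.
    """
    # Find the k nearest training samples to x (Euclidean distance).
    dists = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(dists)[:k]

    votes = {}
    for clf in classifiers:
        # Local accuracy of this classifier among the neighbors:
        # classifiers that discriminate well near x get larger weights.
        local_acc = np.mean(
            [clf(xi) == yi for xi, yi in zip(X_train[nn], y_train[nn])]
        )
        pred = clf(x)
        votes[pred] = votes.get(pred, 0.0) + local_acc
    # Return the class with the largest locally weighted vote.
    return max(votes, key=votes.get)
```

Each base classifier is passed as a callable mapping a sample to a class label, so any trained model's `predict` can be wrapped to fit this interface.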
Keywords: Ensemble learning · Classifier ensemble · Classifier combination · Classifier fusion
This work was funded in part by the National Natural Science Foundation of China (No. 61572240, 61502208), the Natural Science Foundation of Jiangsu Province of China (No. BK20150522), and the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (No. 201600005).