Analysis of Diversity Assurance Methods for Combined Classifiers
Assuring the diversity of classifiers in an ensemble plays a crucial role in multiple classifier system design. The paper presents a comparative study of selected methods that assure diversity by manipulating the inputs of the individual classifiers, i.e., they train learners on subspaces of the feature set, or they exploit the local competencies of individual classifiers over given regions of the feature space. This work is a starting point for developing new diversity assurance methods embedded in multiple classifier system design. All methods were evaluated through computer experiments carried out on benchmark datasets. On the basis of the results obtained, conclusions were drawn about the usefulness of the examined methods for certain types of problems.
Keywords: Feature Space, Benchmark Dataset, Ensemble Method, Feature Selection Algorithm, Random Subspace
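The first family of methods mentioned in the abstract, manipulating classifier inputs by training each learner on a random subspace of the feature set, can be sketched roughly as follows. This is a minimal illustration of the general random subspace idea with majority-vote combination, not the paper's own implementation; the nearest-centroid base learner and all function names are assumptions chosen for brevity.

```python
# Sketch of a random-subspace ensemble: each base learner sees only a
# random subset of features, and predictions are combined by majority vote.
# The nearest-centroid base learner is a placeholder, not the paper's choice.
import random
from collections import Counter

def nearest_centroid_fit(X, y, feats):
    """Train a trivial nearest-centroid learner on the selected features."""
    centroids = {}
    for label in set(y):
        rows = [x for x, yy in zip(X, y) if yy == label]
        centroids[label] = [sum(r[f] for r in rows) / len(rows) for f in feats]
    return feats, centroids

def nearest_centroid_predict(model, x):
    """Assign x to the class whose centroid is closest in the subspace."""
    feats, centroids = model
    proj = [x[f] for f in feats]
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(proj, centroids[c])))

def random_subspace_ensemble(X, y, n_learners=7, subspace_size=2, seed=0):
    """Train n_learners base classifiers, each on a random feature subset."""
    rng = random.Random(seed)
    n_features = len(X[0])
    return [nearest_centroid_fit(X, y, rng.sample(range(n_features), subspace_size))
            for _ in range(n_learners)]

def predict(ensemble, x):
    """Combine the base classifiers' outputs by majority vote."""
    votes = Counter(nearest_centroid_predict(m, x) for m in ensemble)
    return votes.most_common(1)[0][0]

# Tiny synthetic dataset: class 0 near the origin, class 1 near (5, 5, 5).
X = [[0, 0, 0], [1, 0, 1], [0, 1, 0], [5, 5, 5], [6, 5, 6], [5, 6, 5]]
y = [0, 0, 0, 1, 1, 1]
ensemble = random_subspace_ensemble(X, y, n_learners=5, subspace_size=2, seed=1)
print(predict(ensemble, [0, 0, 1]))  # class 0
print(predict(ensemble, [5, 6, 6]))  # class 1
```

Because each learner sees a different feature subset, the ensemble members make partly independent errors, which is the diversity the combined vote exploits.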