Chernoff-Based Multi-class Pairwise Linear Dimensionality Reduction
Linear dimensionality reduction techniques have been studied extensively for the two-class problem, whereas the corresponding issues encountered when dealing with multiple classes are far from trivial. In this paper, we show that when dealing with multiple classes, it is not expedient to treat the task as a single multi-class problem; it is better to treat it as an ensemble of Chernoff-based two-class reductions onto different subspaces. The two-class results are combined by resorting to either a Voting, a Weighting, or a Decision Tree scheme. The ensemble methods were tested on benchmark datasets, demonstrating that the proposed approach is not only efficient but also yields an accuracy comparable to that obtained by the optimal Bayes classifier.
Keywords: Linear Dimensionality Reduction · Fisher’s Discriminant Analysis · Heteroscedastic Discriminant Analysis · Chernoff Distance
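To illustrate the pairwise scheme described in the abstract, the following is a minimal sketch of a pairwise reducer with a Voting combination. It is not the paper's method: as an assumption, the ordinary Fisher direction w = S_w^{-1}(m_a - m_b) stands in for the Chernoff-based heteroscedastic criterion, and the class names (`fisher_direction`, `PairwiseVotingReducer`) are hypothetical.

```python
import numpy as np
from itertools import combinations

def fisher_direction(Xa, Xb):
    # Fisher direction w = Sw^{-1}(ma - mb); a stand-in for the
    # Chernoff-based two-class criterion used in the paper.
    ma, mb = Xa.mean(axis=0), Xb.mean(axis=0)
    Sw = np.cov(Xa, rowvar=False) + np.cov(Xb, rowvar=False)
    # Small ridge term keeps Sw invertible for near-singular data.
    return np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), ma - mb)

class PairwiseVotingReducer:
    """One 1-D reduction per class pair; final label by majority vote."""

    def fit(self, X, y):
        self.models = {}
        for a, b in combinations(np.unique(y), 2):
            Xa, Xb = X[y == a], X[y == b]
            w = fisher_direction(Xa, Xb)
            # Threshold halfway between the projected class means.
            # With w = Sw^{-1}(ma - mb), class a projects above t.
            t = 0.5 * ((Xa @ w).mean() + (Xb @ w).mean())
            self.models[(a, b)] = (w, t)
        return self

    def predict(self, X):
        classes = sorted({c for pair in self.models for c in pair})
        idx = {c: i for i, c in enumerate(classes)}
        votes = np.zeros((X.shape[0], len(classes)))
        for (a, b), (w, t) in self.models.items():
            proj = X @ w
            votes[proj > t, idx[a]] += 1
            votes[proj <= t, idx[b]] += 1
        return np.array(classes)[votes.argmax(axis=1)]
```

For c classes this trains c(c-1)/2 one-dimensional reductions; the Weighting and Decision Tree combinations mentioned in the abstract would replace the plain vote count with weighted scores or a tree over the pairwise outputs.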