On the Performance of Chernoff-Distance-Based Linear Dimensionality Reduction Techniques
We present a performance analysis of three linear dimensionality reduction techniques: Fisher's discriminant analysis (FDA) and two recently introduced methods based on the Chernoff distance between two distributions: the Loog and Duin (LD) method, which maximizes a criterion derived from the Chernoff distance in the original space, and the Rueda and Herrera (RH) method, which maximizes the Chernoff distance in the transformed space. A comprehensive performance analysis of these methods, combined with two well-known classifiers (linear and quadratic) on synthetic and real-life data, shows that LD and RH outperform FDA, especially with the quadratic classifier, which is strongly related to the Chernoff distance in the transformed space. With the linear classifier, RH is likewise shown to be superior to the other two methods.
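As an illustration of the criterion underlying the LD and RH methods, the Chernoff distance between two Gaussian densities has a well-known closed form (see, e.g., Duda, Hart and Stork). The sketch below is illustrative only, not the authors' implementation; the function name and interface are assumptions.

```python
import numpy as np

def chernoff_distance(mu1, S1, mu2, S2, s=0.5):
    """Chernoff distance between Gaussians N(mu1, S1) and N(mu2, S2)
    for a mixing parameter s in (0, 1); s = 0.5 yields the
    Bhattacharyya distance as a special case."""
    mu1, mu2 = np.asarray(mu1, dtype=float), np.asarray(mu2, dtype=float)
    S1, S2 = np.asarray(S1, dtype=float), np.asarray(S2, dtype=float)
    Sw = s * S1 + (1 - s) * S2                # interpolated covariance
    d = mu2 - mu1
    # Mahalanobis-like term weighted by s(1 - s)
    term1 = 0.5 * s * (1 - s) * d @ np.linalg.solve(Sw, d)
    # log-determinant term penalizing covariance mismatch
    term2 = 0.5 * np.log(np.linalg.det(Sw)
                         / (np.linalg.det(S1) ** s
                            * np.linalg.det(S2) ** (1 - s)))
    return term1 + term2
```

The distance is zero for identical distributions and grows with both mean separation and covariance mismatch, which is why maximizing it in the transformed space favors class separability for the quadratic classifier.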
Keywords: Linear Discriminant Analysis · Machine Intelligence · Back Propagation Neural Network · Original Space · Lower Error Rate