Abstract
The aim of this paper is to present a strategy by which a new philosophy for pattern classification pertaining to dissimilarity-based classifiers (DBCs) can be efficiently implemented. Proposed by Duin and his co-authors, DBCs are a way of defining classifiers among classes; they are based not on the feature measurements of individual patterns, but rather on a suitable dissimilarity measure among the patterns. The difficulty with this strategy is that it requires a representative set of prototypes that is both compact and capable of representing the entire data set. However, finding the optimal number of prototypes is difficult, and the prototype selection stage may discard information that is useful for discrimination. To avoid these problems, we propose an alternative approach in which all available samples from the training set are used as prototypes and a dimensionality reduction scheme is subsequently applied. That is, rather than selecting representative prototypes directly from the training samples, we compute the dissimilarity matrix with the entire training set and then reduce its dimensionality. Our experimental results demonstrate that the proposed mechanism improves the classification accuracy of conventional approaches on two real-life benchmark databases.
This work was supported by the Korea Research Foundation Grant funded by the Korean Government (MOEHRD-KRF-2007-211-D00109).
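To make the proposed pipeline concrete, the following is a minimal sketch, not the authors' implementation: Euclidean distance, PCA, and a 1-nearest-neighbor classifier are illustrative stand-ins for the dissimilarity measure, dimensionality reduction scheme, and classifier studied in the paper, and the toy data and variable names are hypothetical.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Toy two-class data standing in for a real benchmark set.
X_train = np.vstack([rng.normal(0, 1, (50, 10)),
                     rng.normal(2, 1, (50, 10))])
y_train = np.array([0] * 50 + [1] * 50)
X_test = np.vstack([rng.normal(0, 1, (20, 10)),
                    rng.normal(2, 1, (20, 10))])
y_test = np.array([0] * 20 + [1] * 20)

# Step 1: use ALL training samples as prototypes. Each sample is
# represented by its vector of dissimilarities to every training
# sample (Euclidean distance is a stand-in for the paper's measure).
D_train = cdist(X_train, X_train)   # shape (n, n)
D_test = cdist(X_test, X_train)     # shape (m, n)

# Step 2: reduce the n-dimensional dissimilarity space instead of
# selecting a prototype subset (PCA is one possible scheme).
reducer = PCA(n_components=5).fit(D_train)
Z_train = reducer.transform(D_train)
Z_test = reducer.transform(D_test)

# Step 3: train a conventional classifier in the reduced space.
clf = KNeighborsClassifier(n_neighbors=1).fit(Z_train, y_train)
print("accuracy:", clf.score(Z_test, y_test))
```

The point of the sketch is that no prototype subset is ever chosen: the full n-by-n dissimilarity matrix is computed first, and the reduction scheme, rather than a selection heuristic, decides which information to keep.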
References
Jain, A.K., Duin, R.P.W., Mao, J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. and Machine Intell. 22(1), 4–37 (2000)
Pekalska, E., Duin, R.P.W.: The Dissimilarity Representation for Pattern Recognition: Foundations and Applications. World Scientific Publishing, Singapore (2005)
Pekalska, E., Duin, R.P.W., Paclik, P.: Prototype selection for dissimilarity-based classifiers. Pattern Recognition 39, 189–208 (2006)
Kim, S.-W.: Optimizing dissimilarity-based classifiers using a newly modified Hausdorff distance. In: Hoffmann, A., Kang, B.-h., Richards, D., Tsumoto, S. (eds.) PKAW 2006. LNCS (LNAI), vol. 4303, pp. 177–186. Springer, Heidelberg (2006)
Kim, S.-W., Oommen, B.J.: On using prototype reduction schemes to optimize dissimilarity-based classification. Pattern Recognition 40, 2946–2957 (2007)
Riesen, K., Kilchherr, V., Bunke, H.: Reducing the dimensionality of vector space embeddings of graphs. In: Perner, P. (ed.) MLDM 2007. LNCS (LNAI), vol. 4571, pp. 563–573. Springer, Heidelberg (2007)
Bunke, H., Riesen, K.: A family of novel graph kernels for structural pattern recognition. In: Rueda, L., Mery, D., Kittler, J. (eds.) CIARP 2007. LNCS, vol. 4756, pp. 20–31. Springer, Heidelberg (2007)
Belhumeur, P.N., Hespanha, J.P., Kriegman, D.J.: Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Trans. Pattern Anal. and Machine Intell. 19(7), 711–720 (1997)
Adini, Y., Moses, Y., Ullman, S.: Face recognition: The problem of compensating for changes in illumination direction. IEEE Trans. Pattern Anal. and Machine Intell. 19(7), 721–732 (1997)
Yu, H., Yang, J.: A direct LDA algorithm for high-dimensional data - with application to face recognition. Pattern Recognition 34, 2067–2070 (2001)
Yang, M.-H.: Kernel eigenfaces vs. kernel Fisherfaces: Face recognition using kernel methods. In: Proceedings of 5th IEEE Int. Conf. on Automatic Face and Gesture Recognition, pp. 215–220 (2002)
Cevikalp, H., Neamtu, M., Wilkes, M., Barkana, A.: Discriminative common vectors for face recognition. IEEE Trans. Pattern Anal. and Machine Intell. 27(1), 4–13 (2005)
Loog, M., Duin, R.P.W.: Linear dimensionality reduction via a heteroscedastic extension of LDA: The Chernoff criterion. IEEE Trans. Pattern Anal. and Machine Intell. 26(6), 732–739 (2004)
Rueda, L., Herrera, M.: A new approach to multi-class linear dimensionality reduction. In: Martínez-Trinidad, J.F., Carrasco Ochoa, J.A., Kittler, J. (eds.) CIARP 2006. LNCS, vol. 4225, pp. 634–643. Springer, Heidelberg (2006)
Zhang, S., Sim, T.: Discriminant subspace analysis: A Fukunaga-Koontz approach. IEEE Trans. Pattern Anal. and Machine Intell. 29(10), 1732–1745 (2007)
Roweis, S., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500), 2323–2326 (2000)
Kim, T.-K., Kittler, J.: Locally linear discriminant analysis for multimodally distributed classes for face recognition with a single model image. IEEE Trans. Pattern Anal. and Machine Intell. 27(3), 318–327 (2005)
Fraley, C., Raftery, A.E.: How many clusters? Which clustering method? Answers via model-based cluster analysis. The Computer Journal 41(8), 578–588 (1998)
Halbe, Z., Aladjem, M.: Model-based mixture discriminant analysis - An experimental study. Pattern Recognition 38, 437–440 (2005)
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Kim, S.-W., Gao, J. (2008). On Using Dimensionality Reduction Schemes to Optimize Dissimilarity-Based Classifiers. In: Ruiz-Shulcloper, J., Kropatsch, W.G. (eds.) Progress in Pattern Recognition, Image Analysis and Applications. CIARP 2008. Lecture Notes in Computer Science, vol. 5197. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-85920-8_38
DOI: https://doi.org/10.1007/978-3-540-85920-8_38
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-85919-2
Online ISBN: 978-3-540-85920-8