On Using Dimensionality Reduction Schemes to Optimize Dissimilarity-Based Classifiers

  • Sang-Woon Kim
  • Jian Gao
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5197)

Abstract

The aim of this paper is to present a strategy by which a new philosophy for pattern classification pertaining to dissimilarity-based classifiers (DBCs) can be efficiently implemented. Proposed by Duin and his co-authors, DBCs are a way of defining classifiers among classes; they are based not on the feature measurements of individual patterns, but rather on a suitable dissimilarity measure among the patterns. The problem with this strategy is that we need to select a representative set of prototypes that is both compact and capable of representing the entire data set. However, it is difficult to find the optimal number of prototypes and, furthermore, the prototype selection stage may discard information that is useful for discrimination. To avoid these problems, we propose an alternative approach in which we use all available samples from the training set as prototypes and subsequently apply a dimensionality reduction scheme. That is, rather than directly selecting representative prototypes from the training samples, we compute the dissimilarity matrix over the entire training set and then reduce its dimensionality. Our experimental results demonstrate that the proposed mechanism can improve the classification accuracy of conventional approaches on two real-life benchmark databases.
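
In outline, the pipeline is: represent each object by its vector of dissimilarities to all n training samples, project this n-dimensional representation into a lower-dimensional space with a dimensionality reduction scheme, and classify there. The following is a minimal sketch of that pipeline, assuming a Euclidean dissimilarity measure, PCA as the reduction scheme, and a 1-NN classifier in the reduced space; the paper evaluates other measures and reduction schemes, and all function and parameter names here are illustrative rather than taken from the paper.

    # A minimal sketch, not the paper's exact method: Euclidean dissimilarity,
    # PCA as the reduction scheme, 1-NN classification. Names are illustrative.
    from scipy.spatial.distance import cdist
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier

    def fit_dbc(X_train, y_train, n_components=30):
        # Dissimilarity representation: row i holds the distances from
        # training sample i to ALL n training samples, i.e. every sample
        # serves as a prototype (no prototype selection step).
        D = cdist(X_train, X_train, metric='euclidean')
        # Reduce the n-dimensional dissimilarity vectors instead of
        # shrinking the prototype set itself.
        pca = PCA(n_components=min(n_components, len(X_train))).fit(D)
        clf = KNeighborsClassifier(n_neighbors=1)
        clf.fit(pca.transform(D), y_train)
        return pca, clf

    def predict_dbc(pca, clf, X_train, X_test):
        # Map test samples into the same space: distances to the full
        # training (prototype) set, followed by the learned projection.
        D_test = cdist(X_test, X_train, metric='euclidean')
        return clf.predict(pca.transform(D_test))

For appearance-based face recognition, X_train would hold the vectorized face images; swapping PCA for LDA or another reduction scheme changes only the projection step.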

Keywords

Dissimilarity Representation · Dissimilarity-Based Classification · Dimensionality Reduction Schemes · Appearance-Based Face Recognition

References

  1. Jain, A.K., Duin, R.P.W., Mao, J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. Mach. Intell. 22(1), 4–37 (2000)
  2. Pekalska, E., Duin, R.P.W.: The Dissimilarity Representation for Pattern Recognition: Foundations and Applications. World Scientific Publishing, Singapore (2005)
  3. Pekalska, E., Duin, R.P.W., Paclik, P.: Prototype selection for dissimilarity-based classifiers. Pattern Recognition 39, 189–208 (2006)
  4. Kim, S.-W.: Optimizing dissimilarity-based classifiers using a newly modified Hausdorff distance. In: Hoffmann, A., Kang, B.-h., Richards, D., Tsumoto, S. (eds.) PKAW 2006. LNCS (LNAI), vol. 4303, pp. 177–186. Springer, Heidelberg (2006)
  5. Kim, S.-W., Oommen, B.J.: On using prototype reduction schemes to optimize dissimilarity-based classification. Pattern Recognition 40, 2946–2957 (2007)
  6. Riesen, K., Kilchherr, V., Bunke, H.: Reducing the dimensionality of vector space embeddings of graphs. In: Perner, P. (ed.) MLDM 2007. LNCS (LNAI), vol. 4571, pp. 563–573. Springer, Heidelberg (2007)
  7. Bunke, H., Riesen, K.: A family of novel graph kernels for structural pattern recognition. In: Rueda, L., Mery, D., Kittler, J. (eds.) CIARP 2007. LNCS, vol. 4756, pp. 20–31. Springer, Heidelberg (2007)
  8. Belhumeur, P.N., Hespanha, J.P., Kriegman, D.J.: Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Trans. Pattern Anal. Mach. Intell. 19(7), 711–720 (1997)
  9. Adini, Y., Moses, Y., Ullman, S.: Face recognition: The problem of compensating for changes in illumination direction. IEEE Trans. Pattern Anal. Mach. Intell. 19(7), 721–732 (1997)
  10. Yu, H., Yang, J.: A direct LDA algorithm for high-dimensional data - with application to face recognition. Pattern Recognition 34, 2067–2070 (2001)
  11. Yang, M.-H.: Kernel eigenfaces vs. kernel Fisherfaces: Face recognition using kernel methods. In: Proc. 5th IEEE Int. Conf. on Automatic Face and Gesture Recognition, pp. 215–220 (2002)
  12. Cevikalp, H., Neamtu, M., Wilkes, M., Barkana, A.: Discriminative common vectors for face recognition. IEEE Trans. Pattern Anal. Mach. Intell. 27(1), 4–13 (2005)
  13. Loog, M., Duin, R.P.W.: Linear dimensionality reduction via a heteroscedastic extension of LDA: The Chernoff criterion. IEEE Trans. Pattern Anal. Mach. Intell. 26(6), 732–739 (2004)
  14. Rueda, L., Herrera, M.: A new approach to multi-class linear dimensionality reduction. In: Martínez-Trinidad, J.F., Carrasco Ochoa, J.A., Kittler, J. (eds.) CIARP 2006. LNCS, vol. 4225, pp. 634–643. Springer, Heidelberg (2006)
  15. Zhang, S., Sim, T.: Discriminant subspace analysis: A Fukunaga-Koontz approach. IEEE Trans. Pattern Anal. Mach. Intell. 29(10), 1732–1745 (2007)
  16. Roweis, S., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500), 2323–2326 (2000)
  17. Kim, T.-K., Kittler, J.: Locally linear discriminant analysis for multimodally distributed classes for face recognition with a single model image. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 318–327 (2005)
  18. Fraley, C., Raftery, A.E.: How many clusters? Which clustering method? Answers via model-based cluster analysis. The Computer Journal 41(8), 578–588 (1998)
  19. Halbe, Z., Aladjem, M.: Model-based mixture discriminant analysis - An experimental study. Pattern Recognition 38, 437–440 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Sang-Woon Kim (1)
  • Jian Gao (1)

  1. Dept. of Computer Science and Engineering, Myongji University, Yongin, South Korea