
Extending Kernel Fisher Discriminant Analysis with the Weighted Pairwise Chernoff Criterion

  • Guang Dai
  • Dit-Yan Yeung
  • Hong Chang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3954)

Abstract

Many linear discriminant analysis (LDA) and kernel Fisher discriminant analysis (KFD) methods are based on the restrictive assumption that the data are homoscedastic. In this paper, we propose a new KFD method called heteroscedastic kernel weighted discriminant analysis (HKWDA) which has several appealing characteristics. First, like all kernel methods, it can handle nonlinearity efficiently in a disciplined manner. Second, by incorporating a weighting function that can capture heteroscedastic data distributions into the discriminant criterion, it can work under more realistic situations and hence can further enhance the classification accuracy in many real-world applications. Moreover, it can effectively deal with the small sample size problem. We have performed some face recognition experiments to compare HKWDA with several linear and nonlinear dimensionality reduction methods, showing that HKWDA consistently gives the best results.
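
To make the title criterion concrete, here is a minimal sketch, in the original (linear) input space, of a weighted pairwise Chernoff-style criterion: each pair of classes contributes a Chernoff distance between their Gaussian models, and the pairwise terms are combined through a weighting function that emphasizes poorly separated pairs. The function names, the simple 1/(1+d) weight, and the use of relative class sizes as the Chernoff mixing parameter are illustrative assumptions for this sketch; HKWDA itself applies its weighting in a kernel-induced feature space and is not reproduced here.

```python
import numpy as np

def chernoff_distance(m_i, S_i, m_j, S_j, s=0.5):
    """Chernoff distance between two Gaussian class models N(m_i, S_i) and N(m_j, S_j).

    D(s) = s(1-s)/2 * d^T [s*S_i + (1-s)*S_j]^{-1} d
           + 1/2 * log( |s*S_i + (1-s)*S_j| / (|S_i|^s * |S_j|^(1-s)) ),  with d = m_i - m_j.
    """
    S = s * S_i + (1.0 - s) * S_j
    d = m_i - m_j
    quad = 0.5 * s * (1.0 - s) * d @ np.linalg.solve(S, d)
    _, logdet_S = np.linalg.slogdet(S)
    _, logdet_Si = np.linalg.slogdet(S_i)
    _, logdet_Sj = np.linalg.slogdet(S_j)
    return quad + 0.5 * (logdet_S - s * logdet_Si - (1.0 - s) * logdet_Sj)

def weighted_pairwise_chernoff(X, y, weight=lambda dist: 1.0 / (1.0 + dist)):
    """Weighted sum of pairwise Chernoff distances over all class pairs.

    `weight` is a placeholder weighting function that de-emphasizes already
    well-separated pairs; it is not the weighting function used by HKWDA.
    Assumes every class has several samples and a non-singular covariance.
    """
    X, y = np.asarray(X), np.asarray(y)
    classes = np.unique(y)
    stats = {}
    for c in classes:
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False), len(Xc))
    total = 0.0
    for a in range(len(classes)):
        for b in range(a + 1, len(classes)):
            m_i, S_i, n_i = stats[classes[a]]
            m_j, S_j, n_j = stats[classes[b]]
            s = n_i / (n_i + n_j)  # mixing parameter from relative class sizes
            dist = chernoff_distance(m_i, S_i, m_j, S_j, s)
            total += weight(dist) * dist
    return total
```

Note that in raw face-image space the per-class covariance matrices are typically singular, which is one face of the small sample size problem mentioned above; the kernel formulation and subspace treatment in the paper are what make the criterion usable in that setting, rather than the plain input-space sketch given here.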

Keywords

Face Recognition · Kernel Principal Component Analysis · Discriminatory Information · Discriminant Criterion · Small Sample Size Problem


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Guang Dai (1)
  • Dit-Yan Yeung (1)
  • Hong Chang (1)

  1. Department of Computer Science, Hong Kong University of Science and Technology, Kowloon, Hong Kong
