Kernel Uncorrelated Discriminant Analysis for Radar Target Recognition

  • Ling Wang
  • Liefeng Bo
  • Licheng Jiao
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4233)


Kernel Fisher discriminant analysis (KFDA) has received extensive study in recent years as a dimensionality reduction technique. KFDA, however, suffers from an intrinsic singularity of the scatter matrices in the feature space, known as the 'small sample size' (SSS) problem, and several methods have been proposed to cope with it. In this paper, kernel uncorrelated discriminant analysis (KUDA) is proposed, which not only handles the SSS problem but also extracts uncorrelated features, a desirable property for many applications. We then conduct a comparative study of KUDA and other variants of KFDA on the radar target recognition problem. The experimental results indicate the effectiveness of KUDA and illustrate the utility of KFDA on this problem.


Keywords: Discriminant Analysis · Linear Discriminant Analysis · Inverse Synthetic Aperture Radar · Cauchy Kernel · Range Profile
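Since the full text is not reproduced here, the following is a minimal, illustrative sketch of two-class kernel Fisher discriminant analysis with Tikhonov regularization, one standard way to sidestep the singular within-class scatter behind the SSS problem mentioned in the abstract. It is not the paper's KUDA algorithm; the RBF kernel, the regularization constant `reg`, and the toy data are assumptions made for this example.

```python
# Minimal two-class KFDA sketch in the dual (kernel) form, with Tikhonov
# regularization of the within-class matrix N to avoid the small-sample-size
# singularity. Illustrative only; not the paper's KUDA method.
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of X and Y (assumed choice)."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kfda_fit(X, y, gamma=0.5, reg=1e-3):
    """Return the dual coefficients alpha of the KFDA projection direction (two classes)."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    classes = np.unique(y)
    # Class-wise kernel mean vectors M_c and the within-class matrix N in dual form.
    M = [K[:, y == c].mean(axis=1) for c in classes]
    N = np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # Regularize N so it is invertible even when n is small (the SSS problem).
    N += reg * np.eye(n)
    # Direction maximizing the Fisher ratio for two classes.
    return np.linalg.solve(N, M[1] - M[0])

def kfda_project(alpha, X_train, X_new, gamma=0.5):
    """Project new samples onto the learned discriminant direction."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy usage: two Gaussian clouds standing in for range-profile feature vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 10)), rng.normal(1.5, 1, (30, 10))])
y = np.array([0] * 30 + [1] * 30)
alpha = kfda_fit(X, y)
z = kfda_project(alpha, X, X)
print("class means in projected space:", z[y == 0].mean(), z[y == 1].mean())
```

Adding reg·I to N is only one of the remedies studied in the KFDA literature; as the abstract notes, KUDA instead addresses the SSS problem while additionally requiring the extracted features to be uncorrelated.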




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ling Wang (1)
  • Liefeng Bo (1)
  • Licheng Jiao (1)
  1. Institute of Intelligent Information Processing, Xidian University, Xi'an, China
