On Optimizing Kernel-Based Fisher Discriminant Analysis Using Prototype Reduction Schemes

  • Sang-Woon Kim
  • B. John Oommen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4109)


Fisher’s Linear Discriminant Analysis (LDA) is a traditional dimensionality reduction method whose success has been demonstrated for decades. Numerous variants, such as Kernel-based Fisher Discriminant Analysis (KFDA), have been proposed to extend LDA’s power to nonlinear discriminants. Though effective, KFDA is computationally expensive, since its complexity grows with the size of the data set. In this paper, we suggest a novel strategy to speed up the computation for an entire family of KFDAs. Rather than invoke the KFDA on the entire data set, we advocate that the data first be reduced to a smaller representative subset using a Prototype Reduction Scheme (PRS), and that dimensionality reduction then be achieved by invoking a KFDA on this reduced set. In this way, data points that contribute little to the dimensionality reduction and classification can be eliminated, yielding a significantly smaller kernel matrix, K, without degrading performance. Our experimental results demonstrate that the proposed mechanism dramatically reduces the computation time without sacrificing classification accuracy on artificial and real-life data sets.
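The two-stage pipeline described in the abstract (prototype reduction first, then KFDA on the reduced set) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the per-class k-means reduction is a simple stand-in for the PRSs actually studied (e.g., CNN- or LVQ3-based schemes), the two-class KFDA follows the standard Mika-style formulation with a ridge-regularized within-class scatter, and all function names, the RBF kernel choice, and parameter values are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def reduce_prototypes(X, y, per_class=10, iters=10, seed=0):
    """Crude PRS stand-in: per-class k-means centroids serve as prototypes."""
    rng = np.random.default_rng(seed)
    Xs, ys = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        k = min(per_class, len(Xc))
        centers = Xc[rng.choice(len(Xc), size=k, replace=False)].copy()
        for _ in range(iters):  # a few Lloyd iterations
            labels = ((Xc[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
            for j in range(k):
                members = Xc[labels == j]
                if len(members):
                    centers[j] = members.mean(axis=0)
        Xs.append(centers)
        ys.append(np.full(k, c))
    return np.vstack(Xs), np.concatenate(ys)

def kfda_fit(Xp, yp, gamma=0.5, reg=1e-3):
    """Two-class KFDA on the (reduced) prototype set, labels assumed in {0, 1}.

    Solves alpha = (N + reg*I)^{-1} (M1 - M0), where M_i are the kernelized
    class means and N is the within-class scatter in feature space.
    """
    K = rbf_kernel(Xp, Xp, gamma)          # m x m instead of n x n
    m = len(Xp)
    idx = [np.where(yp == c)[0] for c in (0, 1)]
    M = [K[:, i].mean(axis=1) for i in idx]
    N = np.zeros((m, m))
    for i in idx:                          # accumulate within-class scatter
        Kc = K[:, i]
        l = len(i)
        N += Kc @ (np.eye(l) - np.ones((l, l)) / l) @ Kc.T
    return np.linalg.solve(N + reg * np.eye(m), M[1] - M[0])

def kfda_project(Xnew, Xp, alpha, gamma=0.5):
    """1-D projection of new points onto the kernel discriminant direction."""
    return rbf_kernel(Xnew, Xp, gamma) @ alpha
```

The computational point of the paper is visible in `kfda_fit`: with all n training points, the method needs an n×n kernel matrix and an O(n³) solve; running it on m ≪ n prototypes shrinks both to m.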


Keywords: Training Sample · Linear Discriminant Analysis · Near Neighbor · Kernel Matrix · Kernel Principal Component Analysis



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Sang-Woon Kim (1)
  • B. John Oommen (2)
  1. Dept. of Computer Science and Engineering, Myongji University, Yongin, Korea
  2. School of Computer Science, Carleton University, Ottawa, Canada
