A Fast Feature Extraction Method for Kernel 2DPCA

  • Ning Sun
  • Hai-xian Wang
  • Zhen-hai Ji
  • Cai-rong Zou
  • Li Zhao
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4113)

Abstract

Recently, a new approach called two-dimensional principal component analysis (2DPCA) has been proposed for face representation and recognition. The essence of 2DPCA is that it computes the eigenvectors of the so-called image covariance matrix without matrix-to-vector conversion. Kernel principal component analysis (KPCA) is a nonlinear generalization of the popular principal component analysis via the kernel trick. Similarly, kernelizing 2DPCA can help uncover nonlinear structures in the input data. However, the standard K2DPCA suffers from a heavy computational burden because it operates on the image matrices directly. In this paper, we propose an efficient algorithm to speed up the training procedure of K2DPCA. Experimental results on face recognition show that the proposed algorithm achieves much higher computational efficiency and markedly reduces memory consumption compared with the standard K2DPCA.
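To make the 2DPCA step concrete, here is a minimal NumPy sketch of the standard (linear) 2DPCA procedure the abstract describes: build the image covariance matrix directly from image matrices, take its leading eigenvectors, and project each image onto them. The toy data, sizes, and variable names are illustrative assumptions, not the paper's K2DPCA speed-up itself.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "face images": N samples of size m x n (random stand-in data)
N, m, n = 20, 8, 6
images = rng.standard_normal((N, m, n))

# Center the image set
mean_img = images.mean(axis=0)
centered = images - mean_img

# Image covariance matrix (n x n), computed from the image matrices
# directly -- no matrix-to-vector conversion
G = sum(A.T @ A for A in centered) / N

# Projection axes: top-d eigenvectors of G
d = 3
eigvals, eigvecs = np.linalg.eigh(G)   # eigenvalues in ascending order
W = eigvecs[:, ::-1][:, :d]            # n x d, largest eigenvalues first

# 2DPCA features: each image maps to an m x d feature matrix
features = np.array([A @ W for A in images])
print(features.shape)  # (20, 8, 3)
```

Because G is only n x n (image width, not pixel count), the eigendecomposition stays small; the kernelized version trades this for large kernel matrices, which is the cost the paper's algorithm targets.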

Keywords

Face Recognition, Recognition Accuracy, Kernel Matrix, Kernel Principal Component Analysis, Kernel Trick

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ning Sun (1, 2)
  • Hai-xian Wang (1)
  • Zhen-hai Ji (1, 2)
  • Cai-rong Zou (2)
  • Li Zhao (1, 2)
  1. Research Center of Learning Science, Southeast University, Nanjing, China
  2. Department of Radio Engineering, Southeast University, Nanjing, China