Dual Unsupervised Discriminant Projection for Face Recognition

  • Lei Tang
  • Jie Gui
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6215)


In this paper, we propose a dual unsupervised discriminant projection (DUDP) method for dimensionality reduction. The proposed method is derived from the efficient unsupervised method called unsupervised discriminant projection (UDP). UDP takes both local and nonlocal characteristics into account, seeking a projection that simultaneously maximizes the nonlocal scatter and minimizes the local scatter. However, UDP adopts a PCA pre-processing step to avoid a singular scatter matrix, discarding the small principal components and thereby losing much potentially valuable discriminant information in the original data. To overcome this problem, our algorithm carries out discriminant analysis in both the null space and the range space, avoiding the loss of discriminant information. The advantage of our algorithm is borne out by comparison with several widely used methods in experiments on the Yale face database.
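The UDP criterion the abstract refers to can be sketched as follows: build a k-nearest-neighbour graph, form the local scatter S_L from neighbouring pairs and the nonlocal scatter S_N as the remainder of the total scatter, then maximize w^T S_N w / w^T S_L w via a generalized eigenproblem. This is a minimal illustrative sketch, not the authors' implementation; the function name `udp_projection`, the neighbourhood size `k`, and the ridge term `eps` (standing in for the PCA step that UDP uses to keep S_L nonsingular) are assumptions.

```python
import numpy as np

def udp_projection(X, k=3, d=2, eps=1e-6):
    """Sketch of the UDP projection. X: (n_samples, n_features) data matrix;
    returns an (n_features, d) projection matrix. `eps` is an illustrative
    regularizer replacing the PCA pre-step of the original method."""
    n, f = X.shape

    # Pairwise squared distances and a symmetric k-NN adjacency matrix H
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    H = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:k + 1]   # skip self at position 0
        H[i, idx] = 1.0
    H = np.maximum(H, H.T)                # symmetrize the graph

    diffs = X[:, None, :] - X[None, :, :]
    # Local scatter: outer products of differences over neighbouring pairs
    S_L = np.einsum('ij,ijk,ijl->kl', H, diffs, diffs) / (2.0 * n * n)
    # Total scatter over all pairs; the nonlocal scatter is the remainder
    S_T = np.einsum('ijk,ijl->kl', diffs, diffs) / (2.0 * n * n)
    S_N = S_T - S_L

    # Maximize w^T S_N w / w^T S_L w: solve (S_L)^{-1} S_N w = lambda w
    # and keep the d eigenvectors with the largest eigenvalues
    M = np.linalg.solve(S_L + eps * np.eye(f), S_N)
    evals, evecs = np.linalg.eig(M)
    order = np.argsort(-evals.real)
    return evecs[:, order[:d]].real
```

The paper's DUDP variant differs in that it avoids the regularization/PCA step entirely by performing the analysis separately in the null space and range space of the local scatter matrix.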


Keywords: Dimensionality reduction · Unsupervised discriminant projection · Subspace learning · Face recognition





Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Lei Tang (1, 2)
  • Jie Gui (1)
  1. Intelligent Computing Lab, Hefei Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei, China
  2. Department of Automation, University of Science and Technology of China, Hefei, China
