Discriminant Analysis on Embedded Manifold

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3021)


Previous manifold learning algorithms mainly focus on uncovering the low-dimensional geometric structure of a set of samples that lie on or near a manifold, in an unsupervised manner. However, the representations obtained from unsupervised learning are not always optimal in discriminating capability. In this paper, a novel algorithm is introduced to conduct discriminant analysis in terms of the embedded manifold structure. We propose a novel clustering algorithm, called Intra-Cluster Balanced K-Means (ICBKM), which ensures that the classes have balanced numbers of samples within each cluster; the local discriminative features for all clusters are then calculated simultaneously by following the global Fisher criterion. Compared to traditional linear/kernel discriminant analysis algorithms, ours has the following characteristics: 1) it is approximately a locally linear yet globally nonlinear discriminant analyzer; 2) it can be considered a special kernel DA with a geometry-adaptive kernel, in contrast to traditional KDA, whose kernel is independent of the samples; and 3) its computation and memory costs are greatly reduced compared to traditional KDA, especially for cases with a large number of samples, since it does not need to store the original samples to compute the low-dimensional representation of new data. The evaluation on a toy problem shows that it is effective in deriving discriminative representations for problems with a nonlinear classification boundary. When applied to face recognition on the YALE and PIE databases, the proposed algorithm significantly outperforms both LDA and traditional KDA.
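The abstract does not specify the ICBKM algorithm in detail. As a rough illustration of the "locally linear yet globally nonlinear" idea, the sketch below clusters an XOR-style toy set with plain two-cluster k-means (not the authors' balanced variant) and fits a separate Fisher discriminant in each cluster; the data layout, function names, and the nearest-projected-mean classification rule are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def blob(cx, cy, n=50):
    return rng.normal([cx, cy], 0.4, size=(n, 2))

# XOR-style layout: no single global linear discriminant separates the
# classes, but each local cluster is linearly separable.
X = np.vstack([blob(0, 0), blob(10, 5),    # class 0
               blob(0, 5), blob(10, 0)])   # class 1
y = np.array([0] * 100 + [1] * 100)

def kmeans2(X, iters=30):
    """Two-cluster k-means with farthest-point initialization; a plain
    stand-in for the paper's Intra-Cluster Balanced K-Means, which
    additionally keeps class counts balanced inside each cluster."""
    c = np.stack([X[0], X[((X - X[0]) ** 2).sum(1).argmax()]])
    for _ in range(iters):
        labels = ((X[:, None] - c[None]) ** 2).sum(-1).argmin(1)
        c = np.stack([X[labels == j].mean(0) for j in range(2)])
    return labels

def fisher_accuracy(X, y):
    """Fit a two-class Fisher direction w = Sw^{-1}(m1 - m0), then score
    nearest-projected-class-mean classification on the same data."""
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T) + 1e-6 * np.eye(2)
    w = np.linalg.solve(Sw, X[y == 1].mean(0) - X[y == 0].mean(0))
    p = X @ w
    m0, m1 = p[y == 0].mean(), p[y == 1].mean()
    pred = (np.abs(p - m1) < np.abs(p - m0)).astype(int)
    return (pred == y).mean()

global_acc = fisher_accuracy(X, y)   # one global linear model: near chance

labels = kmeans2(X)                  # local linear models per cluster
local_acc = sum(fisher_accuracy(X[labels == j], y[labels == j])
                * (labels == j).sum() for j in range(2)) / len(X)
```

On this toy set the global Fisher direction is nearly useless (both class means coincide), while the per-cluster discriminants separate the classes almost perfectly, mirroring the abstract's toy-problem claim.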


Keywords: Discriminant Analysis · Face Recognition · Near Neighbor · Locality Preserving Projection · Nonlinear Dimensionality Reduction
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


References

  1. Belkin, M., Niyogi, P.: Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering. In: Advances in Neural Information Processing Systems 15, Vancouver, British Columbia, Canada (2001)
  2. Bregler, C., Omohundro, S.M.: Nonlinear manifold learning for visual speech recognition. In: Fifth International Conference on Computer Vision, June 20-23 (1995)
  3. Brand, M.: Charting a manifold. In: Advances in Neural Information Processing Systems 15 (2002)
  4. Cheung, D.W., Lee, S.D., Xiao, Y.: Effect of Data Skewness and Workload Balance in Parallel Data Mining. IEEE Transactions on Knowledge and Data Engineering 14(3), 498–513 (2002)
  5. Chung, F.R.K.: Spectral Graph Theory. In: Regional Conference Series in Mathematics, vol. 92 (1997)
  6. Freedman, D.: Efficient simplicial reconstructions of manifolds from their samples. IEEE Trans. on PAMI 24(10), 1349–1357 (2002)
  7. Gomes, J., Mojsilovic, A.: A variational approach to recovering a manifold from sample points. In: Heyden, A., Sparr, G., Nielsen, M., Johansen, P. (eds.) ECCV 2002. LNCS, vol. 2351, pp. 18–30. Springer, Heidelberg (2002)
  8. He, X., Niyogi, P.: Locality Preserving Projections (LPP). TR-2002-09 (2002)
  9. Hinton, G., Roweis, S.T.: Stochastic Neighbor Embedding. In: Advances in Neural Information Processing Systems 15 (2002)
  10. Liu, Q., Huang, R., Lu, H., Ma, S.: Face Recognition Using Kernel Based Fisher Discriminant Analysis. In: Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition, May 20-21, p. 197 (2002)
  11. MacQueen, J.: On convergence of k-means and partitions with minimum average variance. Ann. Math. Statist. 36 (1965)
  12. Roweis, S.T., Saul, L.K.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science 290 (December 22, 2000)
  13. Roweis, S., Saul, L., Hinton, G.: Global Coordination of Local Linear Models. In: Advances in Neural Information Processing Systems 14 (2001)
  14. Shi, J., Malik, J.: Normalized Cuts and Image Segmentation. IEEE Trans. on Pattern Analysis and Machine Intelligence 22, 888–905 (2000)
  15. Sim, T., Baker, S., Bsat, M.: The CMU Pose, Illumination, and Expression (PIE) Database. In: Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition (May 2002)
  16. Tenenbaum, J.B., de Silva, V., Langford, J.C.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290 (December 22, 2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  1. Microsoft Research Asia, Beijing, P.R. China
  2. School of Mathematical Sciences, Peking University, Beijing, P.R. China