Riemannian Manifold Learning for Nonlinear Dimensionality Reduction

  • Tony Lin
  • Hongbin Zha
  • Sang Uk Lee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3951)

Abstract

In recent years, nonlinear dimensionality reduction (NLDR) techniques have attracted much attention in visual perception and many other areas of science. We propose an efficient algorithm called Riemannian manifold learning (RML). The Riemannian manifold is first reconstructed in the form of a simplicial complex, from which its intrinsic dimension can be reliably estimated. The NLDR problem is then solved by constructing Riemannian normal coordinates (RNC). Experimental results demonstrate that our algorithm can learn the data's intrinsic geometric structure, yielding uniformly distributed and well-organized low-dimensional embeddings.
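
As a rough illustration of the RNC idea sketched above, the Python snippet below maps samples to polar-like coordinates around a chosen base point: the geodesic radius is approximated by shortest paths on a k-nearest-neighbor graph, and the direction by projecting the first step of each geodesic onto a tangent plane estimated with local PCA. This is only a minimal sketch, not the RML algorithm of the paper; the function name rnc_sketch, the parameters k and d, and the use of NumPy, SciPy, and scikit-learn are assumptions made for the sake of a runnable example.

# Minimal, illustrative sketch of the Riemannian-normal-coordinates (RNC) idea:
# embed each sample by (geodesic radius from a base point) x (unit direction in
# the base point's tangent plane). NOT the authors' RML algorithm; all names
# and parameters here are illustrative assumptions.
import numpy as np
from scipy.sparse.csgraph import dijkstra
from sklearn.neighbors import kneighbors_graph


def rnc_sketch(X, d=2, k=8, base=0):
    """Embed X (n x D) into R^d around the sample X[base]."""
    n = X.shape[0]

    # 1. k-NN graph whose shortest paths approximate geodesic distances.
    G = kneighbors_graph(X, n_neighbors=k, mode="distance")
    radius, pred = dijkstra(G, directed=False, indices=base,
                            return_predecessors=True)

    # 2. Tangent plane at the base point via local PCA of its k neighbors.
    idx = np.argsort(np.linalg.norm(X - X[base], axis=1))[1:k + 1]
    _, _, Vt = np.linalg.svd(X[idx] - X[base], full_matrices=False)
    T = Vt[:d]                       # rows span the estimated tangent space

    # 3. Direction of each point = first edge of its shortest path from the
    #    base point, projected onto the tangent plane and normalized.
    Y = np.zeros((n, d))
    for i in range(n):
        if i == base or not np.isfinite(radius[i]):
            continue
        j = i
        while pred[j] != base:       # walk the shortest path back to base
            j = pred[j]
        v = T @ (X[j] - X[base])     # initial geodesic direction at base
        norm = np.linalg.norm(v)
        if norm > 0:
            Y[i] = radius[i] * v / norm   # polar form: radius * unit direction
    return Y


if __name__ == "__main__":
    # Toy data: a noisy 3-D arc, embedded into 2-D.
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 3, size=(300, 1))
    X = np.hstack([np.cos(t), np.sin(t), 0.1 * rng.normal(size=(300, 1))])
    Y = rnc_sketch(X, d=2, k=8)
    print(Y.shape)  # (300, 2)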

Keywords

Riemannian Manifold · Simplicial Complex · Edge Point · Neural Information Processing System · Face Data

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Tony Lin¹
  • Hongbin Zha¹
  • Sang Uk Lee²

  1. National Laboratory on Machine Perception, Peking University, Beijing, China
  2. School of Electrical Engineering, Seoul National University, Seoul, Korea
