Nonlinear Dimensionality Reduction Using Circuit Models

  • Fredrik Andersson
  • Jens Nilsson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3540)

Abstract

The problem addressed in nonlinear dimensionality reduction is to find lower-dimensional configurations of high-dimensional data, thereby revealing underlying structure. One popular method in this regard is the Isomap algorithm, in which local neighbourhood information is used to approximate geodesic distances. From these distance estimates, lower-dimensional representations that are accurate on a global scale are obtained by multidimensional scaling. This global approximation property sets Isomap apart from many competing methods, which approximate only locally.
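
For concreteness, the following is a minimal sketch of the Isomap pipeline just described (k-nearest-neighbour adjacency graph, shortest-path geodesic estimates, classical multidimensional scaling), written with NumPy and SciPy. The neighbourhood size k, the number of output components, and the function name are illustrative assumptions, not taken from the paper.

```python
# Minimal Isomap sketch: kNN graph -> shortest-path geodesics -> classical MDS.
# Parameter choices (k, n_components) are illustrative assumptions.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import shortest_path

def isomap(X, k=10, n_components=2):
    """Embed X (n_samples x n_features) into n_components dimensions."""
    n = X.shape[0]
    D = squareform(pdist(X))                     # pairwise Euclidean distances

    # 1. Adjacency graph: connect each point to its k nearest neighbours,
    #    using Euclidean edge lengths; np.inf marks non-edges.
    W = np.full((n, n), np.inf)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]
        W[i, nbrs] = D[i, nbrs]
    W = np.minimum(W, W.T)                       # symmetrize the graph

    # 2. Geodesic distances approximated by shortest paths in the graph.
    #    (A disconnected graph yields infinite entries here, one symptom of
    #    the parameter sensitivity discussed in the abstract.)
    G = shortest_path(W, method='D', directed=False)

    # 3. Classical multidimensional scaling on the geodesic distance matrix.
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (G ** 2) @ J                  # double-centred squared distances
    eigvals, eigvecs = np.linalg.eigh(B)
    top = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, top] * np.sqrt(np.maximum(eigvals[top], 0.0))
```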

A serious drawback of Isomap is that it is topologically unstable: incorrectly chosen algorithm parameters or perturbations of the data may abruptly alter the resulting configurations. To handle this problem, we propose new methods for more robust approximation of the geodesic distances, based on a viewpoint of electric circuits. The robustness is validated by experiments. With this approach we achieve both the stability of local methods and the global approximation property of global methods.
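
The abstract does not spell out the circuit construction; the exact estimator is given in the paper itself. As a rough illustration of what an electric-circuit viewpoint on the adjacency graph can look like, the sketch below computes effective resistances from the pseudoinverse of the weighted graph Laplacian, treating each edge as a resistor. This is a generic resistance-distance computation under assumed conductance weights, not necessarily the authors' estimator.

```python
# Illustration only: effective resistances on a weighted adjacency graph,
# obtained from the pseudoinverse of the graph Laplacian. Treating edges as
# resistors lets every connecting path contribute to the distance estimate;
# the choice of conductances is an assumption, and the authors' actual
# construction may differ.
import numpy as np

def effective_resistance(W):
    """W: symmetric (n x n) matrix of edge conductances, zero for non-edges."""
    L = np.diag(W.sum(axis=1)) - W      # weighted graph Laplacian
    L_pinv = np.linalg.pinv(L)          # Moore-Penrose pseudoinverse
    d = np.diag(L_pinv)
    # R_ij = L+_ii + L+_jj - 2 * L+_ij  (effective resistance between i and j)
    return d[:, None] + d[None, :] - 2.0 * L_pinv
```

Such resistance-based distances could then replace the shortest-path matrix in the classical MDS step of the Isomap sketch above. Because every path between two nodes contributes to the effective resistance, this kind of estimate tends to degrade more gracefully under perturbations than a single shortest path, which is one way to read the robustness claim above.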

Keywords

Root Mean Square Error · Root Mean Square · Circuit Model · Geodesic Distance · Adjacency Graph

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Fredrik Andersson¹
  • Jens Nilsson¹

  1. Centre for Mathematical Sciences, Lund University/LTH, Lund, Sweden
