An Empirical Study on the Performance of Spectral Manifold Learning Techniques

  • Peter Mysling
  • Søren Hauberg
  • Kim Steenstrup Pedersen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6791)

Abstract

In recent years, there has been a surge of interest in spectral manifold learning techniques. Despite this interest, little work has focused on the empirical behavior of these techniques. We construct synthetic data of variable complexity and observe the performance of the techniques as they are subjected to increasingly difficult problems. We evaluate performance in terms of both a classification and a regression task. Our study includes Isomap, LLE, Laplacian eigenmaps, and diffusion maps. Among other findings, our results indicate that the techniques are highly dependent on data density, sensitive to scaling, and greatly influenced by intrinsic dimensionality.
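As a rough, self-contained illustration of the kind of spectral technique evaluated in the study, the following is a minimal NumPy sketch of Laplacian eigenmaps (Belkin and Niyogi): build a k-nearest-neighbour graph with Gaussian edge weights, form the normalized graph Laplacian, and embed the data using its smallest nontrivial eigenvectors. This is an illustrative sketch, not the authors' experimental code; the neighbourhood size `k` and kernel width `sigma` chosen here are arbitrary, and are exactly the kind of free parameters whose sensitivity such a study probes.

```python
import numpy as np

def laplacian_eigenmaps(X, n_components=2, k=10, sigma=1.0):
    """Minimal Laplacian eigenmaps sketch: kNN graph with Gaussian
    weights, then the smallest nontrivial eigenvectors of the
    symmetrically normalized graph Laplacian."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # k nearest neighbours per point (column 0 of the argsort is
    # the point itself, so we skip it).
    idx = np.argsort(sq, axis=1)[:, 1:k + 1]
    rows = np.repeat(np.arange(n), k)
    W = np.zeros((n, n))
    W[rows, idx.ravel()] = np.exp(-sq[rows, idx.ravel()] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)  # symmetrize the adjacency
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    # Normalized Laplacian: L = I - D^{-1/2} W D^{-1/2} (symmetric).
    L = np.eye(n) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    # Skip the trivial constant eigenvector (eigenvalue ~ 0) and map
    # back to generalized eigenvectors of (L, D).
    return d_inv_sqrt[:, None] * vecs[:, 1:n_components + 1]

# Usage: embed a noisy circle living in 3D into 2 dimensions.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(t), np.sin(t), 0.1 * rng.standard_normal(200)]
Y = laplacian_eigenmaps(X, n_components=2, k=8)
```

The dense distance matrix and full eigendecomposition keep the sketch short but scale as O(n²) memory and O(n³) time; practical implementations use sparse neighbour graphs and sparse eigensolvers.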

Keywords

Near Neighbour, Regression Task, Regression Measure, Nonlinear Dimensionality Reduction, Intrinsic Dimensionality
These keywords were machine-generated, not supplied by the authors.

References

  1. Roweis, S., Saul, L.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science 290, 2323–2326 (2000)
  2. Tenenbaum, J., de Silva, V., Langford, J.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290, 2319–2323 (2000)
  3. Belkin, M., Niyogi, P.: Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation 15, 1373–1396 (2003)
  4. Brand, M.: Charting a Manifold. In: NIPS, vol. 15, pp. 977–984. IEEE Press, Los Alamitos (2003)
  5. Donoho, D.L., Grimes, C.: Hessian Eigenmaps: Locally Linear Embedding Techniques for High-dimensional Data. In: PNAS, vol. 100, pp. 5591–5596. National Academy of Sciences, Washington (2003)
  6. Zhang, Z., Zha, H.: Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment. SIAM J. Sci. Comput. 26, 313–338 (2004)
  7. Coifman, R.R., Lafon, S.: Diffusion Maps. Applied and Computational Harmonic Analysis 21, 5–30 (2006)
  8. Yeh, M.C., Lee, I.H., Wu, G., Wu, Y., Chang, E.Y.: Manifold Learning, a Promised Land or Work in Progress. In: Proc. of IEEE Intl. Conf. on Multimedia and Expo, pp. 1154–1157. IEEE Press, New York (2005)
  9. Niskanen, M., Silven, O.: Comparison of Dimensionality Reduction Methods for Wood Surface Inspection. In: Proc. of the 6th Intl. Conference on Quality Control by Artificial Vision, pp. 179–188 (2003)
  10. van der Maaten, L., Postma, E.O., van den Herik, H.J.: Dimensionality Reduction: A Comparative Review. Technical report, Tilburg University (2009)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Peter Mysling¹
  • Søren Hauberg¹
  • Kim Steenstrup Pedersen¹

  1. The eScience Center, Dept. of Computer Science, University of Copenhagen, Copenhagen Ø, Denmark