
Parameterless Isomap with Adaptive Neighborhood Selection

Conference paper
Pattern Recognition (DAGM 2006)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4174)


Abstract

Isomap is a highly popular manifold learning and dimensionality reduction technique that effectively performs multidimensional scaling on estimates of geodesic distances. However, the resulting output is extremely sensitive to parameters that control the selection of neighbors at each point. To date, no principled way of setting these parameters has been proposed, and in practice they are often tuned ad hoc, sometimes empirically based on prior knowledge of the desired output. In this paper we propose a parameterless technique that adaptively defines the neighborhood at each input point based on intrinsic dimensionality and local tangent orientation. In addition to eliminating the guesswork associated with parameter configuration, the adaptive nature of this technique enables it to select optimal neighborhoods locally at each point, resulting in superior performance.
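To make the pipeline the abstract refers to concrete, here is a minimal sketch of standard (non-adaptive) Isomap, assuming a fixed neighborhood size `k` — exactly the parameter the paper proposes to eliminate. The function name `isomap` and the choice of Floyd-Warshall for shortest paths are illustrative, not taken from the paper.

```python
import numpy as np

def isomap(X, n_components=2, k=6):
    """Sketch of classical Isomap: k-NN graph -> geodesic estimates -> MDS.

    X: (n, d) array of input points. Assumes the k-NN graph is connected.
    """
    n = len(X)
    # Pairwise Euclidean distances.
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    # k-NN graph: keep only edges to each point's k nearest neighbours.
    G = np.full((n, n), np.inf)
    np.fill_diagonal(G, 0.0)
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]
        G[i, nbrs] = D[i, nbrs]
        G[nbrs, i] = D[i, nbrs]  # symmetrise
    # Floyd-Warshall all-pairs shortest paths approximate geodesic distances.
    for m in range(n):
        G = np.minimum(G, G[:, m:m + 1] + G[m:m + 1, :])
    # Classical MDS on the squared geodesic distance matrix.
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (G ** 2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:n_components]     # largest eigenvalues first
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
```

The fixed `k` here is global: too small and the graph disconnects, too large and "short-circuit" edges cut across the manifold. The paper's contribution is to replace this single global setting with a neighborhood chosen per point from local intrinsic dimensionality and tangent orientation.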






Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mekuz, N., Tsotsos, J.K. (2006). Parameterless Isomap with Adaptive Neighborhood Selection. In: Franke, K., Müller, KR., Nickolay, B., Schäfer, R. (eds) Pattern Recognition. DAGM 2006. Lecture Notes in Computer Science, vol 4174. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11861898_37


  • DOI: https://doi.org/10.1007/11861898_37

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44412-1

  • Online ISBN: 978-3-540-44414-5

