Abstract
Isomap is a highly popular manifold learning and dimensionality reduction technique that effectively performs multidimensional scaling on estimates of geodesic distances. However, the resulting output is extremely sensitive to parameters that control the selection of neighbors at each point. To date, no principled way of setting these parameters has been proposed, and in practice they are often tuned ad hoc, sometimes empirically based on prior knowledge of the desired output. In this paper we propose a parameterless technique that adaptively defines the neighborhood at each input point based on intrinsic dimensionality and local tangent orientation. In addition to eliminating the guesswork associated with parameter configuration, the adaptive nature of this technique enables it to select optimal neighborhoods locally at each point, resulting in superior performance.
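To fix ideas, the sketch below shows the standard Isomap baseline the abstract refers to: build a k-nearest-neighbor graph, estimate geodesic distances by shortest paths over that graph, and apply classical multidimensional scaling to the result. The fixed neighborhood size k here is exactly the parameter the proposed adaptive method eliminates; this is a minimal illustration assuming numpy and scipy are available, not the authors' algorithm.

import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import shortest_path

def isomap(X, k=8, d=2):
    """Baseline Isomap with a fixed neighborhood size k (illustrative sketch).

    Assumes the resulting k-NN graph is connected.
    """
    n = X.shape[0]
    D = cdist(X, X)                        # pairwise Euclidean distances
    # Build a symmetric k-nearest-neighbor graph weighted by Euclidean distance.
    W = np.full((n, n), np.inf)            # inf marks "no edge"
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]   # skip the point itself
        W[i, nbrs] = D[i, nbrs]
    W = np.minimum(W, W.T)                 # symmetrize the graph
    # Geodesic distance estimates: shortest paths along the neighborhood graph.
    G = shortest_path(W, method="D", directed=False)
    # Classical MDS on the squared geodesic distances.
    H = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    B = -0.5 * H @ (G ** 2) @ H
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:d]       # top-d eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))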
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Mekuz, N., Tsotsos, J.K. (2006). Parameterless Isomap with Adaptive Neighborhood Selection. In: Franke, K., Müller, K.R., Nickolay, B., Schäfer, R. (eds.) Pattern Recognition. DAGM 2006. Lecture Notes in Computer Science, vol. 4174. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11861898_37
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44412-1
Online ISBN: 978-3-540-44414-5
eBook Packages: Computer Science (R0)