Locally Adaptive Nonlinear Dimensionality Reduction

  • Yuexian Hou
  • Hongmin Yang
  • Pilian He
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4293)


Popular nonlinear dimensionality reduction algorithms such as SIE and Isomap share a common difficulty: global neighborhood parameters often fail on data sets with high variation in local manifold structure. To improve the applicability of nonlinear dimensionality reduction in machine learning, this paper proposes an adaptive neighbor-selection scheme based on locally principal direction reconstruction. The method involves two main computation steps. First, it selects an appropriate neighbor set for each data point such that the neighbors approximately form a d-dimensional linear subspace, and it computes the locally principal directions of each neighbor set. Second, it fits each neighbor using the locally principal directions of the corresponding neighbor set and deletes any neighbor whose fitting error exceeds a predefined threshold. Simulations show that the proposed method handles data sets with high variation in local manifold structure effectively. Moreover, compared with other adaptive neighbor-selection strategies, it avoids the false connectivity induced by noise or high local curvature.
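The two steps described in the abstract can be sketched with local PCA: fit a d-dimensional linear subspace to a candidate neighbor set via the top-d principal directions, then discard neighbors whose reconstruction residual exceeds a threshold. This is only an illustrative sketch of the idea, not the authors' implementation; the function name `prune_neighbors` and the threshold parameter `tau` are our own choices.

```python
import numpy as np

def prune_neighbors(X, neighbor_idx, d, tau):
    """Keep only those candidate neighbors that lie within fitting
    error tau of the locally fitted d-dimensional linear subspace.

    Sketch of the locally-principal-direction idea (our naming):
    X            -- (n_points, n_features) data array
    neighbor_idx -- indices of the candidate neighbor set
    d            -- intrinsic dimension of the local linear subspace
    tau          -- fitting-error threshold for deleting neighbors
    """
    pts = X[neighbor_idx]
    center = pts.mean(axis=0)
    centered = pts - center
    # Local PCA: the top-d right singular vectors span the fitted
    # d-dimensional subspace (the locally principal directions).
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    basis = Vt[:d]
    # Fitting error = distance of each neighbor to the subspace,
    # i.e. the norm of the residual after projection onto the basis.
    resid = centered - centered @ basis.T @ basis
    err = np.linalg.norm(resid, axis=1)
    return [j for j, e in zip(neighbor_idx, err) if e <= tau]

# Toy usage: four points on a line plus one off-manifold point.
X = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [1.5, 2.0]])
kept = prune_neighbors(X, [0, 1, 2, 3, 4], d=1, tau=1.0)
# The off-line point (index 4) has a large fitting error and is deleted.
```

In a full pipeline this pruning would run once per data point before building the neighborhood graph handed to Isomap or a similar embedding step, so that noisy or high-curvature neighbors never create shortcut edges.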


Keywords: Principal Direction, Linear Manifold, Nonlinear Dimensionality Reduction, Order Taylor Expansion, Noisy Point




  1. Barlow, H.: Unsupervised Learning. Neural Computation 1, 295–311 (1989)
  2. Marcus, G.: Programs of the Mind. Science 304, 1450–1451 (2004)
  3. Baum, E.: What Is Thought? MIT Press, Cambridge (2004)
  4. Mardia, K.V., Kent, J.T., Bibby, J.M.: Multivariate Analysis. Academic Press, London (1979)
  5. Tenenbaum, J.B., et al.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290, 2319–2323 (2000)
  6. Roweis, S.T., et al.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science 290, 2323–2326 (2000)
  7. De Silva, V., Tenenbaum, J.: Global Versus Local Methods in Nonlinear Dimensionality Reduction. In: NIPS (2002)
  8. Hou, Y., et al.: Robust Nonlinear Dimension Reduction: A Self-organized Approach. In: FSKD 2005 (2005)
  9. Wang, J., Zhang, Z., Zha, H.: Adaptive Manifold Learning. In: NIPS 2004 (2004)
  10. Belkin, M., Niyogi, P.: Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation 15(6), 1373–1396 (2003)
  11. Hou, Y., et al.: Adaptive Manifold Learning Based on Statistical Criterions. Technical report, Tianjin University

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yuexian Hou (1)
  • Hongmin Yang (1)
  • Pilian He (1)

  1. Department of Computer Science and Technology, Tianjin University, Tianjin, China
