Incremental Manifold Learning Via Tangent Space Alignment

  • Xiaoming Liu
  • Jianwei Yin
  • Zhilin Feng
  • Jinxiang Dong
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4087)


Several algorithms have been proposed to analyze the structure of high-dimensional data based on the notion of manifold learning. They have been used to extract the intrinsic characteristics of different types of high-dimensional data by performing nonlinear dimensionality reduction. Most of them operate in “batch” mode and cannot be applied efficiently when data are collected sequentially. In this paper, we propose an incremental version (ILTSA) of LTSA (Local Tangent Space Alignment), one of the key manifold learning algorithms. In addition, a landmark version of LTSA (LLTSA) is proposed, in which landmarks are selected by LASSO regression, which is well known to favor sparse approximations because it regularizes with the l1 norm. Furthermore, an incremental version (ILLTSA) of LLTSA is also proposed. Experimental results on synthetic data and real-world data sets demonstrate the effectiveness of our algorithms.
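The first step of LTSA approximates the manifold near each sample by a low-dimensional tangent space estimated from its nearest neighbours via local PCA. A minimal sketch of that step, assuming a plain local-PCA formulation (the function name `local_tangent_space` and all parameters are ours for illustration, not the paper's notation):

```python
import numpy as np

def local_tangent_space(X, i, k, d):
    """Estimate a d-dimensional tangent space at point i from its
    k nearest neighbours via local PCA (the first step of LTSA)."""
    dists = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(dists)[:k]          # indices of the k nearest points
    centred = X[nbrs] - X[nbrs].mean(axis=0)  # centre the neighbourhood
    # Right singular vectors = principal directions of the local patch
    _, _, Vt = np.linalg.svd(centred, full_matrices=False)
    return Vt[:d]                          # orthonormal basis of the tangent space

# Points on a 1-D curve embedded in 3-D: the estimated tangent space
# should align with the curve's analytic tangent direction.
t = np.linspace(0.0, 1.0, 100)
X = np.stack([t, t**2, t**3], axis=1)
T = local_tangent_space(X, i=50, k=10, d=1)
print(T.shape)  # (1, 3)
```

LTSA proper then aligns these local coordinate patches into a single global embedding via an eigenproblem; the sketch above covers only the local fitting stage.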


Keywords: manifold learning · LTSA · incremental learning · LASSO
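The landmark-selection idea in the abstract rests on the sparsity of LASSO: regression with an l1 penalty drives many coefficients exactly to zero, so the candidate points that keep nonzero weight can serve as landmarks. A minimal illustration of this sparsity effect, assuming scikit-learn's `Lasso` (the synthetic data and `alpha` value are ours; the paper's exact selection procedure is not reproduced here):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d = 50, 200  # fewer samples than candidate points
X = rng.standard_normal((n, d))
y = X[:, :5] @ np.array([3.0, -2.0, 1.5, 1.0, -1.0])  # only 5 columns matter

# l1-regularized regression: most coefficients are driven exactly to zero
lasso = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(lasso.coef_)  # nonzero-weight candidates -> "landmarks"
print(len(selected), "of", d, "candidates selected")
```

An l2 (ridge) penalty would shrink all 200 coefficients without zeroing any of them; the l1 geometry is what makes the selection sparse.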



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Xiaoming Liu¹
  • Jianwei Yin¹
  • Zhilin Feng¹,²
  • Jinxiang Dong¹
  1. Department of Computer Science and Technology, Zhejiang University, China
  2. Zhijiang College, Zhejiang University of Technology, Hangzhou, China