Topology Learning Embedding: A Fast and Incremental Method for Manifold Learning

  • Tao Zhu
  • Furao Shen
  • Jinxi Zhao
  • Yu Liang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10634)


In this paper, we propose a novel manifold learning method named topology learning embedding (TLE). The central problem of manifold learning is discovering the structure of data. Instead of exhaustively computing the relations between every pair of available samples, TLE learns the data’s internal structure model more efficiently: it constructs a topology-preserving network rapidly and incrementally from online input data, and then, with an Isomap-based embedding strategy, embeds out-of-sample data efficiently. Experiments on synthetic data and real-world handwritten digit data demonstrate that TLE is a promising method for dimensionality reduction.
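The two stages described above can be illustrated with a minimal sketch. This is not the authors' TLE algorithm (the paper's node-insertion and edge-aging rules are not reproduced here); it only shows the general idea under simplified assumptions: a topology-preserving network built with the competitive Hebbian rule (connect the two prototypes nearest to each incoming sample), followed by an Isomap-style embedding of the network nodes (graph geodesics plus classical MDS). All function names are illustrative, not from the paper.

```python
import numpy as np

def learn_topology(data, n_nodes=20, seed=0):
    """Grow a small topology-preserving network: prototypes are a random
    subset of the data, and the competitive Hebbian rule links the two
    prototypes nearest to each incoming sample (cf. Martinetz & Schulten)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(data), size=n_nodes, replace=False)
    nodes = data[idx].copy()
    edges = set()
    for x in data:                         # online pass over the input
        d = np.linalg.norm(nodes - x, axis=1)
        a, b = np.argsort(d)[:2]           # two nearest prototypes
        edges.add((min(a, b), max(a, b)))  # connect them
    return nodes, edges

def isomap_embed(nodes, edges, dim=2):
    """Isomap-style embedding of the network nodes: geodesic distances
    over the graph (Floyd-Warshall), then classical MDS."""
    n = len(nodes)
    G = np.full((n, n), np.inf)
    np.fill_diagonal(G, 0.0)
    for a, b in edges:
        w = np.linalg.norm(nodes[a] - nodes[b])
        G[a, b] = G[b, a] = w
    for k in range(n):                     # all-pairs shortest paths
        G = np.minimum(G, G[:, [k]] + G[[k], :])
    finite = np.isfinite(G)
    if not finite.all():                   # crude fallback for a
        G[~finite] = G[finite].max()       # disconnected graph
    D2 = G ** 2
    J = np.eye(n) - np.ones((n, n)) / n    # double centering (MDS)
    B = -0.5 * J @ D2 @ J
    w_, v = np.linalg.eigh(B)
    order = np.argsort(w_)[::-1][:dim]
    return v[:, order] * np.sqrt(np.maximum(w_[order], 0.0))

# Usage: embed the nodes learned from samples of a 1-D curve in 2-D.
t = np.linspace(0, 3, 300)
data = np.column_stack([t, np.sin(t)])
nodes, edges = learn_topology(data, n_nodes=15)
Y = isomap_embed(nodes, edges, dim=2)
```

Because only the network nodes (not all samples) enter the shortest-path and eigendecomposition steps, the expensive part of the embedding scales with the network size rather than the dataset size, which is the efficiency argument the abstract makes.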


Keywords: Dimensionality reduction · Manifold learning · Incremental learning · Neural network · SOINN



This work is supported in part by the National Science Foundation of China under Grants 61373130, 61375064, and 61373001, and by Jiangsu NSF Grant BK20141319.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. National Key Laboratory for Novel Software Technology, Department of Computer Science and Technology, Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing University, Nanjing, People’s Republic of China
