Clustering-Based Nonlinear Dimensionality Reduction on Manifold

  • Guihua Wen
  • Lijun Jiang
  • Jun Wen
  • Nigel R. Shadbolt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4099)

Abstract

This paper proposes a clustering-based nonlinear dimensionality reduction approach. It uses a clustering method to form a cluster structure, by which the distance between any two data points is rescaled so that points from different clusters are more easily separated. The rescaled distance matrix is then supplied to a nonlinear dimensionality reduction method such as Isomap to achieve better performance. The proposed approach also decreases time complexity on large data sets, since the induced neighborhood structure speeds up the subsequent dimensionality reduction. Unlike supervised approaches, it does not require a labelled data set, so it is unsupervised and therefore applicable to broader domains. Classification experiments on benchmark data sets validate the proposed approach.
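The core idea of the abstract can be sketched in a few lines. The paper's exact rescaling rule is not reproduced here; this sketch assumes a simple multiplicative penalty (factor `alpha`, a hypothetical parameter) applied to distances between points in different clusters, leaving within-cluster distances unchanged:

```python
import numpy as np

def rescale_distances(D, labels, alpha=2.0):
    """Enlarge distances between points that fall in different clusters.

    D      : (n, n) symmetric pairwise-distance matrix
    labels : (n,) cluster label for each point (e.g. from k-means)
    alpha  : assumed inter-cluster scaling factor (> 1); not from the paper
    """
    same_cluster = labels[:, None] == labels[None, :]
    # Keep within-cluster distances; inflate cross-cluster ones.
    return np.where(same_cluster, D, alpha * D)

# Toy example: two well-separated one-dimensional clusters.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
D = np.abs(x[:, None] - x[None, :])          # pairwise distances
labels = np.array([0, 0, 0, 1, 1, 1])        # assumed clustering result
D_rescaled = rescale_distances(D, labels)
```

The rescaled matrix could then be handed to a precomputed-distance embedding, e.g. `sklearn.manifold.Isomap(metric="precomputed")`, which is the role Isomap plays in the paper's pipeline.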

Keywords

Geodesic Distance, Locally Linear Embedding, Neighborhood Graph, Linear Embedding, Supervised Approach



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Guihua Wen (1, 3)
  • Lijun Jiang (1)
  • Jun Wen (2)
  • Nigel R. Shadbolt (3)
  1. South China University of Technology, Guangzhou, China
  2. Hubei Institute for Nationalities, Enshi, Hubei, China
  3. University of Southampton, Southampton, United Kingdom