
Self-organizing Isometric Embedding Based on Statistical Criterions

  • Ruiguo Yu
  • Yuexian Hou
  • Pilian He
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4223)

Abstract

Popular nonlinear dimensionality reduction algorithms, e.g., LLE, Isomap, and SIE, share a common difficulty: a neighborhood parameter must be configured in advance to obtain meaningful embedding results. Simulation shows that embeddings often lose relevance under improper parameter configurations. However, current embedding-residual criteria for neighborhood-parameter selection are not themselves independent of the neighborhood parameter, so they cannot work universally. To improve the usability of nonlinear dimensionality reduction algorithms in self-adaptive machine learning, it is necessary to find criteria that transcend any particular embedding and thereby achieve unsupervised parameter selection. This paper begins with a discussion of optimal embedding principles and proposes a statistic, based on spatial mutual information and a normalized dependency index spectrum, to determine a reasonable parameter configuration. Simulations effectively support our proposal.
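To make the parameter-selection problem concrete, the sketch below scans candidate neighborhood sizes k for Isomap and scores each embedding by the mutual information between input-space geodesic distances and embedded Euclidean distances. This is only a minimal illustration: the statistic actually proposed in the paper (spatial mutual information combined with a normalized dependency index spectrum) is defined in the full text, and the scoring proxy, candidate grid, and helper names used here (score_embedding, select_k) are assumptions of this sketch, not the authors' method.

    # Hedged sketch: unsupervised selection of Isomap's neighborhood size k.
    # The mutual-information score below is a stand-in proxy, NOT the paper's
    # spatial-mutual-information statistic; helper names are illustrative.
    import numpy as np
    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import Isomap
    from sklearn.feature_selection import mutual_info_regression
    from scipy.spatial.distance import pdist

    def score_embedding(X, k, n_components=2):
        """Embed X with Isomap(k) and measure how much information the
        embedded distances carry about the input-space geodesic distances."""
        iso = Isomap(n_neighbors=k, n_components=n_components)
        Y = iso.fit_transform(X)
        # Condensed upper triangle of the geodesic distance matrix; pdist
        # returns embedded distances in the same pair ordering.
        geo = iso.dist_matrix_[np.triu_indices_from(iso.dist_matrix_, k=1)]
        emb = pdist(Y)
        # mutual_info_regression expects a 2-D feature array.
        return mutual_info_regression(emb.reshape(-1, 1), geo)[0]

    def select_k(X, candidates):
        """Return the candidate k whose embedding best preserves geodesic
        structure under the proxy score, plus all scores for inspection."""
        scores = {k: score_embedding(X, k) for k in candidates}
        return max(scores, key=scores.get), scores

    if __name__ == "__main__":
        X, _ = make_swiss_roll(n_samples=400, random_state=0)
        best_k, scores = select_k(X, candidates=[6, 10, 15, 25, 40])
        print("scores:", scores, "-> selected k =", best_k)

The key point the sketch shares with the paper is that the selection criterion is computed from the data and the embedding alone, with no supervision and no a priori knowledge of the correct neighborhood size.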

Keywords

Mutual Information, Statistical Criterion, Geodesic Distance, Kolmogorov Complexity, Simple Principle


References

  1. Barlow, H.: Unsupervised learning. Neural Computation 1, 295–311 (1989)
  2. Marcus, G.: Programs of the Mind. Science 304, 1450–1451 (2004)
  3. Baum, E.: What Is Thought? MIT Press, Cambridge, MA (2004)
  4. Mardia, K.V., Kent, J.T., Bibby, J.M.: Multivariate Analysis. Academic Press, London (1979)
  5. Tenenbaum, J.B., et al.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290, 2319–2323 (2000)
  6. Roweis, S.T., et al.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science 290, 2323–2326 (2000)
  7. de Silva, V., Tenenbaum, J.: Global versus local methods in nonlinear dimensionality reduction. In: NIPS 2002 (2002)
  8. Hou, Y., et al.: Robust Nonlinear Dimension Reduction: A Self-organizing Approach. In: FSKD 2005 (2005)
  9. Li, M., Vitanyi, P.M.B.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, New York (1997)
  10. Ash, R.: Information Theory. John Wiley and Sons, Indianapolis (1965)
  11. Tsallis, C.: Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 52(1–2), 479–487 (1988)
  12. Belkin, M., Niyogi, P.: Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering. In: NIPS 2001 (2001)
  13. Hou, Y., He, P.: Identification of Neural Network Predictor by Means of Prediction Complexity. Information and Control 30(1) (2001)
  14. Balasubramanian, M., Schwartz, E.L.: The Isomap Algorithm and Topological Stability. Science 295, 7a (2002)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ruiguo Yu (1)
  • Yuexian Hou (1)
  • Pilian He (1)

  1. School of Computer Science and Technology, Tianjin University, Tianjin, China
