
Combining Smooth Graphs with Semi-supervised Learning

  • Liang Liu
  • Weijun Chen
  • Jianmin Wang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4505)

Abstract

The key assumptions underlying semi-supervised learning are label smoothness and the cluster assumption. In graph-based semi-supervised learning, the graph representation of the data is critical: different graph constructions can strongly affect the classification results. We present a novel method to produce a graph, called the smooth Markov random walk graph, which takes both assumptions into account. The new graph is obtained by modifying the eigenspectrum of the transition matrix of the Markov random walk graph, and it is sufficiently smooth with respect to the intrinsic structure of the labeled and unlabeled points. We believe the smoother graph benefits semi-supervised learning. Experiments on artificial and real-world datasets indicate that our method provides superior classification accuracy over several state-of-the-art methods.
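For intuition only, the following is a minimal sketch of how the eigenspectrum of a random-walk transition matrix can be smoothed; it is not the authors' exact construction. It builds Gaussian affinities, forms the symmetrically normalised matrix that shares eigenvalues with the transition matrix P = D^-1 W, and damps the spectrum with the illustrative rule lambda -> lambda^t (equivalent to a t-step walk). The kernel width `sigma`, the step count `t`, and the function name `smooth_transition_matrix` are assumptions made for this sketch.

```python
import numpy as np

def smooth_transition_matrix(X, sigma=1.0, t=5):
    """Sketch: Gaussian-affinity random-walk graph with a damped eigenspectrum.

    The damping rule lam -> lam**t is an illustrative choice (a t-step walk);
    the paper's actual spectral transform may differ.
    """
    # Pairwise squared distances and Gaussian affinity matrix W (zero diagonal).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalisation S = D^{-1/2} W D^{-1/2}; S has the same
    # eigenvalues as the transition matrix P = D^{-1} W but is symmetric,
    # so np.linalg.eigh applies.
    deg = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

    # Modify the eigenspectrum: lam**t suppresses the high-frequency
    # eigencomponents, which makes the graph smoother w.r.t. the data manifold.
    lam, U = np.linalg.eigh(S)
    S_smooth = (U * lam ** t) @ U.T

    # Map back to the random-walk normalisation:
    # P_smooth = D^{-1/2} S_smooth D^{1/2}; for integer t this equals P^t,
    # so its rows still sum to one.
    P_smooth = d_inv_sqrt[:, None] * S_smooth / d_inv_sqrt[None, :]
    return P_smooth

# Hypothetical usage on toy two-cluster data.
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 4.0])
P = smooth_transition_matrix(X, sigma=0.8, t=5)
print(P.shape, np.allclose(P.sum(axis=1), 1.0))
```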



Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Liang Liu (1)
  • Weijun Chen (1)
  • Jianmin Wang (1)
  1. School of Software, Tsinghua University, Beijing 100084, P.R. China
