Combining Smooth Graphs with Semi-supervised Classification

  • Xueyuan Zhou
  • Chunping Li
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3918)


In semi-supervised classification, many methods use a graph representation of the data. Based on the graph, different frameworks, e.g. random walk models, spectral clustering, Markov chains, and regularization theory, are employed to design classification algorithms. However, all these methods work with graphs constructed directly from the data, e.g. kNN graphs. In reality, the data are only noisy observations of hidden variables, so classification results obtained directly from the observations may be biased by noise. Filtering the noise before applying any classification method can therefore yield better results. We propose a novel method to filter the noise in high-dimensional data by smoothing the graph. We analyze the method from the perspectives of spectral theory, Markov chains, and regularization, and show that it attenuates the high-frequency components of the graph while also admitting an interpretation within the regularization framework. A graph-volume-based parameter learning method can be efficiently applied to classification. Experiments on artificial and real-world data sets indicate that our method achieves superior classification accuracy.
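The core idea sketched in the abstract can be illustrated with a minimal example. The sketch below is a hypothetical reconstruction, not the authors' exact algorithm: it builds a kNN graph with Gaussian edge weights, forms the random-walk transition matrix P = D⁻¹W, and smooths the graph by taking powers of P. Since the eigenvalues of P lie in [-1, 1], raising P to a power t damps eigenvalues far from 1, which acts as a low-pass filter suppressing high-frequency components of the graph. The parameters `k`, `sigma`, and `t` are illustrative assumptions.

```python
import numpy as np

def smooth_graph(X, k=5, sigma=1.0, t=3):
    """Hypothetical sketch of graph smoothing for semi-supervised
    classification: kNN graph + powers of the random-walk matrix.

    X     : (n, d) data matrix (noisy observations)
    k     : number of nearest neighbours
    sigma : bandwidth of the Gaussian edge weights
    t     : smoothing strength (number of random-walk steps)
    """
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Gaussian weights, no self-loops
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # keep only the k nearest neighbours of each point, symmetrise
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]
    mask = np.zeros((n, n), dtype=bool)
    mask[np.arange(n)[:, None], idx] = True
    W = np.where(mask | mask.T, W, 0.0)
    # random-walk transition matrix P = D^{-1} W
    P = W / W.sum(axis=1, keepdims=True)
    # P^t acts as a low-pass filter: high-frequency (small-eigenvalue)
    # components of the graph are damped geometrically
    return np.linalg.matrix_power(P, t)
```

The smoothed matrix can then be fed to any graph-based classifier (e.g. label propagation) in place of the raw kNN graph.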


Keywords: Markov Chain · Random Walk · Transition Matrix · Spectral Clustering · High-Frequency Component



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Xueyuan Zhou¹
  • Chunping Li¹
  1. School of Software, Tsinghua University, Beijing, P.R. China