Probabilistic Relaxation Labeling by Fokker-Planck Diffusion on a Graph

  • Hong-Fang Wang
  • Edwin R. Hancock
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4538)

Abstract

In this paper we develop a new formulation of probabilistic relaxation labeling for the task of data classification using the theory of diffusion processes on graphs. The state space of our process consists of the nodes of a support graph, which represent potential object-label assignments. The edge weights of the support graph encode data-proximity and label-consistency information. The state vector of the diffusion process represents the object-label probabilities and evolves with time according to the Fokker-Planck equation. We show how the solution state vector can be estimated using the spectrum of the Laplacian matrix of the weighted support graph. Experiments on various data clustering tasks show the effectiveness of our new algorithm.
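The spectral estimation step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a toy 3-node support graph with hypothetical edge weights `W`, uses the combinatorial Laplacian L = D - W, and evolves the state-probability vector as p(t) = exp(-tL) p(0) via the eigendecomposition of L.

```python
import numpy as np

# Hypothetical weighted adjacency matrix of a 3-node support graph;
# in the paper, weights encode data-proximity and label consistency.
W = np.array([[0.0, 0.8, 0.1],
              [0.8, 0.0, 0.5],
              [0.1, 0.5, 0.0]])

D = np.diag(W.sum(axis=1))   # degree matrix
L = D - W                    # combinatorial graph Laplacian

# Spectrum of the symmetric Laplacian
eigvals, eigvecs = np.linalg.eigh(L)

def diffuse(p0, t):
    """Evolve the state vector: p(t) = exp(-t L) p(0), computed in the
    eigenbasis of L rather than by forming the matrix exponential."""
    return eigvecs @ (np.exp(-t * eigvals) * (eigvecs.T @ p0))

p0 = np.array([1.0, 0.0, 0.0])   # initial object-label probabilities
pt = diffuse(p0, t=1.0)
```

Because the rows of L sum to zero, exp(-tL) preserves the total probability mass, so `pt` remains a valid probability vector as it diffuses toward the graph's stationary distribution.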

Keywords

Initial Error · Laplacian Matrix · Support Graph · Weighted Adjacency Matrix · State Probability Vector
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Hong-Fang Wang (1)
  • Edwin R. Hancock (1)
  1. Computer Science Department, University of York, Heslington, York YO10 5DD, UK