Multiclass Semi-supervised Learning on Graphs Using Ginzburg-Landau Functional Minimization

  • Cristina Garcia-Cardona
  • Arjuna Flenner
  • Allon G. Percus
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 318)


We present a graph-based variational algorithm for classification of high-dimensional data, generalizing the binary diffuse interface model to the case of multiple classes. Motivated by total variation techniques, the method involves minimizing an energy functional made up of three terms. The first two terms promote a stepwise continuous classification function with sharp transitions between classes, while preserving symmetry among the class labels. The third term is a data fidelity term, allowing us to incorporate prior information into the model in a semi-supervised framework. The performance of the algorithm on synthetic data, as well as on the COIL and MNIST benchmark datasets, is competitive with state-of-the-art graph-based multiclass segmentation methods.
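The abstract describes the three terms of the energy only in words. As a sketch of the kind of functional meant (notation assumed here, following standard graph Ginzburg-Landau formulations rather than reproduced from this paper), the energy over a vector-valued classification function \(u\), with row \(u_i \in \mathbb{R}^K\) for node \(i\), can take the form:

```latex
E(u) \;=\; \underbrace{\frac{\varepsilon}{2}\,\langle u,\, \mathbf{L}_s u \rangle}_{\text{smoothness / interface}}
\;+\; \underbrace{\frac{1}{2\varepsilon}\sum_{i} \prod_{k=1}^{K} \tfrac{1}{4}\,\lVert u_i - e_k \rVert_{1}^{2}}_{\text{multi-well potential}}
\;+\; \underbrace{\sum_{i} \frac{\mu_i}{2}\,\lVert u_i - \hat{u}_i \rVert_{2}^{2}}_{\text{data fidelity}}
```

Here \(\mathbf{L}_s\) is a (symmetric normalized) graph Laplacian, so the first term penalizes label disagreement across edges; the second term is minimized exactly at the simplex corners \(e_k\), which drives \(u_i\) toward a pure class label while treating all classes symmetrically; and \(\mu_i > 0\) only on labeled nodes, pulling the solution toward the given labels \(\hat{u}_i\) in the semi-supervised setting.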


Keywords: Diffuse interfaces · Learning on graphs · Semi-supervised methods



This research has been supported by the Air Force Office of Scientific Research MURI grant FA9550-10-1-0569 and by ONR grant N0001411AF00002.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Cristina Garcia-Cardona (1)
  • Arjuna Flenner (2)
  • Allon G. Percus (1)
  1. Institute of Mathematical Sciences, Claremont Graduate University, Claremont, USA
  2. Physics and Computational Sciences, Naval Air Warfare Center, China Lake, USA
