A Convex Discrete-Continuous Approach for Markov Random Fields

  • Christopher Zach
  • Pushmeet Kohli
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7577)


We propose an extension of the well-known LP relaxation for Markov random fields to explicitly allow continuous label spaces. Unlike conventional continuous formulations of labelling problems which assume that the unary and pairwise potentials are convex, our formulation allows them to be general piecewise convex functions with continuous domains. Furthermore, we present the extension of the widely used efficient scheme for handling L1 smoothness priors over discrete ordered label sets to continuous label spaces. We provide a theoretical analysis of the proposed model, and empirically demonstrate that labelling problems with huge or continuous label spaces can benefit from our discrete-continuous representation.
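As context for the abstract, the standard local-polytope LP relaxation over a *discrete* label set, which this paper extends to continuous labels, can be sketched for a two-node pairwise MRF. The sketch below is illustrative only (cost values, variable names, and the use of `scipy.optimize.linprog` are our assumptions, not the authors' method); it uses an absolute-difference pairwise cost as a discrete stand-in for an L1 smoothness prior. On tree-structured graphs such as this two-node chain, the relaxation is tight, so the LP optimum equals the exact MAP energy.

```python
import numpy as np
from scipy.optimize import linprog

K = 3                                  # number of labels
theta_u = np.array([[0.0, 1.0, 2.0],   # unary costs theta_1(x)
                    [2.0, 0.5, 1.0]])  # unary costs theta_2(x)
# Pairwise costs theta_12(x, y) = |x - y|: a discrete L1 smoothness prior
theta_p = np.abs(np.subtract.outer(np.arange(K), np.arange(K))).astype(float)

# LP variables: pseudo-marginals mu_1 (K), mu_2 (K), mu_12 (K*K), stacked.
n = 2 * K + K * K
c = np.concatenate([theta_u[0], theta_u[1], theta_p.ravel()])

A_eq, b_eq = [], []
# Normalization: sum_x mu_1(x) = 1 and sum_y mu_2(y) = 1
for off in (0, K):
    row = np.zeros(n); row[off:off + K] = 1.0
    A_eq.append(row); b_eq.append(1.0)
# Marginalization: sum_y mu_12(x, y) = mu_1(x)
for x in range(K):
    row = np.zeros(n); row[x] = -1.0
    row[2 * K + x * K: 2 * K + (x + 1) * K] = 1.0
    A_eq.append(row); b_eq.append(0.0)
# Marginalization: sum_x mu_12(x, y) = mu_2(y)
for y in range(K):
    row = np.zeros(n); row[K + y] = -1.0
    row[2 * K + y: 2 * K + K * K: K] = 1.0
    A_eq.append(row); b_eq.append(0.0)

res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0.0, 1.0))
lp_energy = res.fun

# Exact MAP energy by brute force, for comparison on this tiny instance.
exact = min(theta_u[0][x] + theta_u[1][y] + theta_p[x, y]
            for x in range(K) for y in range(K))
```

The paper's contribution, per the abstract, is to replace the finite label index `x` with a continuous domain while keeping the potentials only piecewise convex, which this purely discrete sketch does not capture.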




References

  1. Boykov, Y., Jolly, M.P.: Interactive graph cuts for optimal boundary & region segmentation of objects in N-D images. In: Proc. ICCV, pp. 105–112 (2001)
  2. Beale, E.M.L., Forrest, J.J.H.: Global optimization using special ordered sets. Mathematical Programming 10(1), 52–69 (1976)
  3. Wainwright, M.J., Jordan, M.I.: Graphical models, exponential families, and variational inference. Found. Trends Mach. Learn. 1, 1–305 (2008)
  4. Werner, T.: A linear programming approach to max-sum problem: A review. IEEE Trans. Pattern Anal. Mach. Intell. 29(7) (2007)
  5. Ishikawa, H., Geiger, D.: Occlusions, discontinuities, and epipolar lines in stereo. In: Burkhardt, H.-J., Neumann, B. (eds.) ECCV 1998. LNCS, vol. 1406, pp. 232–248. Springer, Heidelberg (1998)
  6. Boykov, Y., Veksler, O., Zabih, R.: Markov random fields with efficient approximations. In: Proc. CVPR, pp. 648–655 (1998)
  7. Ishikawa, H.: Exact optimization for Markov random fields with convex priors. IEEE Trans. Pattern Anal. Mach. Intell. 25(10), 1333–1336 (2003)
  8. Bhusnurmath, A., Taylor, C.J.: Stereo matching using interior point methods. In: Proc. 3DPVT (2008)
  9. Lucas, B., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Int. Joint Conf. on Artificial Intell., pp. 674–679 (1981)
  10. Horn, B.K.P., Schunck, B.G.: Determining optical flow. Artificial Intelligence 17, 185–203 (1981)
  11. Rother, C., Kohli, P., Feng, W., Jia, J.: Minimizing sparse higher order energy functions of discrete variables. In: Proc. CVPR (2009)
  12. Kohli, P., Kumar, M.P.: Energy minimization for linear envelope MRFs. In: Proc. CVPR (2010)
  13. Komodakis, N., Paragios, N.: Beyond pairwise energies: Efficient optimization for higher-order MRFs. In: Proc. CVPR (2009)
  14. Alberti, G., Bouchitté, G., Dal Maso, G.: The calibration method for the Mumford-Shah functional and free-discontinuity problems. Calc. Var. Partial Differential Equations 16(3), 299–333 (2003)
  15. Chambolle, A., Cremers, D., Pock, T.: A convex approach for computing minimal partitions. Technical report, Ecole Polytechnique (2008)
  16. Pock, T., Cremers, D., Bischof, H., Chambolle, A.: An algorithm for minimizing the piecewise smooth Mumford-Shah functional. In: Proc. ICCV (2009)
  17. Strekalovskiy, E., Goldluecke, B., Cremers, D.: Tight convex relaxations for vector-valued labeling problems. In: Proc. ICCV (2011)
  18. Peng, J., Hazan, T., McAllester, D., Urtasun, R.: Convex max-product algorithms for continuous MRFs with applications to protein folding. In: Proc. ICML (2011)
  19. Glocker, B., Paragios, N., Komodakis, N., Tziritas, G., Navab, N.: Optical flow estimation with uncertainties through dynamic MRFs. In: Proc. CVPR (2008)
  20. Kolmogorov, V.: Convergent tree-reweighted message passing for energy minimization. IEEE Trans. Pattern Anal. Mach. Intell. 28(10), 1568–1583 (2006)
  21. Globerson, A., Jaakkola, T.: Fixing max-product: Convergent message passing algorithms for MAP LP-relaxations. In: NIPS (2007)
  22. Weiss, Y., Yanover, C., Meltzer, T.: MAP estimation, linear programming and belief propagation with convex free energies. In: UAI (2007)
  23. Rudin, L.I., Osher, S., Fatemi, E.: Nonlinear total variation based noise removal algorithms. Physica D 60, 259–268 (1992)
  24. Chambolle, A.: An algorithm for total variation minimization and applications. J. Math. Imag. Vision 20(1-2), 89–97 (2004)
  25. Zach, C., Häne, C., Pollefeys, M.: What is optimized in convex relaxations for multi-label problems: Connecting discrete and continuously-inspired MAP inference. Technical report, Microsoft Research (2012)
  26. Combettes, P.L., Pesquet, J.C.: Proximal splitting methods in signal processing. In: Fixed-Point Algorithms for Inverse Problems in Science and Engineering, pp. 185–212. Springer (2011)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Christopher Zach, Microsoft Research, Cambridge, UK
  • Pushmeet Kohli, Microsoft Research, Cambridge, UK
