Efficient Continuous Relaxations for Dense CRF

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9906)

Abstract

Dense conditional random fields (CRFs) with Gaussian pairwise potentials have emerged as a popular framework for several computer vision applications such as stereo correspondence and semantic segmentation. By modelling long-range interactions, dense CRFs provide a more detailed labelling compared to their sparse counterparts. Variational inference in these dense models is performed using a filtering-based mean-field algorithm in order to obtain a fully-factorised distribution minimising the Kullback-Leibler divergence to the true distribution. In contrast to the continuous relaxation-based energy minimisation algorithms used for sparse CRFs, the mean-field algorithm fails to provide strong theoretical guarantees on the quality of its solutions. To address this deficiency, we show that it is possible to use the same filtering approach to speed up the optimisation of several continuous relaxations. Specifically, we solve a convex quadratic programming (QP) relaxation using the efficient Frank-Wolfe algorithm. This also allows us to solve difference-of-convex relaxations via the iterative concave-convex procedure, where each iteration requires solving a convex QP. Finally, we develop a novel divide-and-conquer method to compute the subgradients of a linear programming relaxation that provides the best theoretical bounds for energy minimisation. We demonstrate the advantage of continuous relaxations over the widely used mean-field algorithm on publicly available datasets.
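The paper's contribution is a filtering-based solver for the QP relaxation, whose outer loop is the classical Frank-Wolfe (conditional gradient) algorithm: linearise the objective at the current iterate, minimise the linear approximation over the feasible set (here, a probability simplex, where the minimiser is a vertex), and take a convex combination. The sketch below illustrates only this generic loop on a toy simplex-constrained QP; the function name, arguments, and toy problem are illustrative and do not reproduce the paper's filtering-accelerated gradient computation.

```python
import numpy as np

def frank_wolfe_qp(A, b, n_iter=100):
    """Minimise f(x) = 0.5 x^T A x + b^T x over the probability simplex
    using the Frank-Wolfe (conditional gradient) algorithm.
    A is assumed symmetric positive semidefinite, so the QP is convex."""
    d = len(b)
    x = np.full(d, 1.0 / d)           # feasible start: uniform distribution
    for k in range(n_iter):
        grad = A @ x + b              # gradient of the quadratic objective
        s = np.zeros(d)               # linear minimiser over the simplex is a
        s[np.argmin(grad)] = 1.0      # vertex: one-hot at the smallest gradient entry
        gamma = 2.0 / (k + 2.0)       # standard diminishing step size
        x = (1.0 - gamma) * x + gamma * s
    return x
```

Each iterate stays a convex combination of simplex vertices, so feasibility holds by construction with no projection step; in the paper this matters because the gradient, the expensive part for a dense CRF, can be evaluated with the same permutohedral-lattice filtering used by mean-field.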

Keywords

Energy minimisation · Dense CRF · Inference · Linear programming · Quadratic programming

Supplementary material

419974_1_En_50_MOESM1_ESM.pdf — Supplementary material 1 (PDF 2.3 MB)


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Department of Engineering Science, University of Oxford, Oxford, UK
  2. Microsoft Research, Redmond, USA
