Diverse M-Best Solutions in Markov Random Fields

  • Dhruv Batra
  • Payman Yadollahpour
  • Abner Guzman-Rivera
  • Gregory Shakhnarovich
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7576)


Much effort has been directed at algorithms for obtaining the highest probability (MAP) configuration in probabilistic (random field) models. In many situations, one could benefit from additional high-probability solutions. Current methods for computing the M most probable configurations produce solutions that tend to be very similar to the MAP solution and to each other. This is often an undesirable property. In this paper we propose an algorithm for the Diverse M-Best problem, which involves finding a diverse set of highly probable solutions under a discrete probabilistic model. Given a dissimilarity function measuring how different two solutions are, our formulation involves maximizing a linear combination of the probability and the dissimilarity to previous solutions. Our formulation generalizes the M-Best MAP problem, and we show that for certain families of dissimilarity functions these solutions can be found as easily as the MAP solution.
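The formulation above admits a simple greedy procedure: solve for the MAP, then repeatedly re-solve a diversity-augmented problem in which the energy is offset by λ times the dissimilarity to each previous solution. With a Hamming dissimilarity, the offset decomposes into unary terms, so each round is just another MAP call on modified unaries. The sketch below illustrates this on a small chain MRF; the brute-force solver, the Potts pairwise term, and the value of λ are illustrative assumptions, not the paper's graph-cut implementation.

```python
import itertools

def map_solution(unary, pairwise):
    """Brute-force MAP (minimum-energy) labeling of a small chain MRF.

    unary    -- list of per-node label-cost lists
    pairwise -- function giving the cost of two neighboring labels
    """
    n, L = len(unary), len(unary[0])
    best, best_e = None, float("inf")
    for labels in itertools.product(range(L), repeat=n):
        e = sum(unary[i][labels[i]] for i in range(n))
        e += sum(pairwise(labels[i], labels[i + 1]) for i in range(n - 1))
        if e < best_e:
            best, best_e = list(labels), e
    return best, best_e

def div_m_best(unary, pairwise, M, lam):
    """Greedy Diverse M-Best with Hamming dissimilarity.

    Maximizing probability + lam * Hamming distance to a previous
    solution is (up to a constant) the same as adding a unary penalty
    of +lam wherever a node would repeat its previous label.
    """
    solutions = []
    aug = [row[:] for row in unary]  # diversity-augmented unaries
    for _ in range(M):
        sol, _ = map_solution(aug, pairwise)
        solutions.append(sol)
        # Penalize reusing the label this solution chose at each node.
        for i, lab in enumerate(sol):
            aug[i][lab] += lam
    return solutions
```

Because the Hamming term stays unary, the augmented problem is no harder than the original MAP problem, which is the key property the paper exploits for stronger solvers such as graph cuts.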





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Dhruv Batra (1)
  • Payman Yadollahpour (1)
  • Abner Guzman-Rivera (2)
  • Gregory Shakhnarovich (1)
  1. TTI-Chicago, USA
  2. UIUC, USA
