A Geometric Approach to Image Labeling

  • Freddie Åström
  • Stefania Petra
  • Bernhard Schmitzer
  • Christoph Schnörr
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9909)

Abstract

We introduce a smooth non-convex approach in a novel geometric framework that complements established convex and non-convex approaches to image labeling. The major underlying concept is a smooth manifold of probabilistic assignments of a prespecified set of prior data (the "labels") to given image data. The Riemannian gradient flow with respect to a corresponding objective function evolves on this manifold and terminates, for any \(\delta > 0\), within a \(\delta \)-neighborhood of a unique assignment (labeling). As a consequence, unlike with convex outer relaxation approaches to (non-submodular) image labeling problems, no post-processing step is needed to round fractional solutions. Our approach is numerically implemented with sparse, highly parallel interior-point updates that converge efficiently, largely independently of the number of labels. Experiments with noisy labeling and inpainting problems demonstrate competitive performance.
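The core mechanism can be illustrated on a single probability simplex. The following Python sketch is not the authors' algorithm; the linear objective, the step size, and the distance values are illustrative assumptions. It discretizes a Riemannian gradient flow under the Fisher-Rao metric, whose gradient takes the replicator form, and shows how an iterate started at the barycenter is driven toward a simplex vertex, i.e. a unique label assignment.

    import numpy as np

    def fisher_rao_ascent(grad_f, p0, step=0.05, n_iter=500):
        # Discretized Riemannian gradient ascent on the open probability simplex
        # under the Fisher-Rao metric. The Riemannian gradient of f at p takes
        # the replicator form  p * df(p) - <p, df(p)> * p,  which is tangent to
        # the simplex, so the iterates stay on it (up to numerical drift) and
        # move toward a vertex, i.e. an unambiguous assignment.
        p = p0.copy()
        for _ in range(n_iter):
            g = grad_f(p)
            riem = p * g - np.dot(p, g) * p   # Fisher-Rao (natural) gradient
            p = p + step * riem
            p = np.clip(p, 1e-12, None)       # stay inside the open simplex
            p = p / p.sum()                    # correct numerical drift
        return p

    # Toy example: one pixel, three labels. The (hypothetical) data term rewards
    # the label whose prototype is closest to the observed datum.
    distances = np.array([0.9, 0.2, 0.7])      # illustrative data-to-label distances
    p = fisher_rao_ascent(lambda p: -distances, np.full(3, 1.0 / 3.0))
    print(np.round(p, 3))                       # mass concentrates on label index 1

In the paper's setting, one such simplex is attached to every pixel and the objective couples neighboring pixels, but the same multiplicative, simplex-preserving structure of the update carries over.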

Keywords

Image labeling · Assignment manifold · Fisher-Rao metric · Riemannian gradient flow

Acknowledgments

FÅ, SP and CS thank the German Research Foundation (DFG) for support via grant GRK 1653. BS was supported by the European Research Council (project SIGMA-Vision).


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Freddie Åström (1, 3)
  • Stefania Petra (1, 2)
  • Bernhard Schmitzer (4)
  • Christoph Schnörr (1, 3)

  1. HCI, Heidelberg University, Heidelberg, Germany
  2. MIG, Heidelberg University, Heidelberg, Germany
  3. IPA, Heidelberg University, Heidelberg, Germany
  4. CEREMADE, University Paris-Dauphine, Paris, France