
Journal of Mathematical Imaging and Vision, Volume 51, Issue 2, pp 311–325

An Inertial Forward-Backward Algorithm for Monotone Inclusions

  • Dirk A. Lorenz
  • Thomas Pock

Abstract

In this paper, we propose an inertial forward-backward splitting algorithm to compute a zero of the sum of two monotone operators, one of which is co-coercive. The algorithm is inspired by the accelerated gradient method of Nesterov, but it applies to a much larger class of problems, including convex-concave saddle-point problems and general monotone inclusions. We prove convergence of the algorithm in a Hilbert space setting and show that several recently proposed first-order methods can be obtained as special cases of the general algorithm. Numerical results show that the proposed algorithm converges faster than existing methods, while the computational cost of each iteration remains essentially unchanged.
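To make the iteration concrete, the following is a minimal sketch of a generic inertial forward-backward step applied to l1-regularized least squares, min_x 0.5*||Kx - b||^2 + mu*||x||_1: the gradient of the smooth term plays the role of the co-coercive operator, and soft-thresholding is the resolvent (proximal map) of the nonsmooth part. The fixed inertial weight alpha, the step size 1/||K||^2, and the helper names soft_threshold and inertial_forward_backward are illustrative assumptions, not the parameter rules analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Resolvent (proximal map) of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_forward_backward(K, b, mu, alpha=0.5, iters=500):
    # Generic inertial forward-backward iteration for
    #   min_x 0.5 * ||K x - b||^2 + mu * ||x||_1,
    # i.e. 0 in A(x) + B(x) with A = mu * d||.||_1 and B(x) = K^T (K x - b).
    # The choices of alpha and the step size below are illustrative only.
    L = np.linalg.norm(K, 2) ** 2          # Lipschitz constant of B
    lam = 1.0 / L                          # step size respecting co-coercivity
    x_prev = x = np.zeros(K.shape[1])
    for _ in range(iters):
        y = x + alpha * (x - x_prev)       # inertial (extrapolation) step
        grad = K.T @ (K @ y - b)           # forward step: evaluate B at y
        x_prev, x = x, soft_threshold(y - lam * grad, lam * mu)  # backward step
    return x

# Usage on synthetic data: recover a sparse vector from a few noisy measurements.
rng = np.random.default_rng(0)
K = rng.standard_normal((60, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = K @ x_true + 0.01 * rng.standard_normal(60)
x_hat = inertial_forward_backward(K, b, mu=0.1)
```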

Keywords

Convex optimization · Forward-backward splitting · Monotone inclusions · Primal-dual algorithms · Saddle-point problems · Image restoration

Notes

Acknowledgments

Thomas Pock acknowledges support from the Austrian Science Fund (FWF) under the project “Efficient Algorithms for Nonsmooth Optimization in Imaging”, No. I1148, and the FWF-START project “Bilevel Optimization for Computer Vision”, No. Y729. The authors wish to thank Antonin Chambolle for very helpful discussions.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Institute for Analysis and Algebra, TU Braunschweig, Braunschweig, Germany
  2. Institute for Computer Graphics and Vision, Graz University of Technology, Graz, Austria
  3. Safety & Security Department, AIT Austrian Institute of Technology GmbH, Vienna, Austria
