
iPiasco: Inertial Proximal Algorithm for Strongly Convex Optimization

Published in the Journal of Mathematical Imaging and Vision.

Abstract

In this paper, we present a forward–backward splitting algorithm with an additional inertial term for solving strongly convex optimization problems of a certain type. The strongly convex objective function is assumed to be the sum of a non-smooth convex and a smooth convex function. This additional knowledge is used to derive a worst-case convergence rate for the proposed algorithm, which is proved to be optimal with a linear rate of convergence. For certain problems, this linear rate is better than the provably optimal worst-case rate for smooth strongly convex functions. We demonstrate the efficiency of the proposed algorithm in numerical experiments and examples from image processing.
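The setting described in the abstract — minimizing a sum of a smooth strongly convex function and a non-smooth convex function by a forward–backward step plus an inertial (heavy-ball) term — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy lasso-type problem, the variable names, and the heavy-ball-style choice of the step size and inertia parameter from the constants \(\mu\) and \(L\) are assumptions made here for demonstration.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (soft shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def inertial_forward_backward(grad_f, prox_g, x0, alpha, beta, n_iter=2000):
    """Inertial forward-backward iteration:
       x_{k+1} = prox_{alpha g}(x_k + beta (x_k - x_{k-1}) - alpha grad_f(x_k))."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        y = x + beta * (x - x_prev) - alpha * grad_f(x)
        x_prev, x = x, prox_g(y, alpha)
    return x

# Toy lasso-type problem: f(x) = 0.5 ||A x - b||^2 is strongly convex because
# A has full column rank; g(x) = lam * ||x||_1 is non-smooth convex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1

eigs = np.linalg.eigvalsh(A.T @ A)       # ascending eigenvalues
mu, L = eigs[0], eigs[-1]                # strong convexity / Lipschitz constants

# Heavy-ball-style parameters for a mu-strongly convex, L-smooth f
# (an illustrative choice; consult the paper for its derived optimal values).
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda y, a: soft_threshold(y, a * lam)

x_star = inertial_forward_backward(grad_f, prox_g, np.zeros(10), alpha, beta)
```

A minimizer is a fixed point of the forward–backward operator, so convergence can be checked via the residual `x - prox_g(x - alpha * grad_f(x), alpha)`.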



Notes

  1. In general, this is analytically very challenging because \(\tilde{m}_\alpha \) also depends on \(\alpha \).

  2. Although FISTA is an accelerated method, a comparison is not completely fair since it does not exploit the strong convexity of the problem.
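To make the second footnote concrete: FISTA's extrapolation weight is built from a counter-based sequence \(t_k\) and never uses the strong-convexity constant \(\mu\), which is why a comparison against a \(\mu\)-aware method is not entirely fair. A minimal sketch (not the paper's code; the toy problem and all names here are assumptions for illustration):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(grad_f, prox_g, x0, L, n_iter=5000):
    """FISTA (Beck & Teboulle): the extrapolation weight (t - 1) / t_next
    depends only on the iteration counter, not on the strong-convexity
    constant mu, so the method cannot exploit strong convexity."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Toy lasso-type problem (hypothetical data, for illustration only).
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.eigvalsh(A.T @ A)[-1]      # Lipschitz constant of grad f

grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda y, a: soft_threshold(y, a * lam)

x_fista = fista(grad_f, prox_g, np.zeros(10), L)
```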

References

  1. Alvarez, F.: Weak convergence of a relaxed and inertial hybrid projection-proximal point algorithm for maximal monotone operators in Hilbert space. SIAM J. Optim. 14(3), 773–782 (2003)

  2. Alvarez, F., Attouch, H.: An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping. Set-Valued Anal. 9(1–2), 3–11 (2001)

  3. Attouch, H., Peypouquet, J., Redont, P.: A dynamical approach to an inertial forward-backward algorithm for convex minimization. SIAM J. Optim. 24(1), 232–256 (2014)

  4. Bauschke, H.H., Combettes, P.L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. CMS Books in Mathematics, Springer (2011)

  5. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)

  6. Bioucas-Dias, J.M., Figueiredo, M.: A new twist: two-step iterative shrinkage/thresholding algorithms for image restoration. IEEE Trans. Image Process. 16(12), 2992–3004 (2007)

  7. Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. J. Math. Imaging Vision 40(1), 120–145 (2011)

  8. Chambolle, A., Pock, T.: On the ergodic convergence rates of a first-order primal-dual algorithm (2014, to appear)

  9. Combettes, P.L., Wajs, V.R.: Signal recovery by proximal forward-backward splitting. Multiscale Modeling Simul. 4(4), 1168–1200 (2005)

  10. Combettes, P.L., Dũng, D., Vũ, B.C.: Dualization of signal recovery problems. Set-Valued Var. Anal. 18(3–4), 373–404 (2010)

  11. Daubechies, I., Defrise, M., De Mol, C.: An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Commun. Pure Appl. Math. 57(11), 1413–1457 (2004)

  12. Drori, Y., Teboulle, M.: Performance of first-order methods for smooth convex minimization: a novel approach. Math. Program. 145(1–2), 451–482 (2014)

  13. Gelfand, I.: Normierte Ringe. Rec. Math. [Mat. Sbornik] N.S. 9(51), 3–24 (1941)

  14. Goldfarb, D., Ma, S.: Fast multiple-splitting algorithms for convex optimization. SIAM J. Optim. 22(2), 533–556 (2012)

  15. He, B., Yuan, X.: On the \(O(1/n)\) convergence rate of the Douglas-Rachford alternating direction method. SIAM J. Numer. Anal. 50(2), 700–709 (2012)

  16. Hong, M., Luo, Z.-Q.: On the linear convergence of the alternating direction method of multipliers. arXiv e-prints (2012)

  17. Mainberger, M., Weickert, J.: Edge-based image compression with homogeneous diffusion. In: Jiang, X., Petkov, N. (eds.) Computer Analysis of Images and Patterns. Lecture Notes in Computer Science, vol. 5702, pp. 476–483. Springer, Berlin (2009)

  18. Moudafi, A., Oliny, M.: Convergence of a splitting inertial proximal method for monotone operators. J. Comput. Appl. Math. 155, 447–454 (2003)

  19. Nesterov, Y.: A method of solving a convex programming problem with convergence rate \(O(1/k^2)\). Soviet Math. Doklady 27, 372–376 (1983)

  20. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Applied Optimization, vol. 87. Kluwer Academic Publishers, Boston (2004)

  21. Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103(1), 127–152 (2005)

  22. Nesterov, Y.: Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM J. Optim. 22(2), 341–362 (2012)

  23. Nesterov, Y.: Gradient methods for minimizing composite functions. Math. Program. 140(1), 125–161 (2013)

  24. Ochs, P., Chen, Y., Brox, T., Pock, T.: iPiano: Inertial proximal algorithm for nonconvex optimization. SIAM J. Imaging Sci. 7(2), 1388–1419 (2014)

  25. Poljak, B.T.: Introduction to optimization. Optimization Software (1987)

  26. Polyak, B.T.: Some methods of speeding up the convergence of iteration methods. USSR Comput. Math. Math. Phys. 4(5), 1–17 (1964)

  27. Rudin, L.I., Osher, S., Fatemi, E.: Nonlinear total variation based noise removal algorithms. Phys. D 60, 259–268 (1992)

  28. Shor, N.Z.: Minimization Methods for Non-differentiable Functions. Springer-Verlag New York Inc, New York (1985)

  29. Vegeta from Dragon Ball Z. http://1.bp.blogspot.com/-g3wpzDWE0QI/UbycqlJWjDI/AAAAAAAAAbU/ty0EZl0kqXw/s1600/VEGETA%2B1.jpg

  30. Zavriev, S.K., Kostyuk, F.V.: Heavy-ball method in nonconvex optimization problems. Comput. Math. Model. 4(4), 336–341 (1993)


Acknowledgments

Thomas Pock acknowledges support from the Austrian Science Fund (FWF) under the START project BIVISION, No. Y729. Peter Ochs and Thomas Brox acknowledge funding by the German Research Foundation (DFG Grant BR 3815/5-1).

Author information

Correspondence to Peter Ochs.

About this article

Cite this article

Ochs, P., Brox, T. & Pock, T. iPiasco: Inertial Proximal Algorithm for Strongly Convex Optimization. J Math Imaging Vis 53, 171–181 (2015). https://doi.org/10.1007/s10851-015-0565-0
