Convergence rate of inertial Forward–Backward algorithm beyond Nesterov’s rule

  • Vassilis Apidopoulos
  • Jean-François Aujol
  • Charles Dossal
Full Length Paper Series A

Abstract

In this paper we study the convergence of an inertial Forward–Backward algorithm with a particular choice of over-relaxation term. In particular, we show that for a sequence of over-relaxation parameters that does not satisfy Nesterov's rule, one can still obtain relatively fast convergence of the objective function. In addition, we complement this work by studying the convergence of the algorithm when the proximal operator is computed inexactly, in the presence of errors, and we give sufficient conditions on these errors that guarantee convergence properties for the objective function.
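To fix ideas, the inertial Forward–Backward scheme alternates a gradient step on the smooth part with a proximal step on the nonsmooth part, followed by an extrapolation whose coefficient is governed by the over-relaxation parameters. The sketch below is a minimal illustration, not the paper's exact scheme: it uses the common inertial coefficient n/(n + alpha) (Nesterov's rule corresponding to the regime alpha ≥ 3) and a soft-thresholding prox on a LASSO toy problem; all names and the test problem are assumptions for illustration.

```python
import numpy as np

def inertial_fb(grad_f, prox_g, L, x0, alpha=3.0, n_iter=200):
    """Minimal inertial Forward-Backward sketch:
        x_{n+1} = prox_{g/L}(y_n - grad_f(y_n)/L)
        y_{n+1} = x_{n+1} + (n/(n + alpha)) * (x_{n+1} - x_n)
    Here alpha parametrises the over-relaxation; alpha >= 3 matches the
    classical Nesterov-type regime, and the paper studies choices beyond it.
    """
    x_prev = x0.copy()
    y = x0.copy()
    step = 1.0 / L  # step size from the Lipschitz constant of grad_f
    for n in range(1, n_iter + 1):
        x = prox_g(y - step * grad_f(y), step)     # forward-backward step
        y = x + (n / (n + alpha)) * (x - x_prev)   # inertial extrapolation
        x_prev = x
    return x

# Toy problem: LASSO, min 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
# prox of t*lam*||.||_1 is componentwise soft-thresholding
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad_f

x_star = inertial_fb(grad_f, prox_g, L, np.zeros(20))
```

The same skeleton covers the inexact setting studied in the paper by replacing `prox_g` with an approximate proximal map whose error is controlled along the iterations.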

Keywords

Convex optimization · Proximal operator · Inertial FB algorithm · Nesterov's rule · Rate of convergence

Mathematics Subject Classification

49M20 · 46N10 · 90C25 · 65K10

Notes

Acknowledgements

The authors would like to thank the anonymous reviewers for their useful comments and advice, and for pointing out some important references.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society 2018

Authors and Affiliations

  1. IMB, UMR 5251, Université de Bordeaux, Talence, France