Convergence rate of inertial Forward–Backward algorithm beyond Nesterov’s rule
In this paper we study the convergence of an inertial Forward–Backward algorithm with a particular choice of over-relaxation term. In particular, we show that for a sequence of over-relaxation parameters that does not satisfy Nesterov’s rule, one can still expect relatively fast convergence of the objective function. We complement this work by studying the convergence of the algorithm when the proximal operator is computed inexactly, and we give sufficient conditions on these errors that guarantee convergence properties for the objective function.
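As a point of reference for the iteration studied in the abstract, the following is a minimal sketch of a generic inertial Forward–Backward scheme for minimizing \(F = f + g\), applied to a small LASSO instance. The specific over-relaxation sequence `alpha(k) = k/(k + 4)` used here is only an illustrative choice that departs from Nesterov’s rule; it is not taken from the paper, and all function names are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_fb(grad_f, prox_g, L, x0, alpha, n_iter=200):
    """Generic inertial Forward-Backward iteration.

    alpha(k) is a user-supplied over-relaxation sequence; the step
    size is 1/L, where L is a Lipschitz constant of grad_f.
    """
    gamma = 1.0 / L
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iter + 1):
        y = x + alpha(k) * (x - x_prev)          # inertial extrapolation
        x_prev, x = x, prox_g(y - gamma * grad_f(y), gamma)
    return x

# Small LASSO instance: min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.5
L = np.linalg.norm(A.T @ A, 2)                   # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, g: soft_threshold(v, lam * g)

# Illustrative over-relaxation sequence NOT following Nesterov's rule.
x_star = inertial_fb(grad_f, prox_g, L, np.zeros(10), lambda k: k / (k + 4))
F_star = 0.5 * np.linalg.norm(A @ x_star - b) ** 2 + lam * np.abs(x_star).sum()
```

Under Nesterov’s rule one would instead take `alpha(k) = (t_k - 1)/t_{k+1}` with `t_{k+1} = (1 + sqrt(1 + 4 t_k**2))/2`; the point of the paper is that other sequences can still yield fast objective-function convergence.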
Keywords: Convex optimization · Proximal operator · Inertial FB algorithm · Nesterov’s rule · Rate of convergence
Mathematics Subject Classification: 49M20 · 46N10 · 90C25 · 65K10
The authors would like to thank the anonymous reviewers for their useful comments and advice, and for pointing out some important references.