Convergence Rate of Proximal Inertial Algorithms Associated with Moreau Envelopes of Convex Functions

  • Hedy Attouch
  • Juan Peypouquet
Chapter

Abstract

In a Hilbert space setting \({\mathcal H}\), we develop new inertial proximal-based algorithms that aim to rapidly minimize a convex, lower-semicontinuous, proper function \(\varPhi : \mathcal H \rightarrow {\mathbb R} \cup \{+\infty \}\). The guiding idea is to use an accelerated proximal scheme where, at each step, Φ is replaced by its Moreau envelope, with varying approximation parameter. This leads us to consider a Relaxed Inertial Proximal Algorithm (RIPA) with variable parameters which take into account the effects of inertia, relaxation, and approximation. (RIPA) was first introduced to solve general maximally monotone inclusions, in which case a judicious adjustment of the parameters makes it possible to obtain the convergence of the iterates towards equilibria. In the case of convex minimization problems, the convergence analysis of (RIPA) was initially addressed by Attouch and Cabot, based on its formulation as an inertial gradient method with varying potential functions. We propose a new approach to this algorithm, along with further developments, based on its formulation as a proximal algorithm associated with varying Moreau envelopes. For convenient choices of the parameters, we show the fast optimization property of the function values, of order \(o(k^{-2})\), and the weak convergence of the iterates. This is in line with the recent studies of Su-Boyd-Candès, Chambolle-Dossal, and Attouch-Peypouquet. We study the impact of geometric assumptions on the convergence rates, and the stability of the results with respect to perturbations and errors. Finally, for structured minimization problems of the form "smooth + nonsmooth", we use this approach to introduce new inertial proximal-gradient algorithms for which similar convergence rates are shown.
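To make the guiding idea concrete, the following is a minimal numerical sketch (in Python) of an inertial, relaxed proximal step driven by Moreau envelopes; it is illustrative only, not the exact (RIPA) parameterization studied in the chapter. It relies on the standard identity \(\nabla \varPhi_\lambda(y) = (y - \mathrm{prox}_{\lambda \varPhi}(y))/\lambda\), so a gradient step of size \(s\) on the envelope \(\varPhi_\lambda\) is a relaxed proximal step with relaxation \(\rho = s/\lambda\). The names ripa and prox_l1, the toy objective, and the parameter values lam, rho, alpha are assumptions made for the example.

```python
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ripa(x0, prox, lam=0.5, rho=1.0, alpha=3.1, n_iter=200):
    """Illustrative inertial, relaxed proximal iteration.

    A gradient step of size s on the Moreau envelope Phi_lam equals a
    relaxed proximal step with relaxation rho = s / lam, because
    grad Phi_lam(y) = (y - prox_{lam * Phi}(y)) / lam.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, n_iter + 1):
        beta = (k - 1.0) / (k + alpha - 1.0)      # inertial coefficient
        y = x + beta * (x - x_prev)               # extrapolated point
        x_prev = x
        x = (1.0 - rho) * y + rho * prox(y, lam)  # relaxed proximal step
    return x

# Toy problem (hypothetical): minimize Phi(x) = ||x - b||_1, minimizer b.
b = np.array([1.0, -2.0, 3.0])
prox_phi = lambda v, t: b + prox_l1(v - b, t)     # prox of the shifted l1 norm
print(np.round(ripa(np.zeros(3), prox_phi), 4))   # -> approximately b
```

The extrapolation coefficient \((k-1)/(k+\alpha-1)\) with \(\alpha > 3\) corresponds to the regime in which the cited works of Chambolle-Dossal and Attouch-Peypouquet obtain convergence of the iterates and the \(o(k^{-2})\) rate for the function values.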

Keywords

Inertial proximal algorithms · Lyapunov analysis · Maximally monotone operators · Moreau envelopes · Nesterov accelerated gradient method · Nonsmooth convex minimization · Proximal-gradient algorithms · Relaxation

AMS Subject Classification

37N40 · 46N10 · 49M30 · 65K05 · 65K10 · 90B50 · 90C25

Acknowledgements

This work was supported by FONDECYT Grant 1181179 and CMM-Conicyt PIA AFB170001.

References

  1. Álvarez, F.: On the minimizing property of a second-order dissipative system in Hilbert spaces. SIAM J. Control Optim. 38, 1102–1119 (2000)
  2. Álvarez, F., Attouch, H., Bolte, J., Redont, P.: A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics. J. Math. Pures Appl. 81, 747–779 (2002)
  3. Apidopoulos, V., Aujol, J.-F., Dossal, Ch.: Convergence rate of inertial Forward-Backward algorithm beyond Nesterov's rule. Math. Prog. (Ser. A), 1–20 (2018)
  4. Attouch, H.: Variational Convergence for Functions and Operators. Pitman (1984)
  5. Attouch, H., Bolte, J., Redont, P.: Optimizing properties of an inertial dynamical system with geometric damping. Link with proximal methods. Control Cybernet. 31, 643–657 (2002)
  6. Attouch, H., Cabot, A.: Asymptotic stabilization of inertial gradient dynamics with time-dependent viscosity. J. Differential Equations 263, 5412–5458 (2017)
  7. Attouch, H., Cabot, A.: Convergence rates of inertial forward-backward algorithms. SIAM J. Optim. 28, 849–874 (2018)
  8. Attouch, H., Cabot, A.: Convergence of damped inertial dynamics governed by regularized maximally monotone operators. J. Differential Equations, to appear. HAL-01648383v2 (2018)
  9. Attouch, H., Cabot, A.: Convergence of a relaxed inertial proximal algorithm for maximally monotone operators. HAL-01708905 (2018)
  10. Attouch, H., Cabot, A.: Convergence rate of a relaxed inertial proximal algorithm for convex minimization. HAL-01807041 (2018)
  11. Attouch, H., Cabot, A., Redont, P.: The dynamics of elastic shocks via epigraphical regularization of a differential inclusion. Adv. Math. Sci. Appl. 12, 273–306 (2002)
  12. Attouch, H., Cabot, A., Chbani, Z., Riahi, H.: Accelerated forward-backward algorithms with perturbations. Application to Tikhonov regularization. J. Optim. Theory Appl. 179, 1–36 (2018)
  13. Attouch, H., Chbani, Z., Peypouquet, J., Redont, P.: Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity. Math. Prog. (Ser. B) 168, 123–175 (2018)
  14. Attouch, H., Chbani, Z., Riahi, H.: Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3. ESAIM: COCV 25 (2019)
  15. Attouch, H., Peypouquet, J.: The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\). SIAM J. Optim. 26, 1824–1834 (2016)
  16. Attouch, H., Peypouquet, J.: Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators. Math. Prog. 174, 319–432 (2019)
  17. Attouch, H., Peypouquet, J., Redont, P.: Fast convex minimization via inertial dynamics with Hessian driven damping. J. Differential Equations 261, 5734–5783 (2016)
  18. Aujol, J.-F., Dossal, Ch.: Stability of over-relaxations for the Forward-Backward algorithm, application to FISTA. SIAM J. Optim. 25, 2408–2433 (2015)
  19. Baillon, J.-B.: Un exemple concernant le comportement asymptotique de la solution du problème \(\frac {du}{dt} + \partial \phi (u) \ni 0\). J. Functional Anal. 28, 369–376 (1978)
  20. Bauschke, H.H., Combettes, P.L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer (2011)
  21. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2, 183–202 (2009)
  22. Beck, A., Teboulle, M.: Gradient-based algorithms with applications in signal recovery problems. In: Palomar, D., Eldar, Y. (eds.) Convex Optimization in Signal Processing and Communications, pp. 33–88. Cambridge University Press (2010)
  23. Bolte, J., Daniilidis, A., Ley, O., Mazet, L.: Characterizations of Łojasiewicz inequalities and applications. Trans. AMS 362, 3319–3363 (2010)
  24. Bolte, J., Nguyen, T.P., Peypouquet, J., Suter, B.: From error bounds to the complexity of first-order descent methods for convex functions. Math. Prog. 165, 471–507 (2017)
  25. Brézis, H.: Opérateurs maximaux monotones dans les espaces de Hilbert et équations d'évolution. North-Holland (1972)
  26. Chambolle, A., Dossal, Ch.: On the convergence of the iterates of the Fast Iterative Shrinkage/Thresholding Algorithm. J. Optim. Theory Appl. 166, 968–982 (2015)
  27. Drori, Y., Teboulle, M.: Performance of first-order methods for smooth convex minimization: a novel approach. Math. Prog. (Ser. A) 145, 451–482 (2014)
  28. Güler, O.: On the convergence of the proximal point algorithm for convex minimization. SIAM J. Control Optim. 29, 403–419 (1991)
  29. Imbert, C.: Convex analysis techniques for Hopf-Lax formulae in Hamilton-Jacobi equations. J. Nonlinear Convex Anal. 2, 333–343 (2001)
  30. Kim, D., Fessler, J.A.: Optimized first-order methods for smooth convex minimization. Math. Prog., to appear. DOI 10.1007/s10107-015-0949-3
  31. Liang, J., Fadili, J., Peyré, G.: Local linear convergence of forward-backward under partial smoothness. Advances in Neural Information Processing Systems, 1970–1978 (2014)
  32. May, R.: Asymptotic for a second-order evolution equation with convex potential and vanishing damping term. Turkish J. Math. 41, 681–685 (2017)
  33. Nesterov, Y.: A method of solving a convex programming problem with convergence rate \(O(1/k^2)\). Soviet Math. Doklady 27, 372–376 (1983)
  34. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Applied Optimization, vol. 87. Kluwer Academic Publishers, Boston, MA (2004)
  35. Parikh, N., Boyd, S.: Proximal algorithms. Foundations and Trends in Optimization 1, 123–231 (2013)
  36. Peypouquet, J.: Convex Optimization in Normed Spaces: Theory, Methods and Examples. Springer (2015)
  37. Villa, S., Salzo, S., Baldassarre, L., Verri, A.: Accelerated and inexact forward-backward algorithms. SIAM J. Optim. 23, 1607–1633 (2013)
  38. Su, W., Boyd, S., Candès, E.J.: A differential equation for modeling Nesterov's accelerated gradient method: theory and insights. Neural Information Processing Systems 27, 2510–2518 (2014)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. IMAG, Univ. Montpellier, CNRS, Montpellier, France
  2. Departamento de Ingeniería Matemática & Centro de Modelamiento Matemático (CNRS UMI 2807), FCFM, Universidad de Chile, Santiago, Chile