
Fast Convex Optimization via Differential Equation with Hessian-Driven Damping and Tikhonov Regularization

Published in: Journal of Optimization Theory and Applications


In this paper, we consider a class of second-order ordinary differential equations with Hessian-driven damping and Tikhonov regularization, arising from the minimization of a smooth convex function on a Hilbert space. Inspired by Attouch et al. (J Differ Equ 261:5734–5783, 2016), we establish that the function value along the solution trajectory converges to the optimal value, with a convergence rate as fast as \(o(1/t^2)\). By constructing a suitable energy function, we prove that the trajectory converges strongly to the minimum-norm minimizer of the objective function. Moreover, we propose a gradient-based optimization algorithm obtained by numerical discretization of the dynamics and demonstrate its effectiveness in numerical experiments.
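To illustrate the kind of discretization the abstract refers to (this is a minimal sketch, not the authors' actual algorithm), dynamics in this literature typically take the form \(\ddot{x}(t) + \frac{\alpha}{t}\dot{x}(t) + \beta \nabla^2 f(x(t))\dot{x}(t) + \nabla f(x(t)) + \frac{c}{t^2}x(t) = 0\), and the Hessian-driven damping term can be realized with first-order information only, via the identity \(\nabla^2 f(x(t))\dot{x}(t) = \frac{d}{dt}\nabla f(x(t))\), i.e. a gradient difference. The function name, the coefficient values, and the explicit scheme below are all illustrative assumptions:

```python
import numpy as np

def inertial_tikhonov(grad, x0, alpha=3.0, beta=1.0, c=1.0,
                      h=0.1, t0=1.0, iters=2000):
    """Explicit first-order scheme (illustrative, not the paper's method) for
        x'' + (alpha/t) x' + beta * d/dt[grad f(x)] + grad f(x) + (c/t^2) x = 0.
    The Hessian-driven damping is applied as a gradient difference,
    so no second derivatives are ever computed."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)          # velocity x'(t)
    g_prev = grad(x)
    for k in range(iters):
        t = t0 + k * h
        g = grad(x)
        # viscous damping + gradient + vanishing Tikhonov term,
        # plus Hessian-driven damping as beta * (g_k - g_{k-1})
        v -= h * ((alpha / t) * v + g + (c / t**2) * x) + beta * (g - g_prev)
        x = x + h * v
        g_prev = g
    return x

# f(x) = 0.5 * ||x - b||^2 has its unique minimizer at b
b = np.array([1.0, -2.0])
x_star = inertial_tikhonov(lambda x: x - b, x0=np.array([5.0, 5.0]))
```

On this strongly convex quadratic the iterates settle near the unique minimizer; the vanishing \(c/t^2\) term is what biases the trajectory toward the minimum-norm solution when the set of minimizers is not a singleton.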


Algorithm 1 and Figs. 1–5 appear in the full text.

References
  1. Attouch, H.: Viscosity solutions of minimization problems. SIAM J. Optim. 6, 769–806 (1996).

  2. Attouch, H., Balhag, A., Chbani, Z., Riahi, H.: Accelerated gradient methods combining Tikhonov regularization with geometric damping driven by the Hessian. Appl. Math. Optim. 88, 1–29 (2023).

  3. Attouch, H., Balhag, A., Chbani, Z., Riahi, H.: Damped inertial dynamics with vanishing Tikhonov regularization: strong asymptotic convergence towards the minimum norm solution. J. Differ. Equ. 311, 29–58 (2022).

  4. Attouch, H., Chbani, Z., Fadili, J., Riahi, H.: First-order optimization algorithms via inertial systems with Hessian driven damping. Math. Program. (2020).

  5. Attouch, H., Chbani, Z., Peypouquet, J., Redont, P.: Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity. Math. Program. 168, 123–175 (2018).

  6. Attouch, H., Chbani, Z., Riahi, H.: Combining fast inertial dynamics for convex optimization with Tikhonov regularization. J. Math. Anal. Appl. 457, 1065–1094 (2018).

  7. Attouch, H., Chbani, Z., Riahi, H.: Rate of convergence of the Nesterov accelerated gradient method in the subcritical case \(\alpha \le 3\). ESAIM Control Optim. Calc. Var. 25, 2–35 (2019).

  8. Attouch, H., Laszlo, S.: Convex optimization via inertial algorithms with vanishing Tikhonov regularization: fast convergence to the minimum norm solution. arXiv:2104.11987 (2021)

  9. Attouch, H., Peypouquet, J., Redont, P.: Fast convex optimization via inertial dynamics with Hessian driven damping. J. Differ. Equ. 261, 5734–5783 (2016).

  10. Boţ, R.I., Csetnek, E.R., László, S.C.: Tikhonov regularization of a second order dynamical system with Hessian driven damping. Math. Program. 189, 151–186 (2021).

  11. Bottou, L., Curtis, F.E., Nocedal, J.: Optimization methods for large-scale machine learning. SIAM Rev. 60, 223–311 (2018).

  12. Brezis, H.: Functional Analysis, Sobolev Spaces and Partial Differential Equations. Springer, Berlin (2010)

  13. Hairer, E., Lubich, C., Wanner, G.: Geometric Numerical Integration: Structure-Preserving Algorithms for Ordinary Differential Equations. Springer, Berlin (2006)

  14. Lin, Z., Li, H., Fang, C.: Accelerated Optimization for Machine Learning: First-Order Algorithms. Springer, Berlin (2020)

  15. May, R.: Asymptotic for a second-order evolution equation with convex potential and vanishing damping term. Turk. J. Math. 41, 681–685 (2017).

  16. May, R., Mnasri, C., Elloumi, M.: Asymptotic for a second order evolution equation with damping and regularizing terms. AIMS Math. 6, 4901–4914 (2021).

  17. Nesterov, Y.: A method of solving a convex programming problem with convergence rate \(O(\frac{1}{k^2})\). Soviet Math. Dokl. 27, 372–376 (1983)

  18. Nesterov, Y.: Lectures on Convex Optimization, 2nd edn. Springer Optimization and Its Applications, vol. 137. Springer, Cham (2018)

  19. Sontag, E.D.: Mathematical Control Theory: Deterministic Finite Dimensional Systems, vol. 6. Springer, Berlin (2013)

  20. Sra, S., Nowozin, S., Wright, S.J.: Optimization for Machine Learning. MIT Press, Cambridge (2012)

  21. Su, W., Boyd, S., Candès, E.J.: A differential equation for modeling Nesterov’s accelerated gradient method: theory and insights. J. Mach. Learn. Res. 17, 1–43 (2016)

  22. Xu, B., Wen, B.: On the convergence of a class of inertial dynamical systems with Tikhonov regularization. Optim. Lett. 15, 2025–2052 (2020).

  23. Zhang, J., Mokhtari, A., Sra, S., Jadbabaie, A.: Direct Runge-Kutta discretization achieves acceleration. arXiv:1805.00521 (2021)



The authors are supported by the National Natural Science Foundation of China (Grant No. 12071160).

Author information


Corresponding author

Correspondence to Ming Tang.

Additional information

Communicated by Akhtar A. Khan.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Zhong, G., Hu, X., Tang, M. et al. Fast Convex Optimization via Differential Equation with Hessian-Driven Damping and Tikhonov Regularization. J Optim Theory Appl (2024).

