Computational Optimization and Applications, Volume 74, Issue 3, pp 729–746

A delayed weighted gradient method for strictly convex quadratic minimization

  • Harry Fernando Oviedo Leon

Abstract

In this paper, an accelerated version of the steepest descent method is developed by means of a two-step iteration. The new algorithm uses delayed information to define its iterates. Specifically, in the first step, a prediction of the new trial point is computed by a gradient step with the exact minimal gradient steplength; then, a correction is computed as a weighted sum of the prediction and the iterate preceding the current point. A convergence result is provided. Numerical experiments are performed in order to compare the efficiency and effectiveness of the proposal with similar methods from the literature. A numerical comparison of the new algorithm with the classical conjugate gradient method shows that our method is a good alternative for solving large-scale problems.
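
The abstract describes enough of the two-step scheme to sketch it for f(x) = 0.5 x'Ax - b'x with A symmetric positive definite (gradient g = Ax - b). The Python sketch below is an illustration under stated assumptions, not the paper's exact algorithm: the exact minimal gradient steplength is taken as alpha_k = (g_k' A g_k) / ||A g_k||^2, the correction weight beta_k is assumed to be chosen to minimize the norm of the resulting gradient (the abstract does not specify this choice), and the predecessor iterate needed at the first step is bootstrapped with one plain minimal-gradient step. The function name delayed_weighted_gradient is likewise hypothetical.

    import numpy as np

    def delayed_weighted_gradient(A, b, x0, tol=1e-8, max_iter=10000):
        # Minimize f(x) = 0.5*x'Ax - b'x, with A symmetric positive definite.
        x_prev = np.asarray(x0, dtype=float)
        g_prev = A @ x_prev - b
        if np.linalg.norm(g_prev) <= tol:
            return x_prev
        # Bootstrap: one minimal-gradient step supplies the predecessor
        # iterate the two-step scheme needs (an assumed initialization).
        w = A @ g_prev
        alpha = (g_prev @ w) / (w @ w)   # exact minimal-gradient steplength
        x = x_prev - alpha * g_prev
        g = g_prev - alpha * w
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            w = A @ g                    # the only A-product per iteration
            alpha = (g @ w) / (w @ w)    # minimizes ||g - alpha*(A @ g)||
            y = x - alpha * g            # step 1: prediction
            r = g - alpha * w            # gradient at the prediction y
            d = g_prev - r
            beta = (g_prev @ d) / (d @ d)  # assumed weight: minimizes
                                           # ||beta*r + (1-beta)*g_prev||
            # Step 2: correction, a weighted sum of the prediction and the
            # predecessor of the current iterate; the gradient follows suit.
            x, x_prev = beta * y + (1.0 - beta) * x_prev, x
            g, g_prev = beta * r + (1.0 - beta) * g_prev, g
        return x

Because the corrected iterate is an affine combination of the prediction and the predecessor, its gradient is the same affine combination of their gradients; each iteration therefore needs only the single matrix-vector product A @ g, the same per-iteration cost as steepest descent.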

Keywords

Gradient methods · Convex quadratic optimization · Linear system of equations

Mathematics Subject Classification

90C20 · 90C25 · 90C52 · 65F10

Acknowledgements

I would like to thank Dr. Marcos Raydan for his helpful comments and suggestions on this work, and for sending me pertinent information. The author would also like to thank Dr. Hugo Lara and two anonymous referees for their useful suggestions and comments.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Mathematics Research Center, CIMAT A.C., Guanajuato, Mexico
