Adaptive Two-Point Stepsize Gradient Algorithm

Abstract

Combined with the nonmonotone line search, the two-point stepsize gradient method has been successfully applied to large-scale unconstrained optimization. However, the numerical performance of the algorithm depends heavily on M, one of the parameters in the nonmonotone line search, even for ill-conditioned problems. This paper proposes an adaptive nonmonotone line search. The two-point stepsize gradient method is shown to be globally convergent with this adaptive nonmonotone line search. Numerical results show that the adaptive nonmonotone line search is especially suitable for the two-point stepsize gradient method.
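
Since the full text is subscription-only, the following is a minimal sketch, in Python, of the background method the abstract refers to: the Barzilai-Borwein two-point stepsize gradient method globalized by a Grippo-Lampariello-Lucidi-style nonmonotone line search with a fixed memory parameter M (the parameter whose tuning motivates this paper). The adaptive rule for M proposed by the authors is not reproduced here; the function name bb_nonmonotone and all parameter values below are illustrative assumptions, not the authors' implementation.

import numpy as np

def bb_nonmonotone(f, grad, x0, M=10, sigma=1e-4, tol=1e-6, max_iter=1000):
    """Barzilai-Borwein gradient method with a fixed-memory (GLL-style)
    nonmonotone line search. Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    fvals = [f(x)]          # recent objective values; window of length M
    alpha = 1.0             # initial trial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -g                              # steepest-descent direction
        fmax = max(fvals[-M:])              # reference value for the nonmonotone test
        lam = alpha
        # Nonmonotone Armijo condition: sufficient decrease with respect to the
        # maximum of the last (at most) M objective values.
        while f(x + lam * d) > fmax + sigma * lam * g.dot(d):
            lam *= 0.5                      # simple backtracking
        x_new = x + lam * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Two-point (Barzilai-Borwein) stepsize for the next iteration:
        # alpha = s^T s / s^T y, safeguarded against s^T y <= 0.
        sty = s.dot(y)
        alpha = s.dot(s) / sty if sty > 1e-30 else 1.0
        x, g = x_new, g_new
        fvals.append(f(x))
    return x

# Example on a mildly ill-conditioned quadratic.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0, 100.0])
    f = lambda x: 0.5 * x.dot(A @ x)
    grad = lambda x: A @ x
    print(bb_nonmonotone(f, grad, np.ones(3)))

In this standard globalization, in the spirit of Raydan's globalized BB method, the BB stepsize serves as the first trial step of each iteration and backtracking intervenes only when the nonmonotone sufficient-decrease test fails; a larger M makes that test more permissive, which relates to the sensitivity to M noted in the abstract.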


Cite this article

Dai, Y.H., Zhang, H. Adaptive Two-Point Stepsize Gradient Algorithm. Numerical Algorithms 27, 377–385 (2001). https://doi.org/10.1023/A:1013844413130
