Mathematical Programming, Volume 103, Issue 3, pp 541–559

On the asymptotic behaviour of some new gradient methods

Abstract.

The Barzilai-Borwein (BB) gradient method and some other new gradient methods have shown themselves to be competitive with conjugate gradient methods for solving large-dimension nonlinear unconstrained optimization problems. Little is known about their asymptotic behaviour, however, even when they are applied to n-dimensional quadratic functions, except in the case n=2. We show in the quadratic case how it is possible to compute this asymptotic behaviour, and observe that as n increases there is a transition from superlinear to linear convergence at some value of n≥4, depending on the method. By neglecting certain terms in the recurrence relations we define simplified versions of the methods, which are able to predict this transition. The simplified methods also predict that for larger values of n, the eigencomponents of the gradient vectors converge in modulus to a common value, which is similar to a property observed to hold in the real methods. Some unusual and interesting recurrence relations are analysed in the course of the study.
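
For reference, the following is a minimal sketch of the BB gradient method applied to a convex quadratic, the setting studied in the paper. It uses the standard BB1 step length, which on a quadratic f(x) = ½xᵀAx − bᵀx reduces to the Rayleigh quotient ratio gᵀg / gᵀAg of the previous gradient. The function name `bb_gradient_quadratic`, the test matrix, the starting step, and the tolerances are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (assumed, not from the paper): BB gradient method on a
# convex quadratic f(x) = 0.5*x^T A x - b^T x, with gradient g = A x - b.
# The BB1 step alpha_k = s^T s / s^T y (s = x_k - x_{k-1}, y = g_k - g_{k-1})
# simplifies to g_{k-1}^T g_{k-1} / g_{k-1}^T A g_{k-1} on a quadratic.

import numpy as np

def bb_gradient_quadratic(A, b, x0, tol=1e-10, max_iter=500):
    x = x0.copy()
    g = A @ x - b                                  # gradient of the quadratic
    alpha = 1.0 / np.max(np.linalg.eigvalsh(A))    # safe illustrative first step
    history = []
    for _ in range(max_iter):
        history.append(np.linalg.norm(g))
        if history[-1] < tol:
            break
        g_old = g
        x = x - alpha * g                          # gradient step with BB step length
        g = A @ x - b
        # BB1 step length for the next iteration (quadratic form):
        alpha = (g_old @ g_old) / (g_old @ (A @ g_old))
    return x, history

if __name__ == "__main__":
    # Small diagonal example: the eigencomponents of g are simply its entries.
    A = np.diag([1.0, 2.0, 5.0, 10.0])
    b = np.zeros(4)
    x0 = np.ones(4)
    x, hist = bb_gradient_quadratic(A, b, x0)
    print(f"iterations: {len(hist)}, final ||g||: {hist[-1]:.2e}")
```

With a diagonal A, printing the entries of g at each iteration makes it easy to observe the kind of eigencomponent behaviour the abstract describes.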

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  1. State Key Laboratory of Scientific and Engineering Computing, Institute of Computational Mathematics and Scientific/Engineering Computing, Academy of Mathematics and System Sciences, Chinese Academy of Sciences, Beijing, PR China
  2. Department of Mathematics, University of Dundee, Dundee, Scotland, UK
