
Optimization Letters, Volume 5, Issue 4, pp 615–630

Global convergence of some modified PRP nonlinear conjugate gradient methods

  • Zhi-feng Dai
  • Bo-Shi Tian
Original Paper

Abstract

Recently, following Hager and Zhang (SIAM J Optim 16:170–192, 2005), Yu (Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems. Doctoral thesis, Sun Yat-Sen University, 2007) and Yuan (Optim Lett 3:11–21, 2009) proposed modified PRP conjugate gradient methods that generate sufficient descent directions without any line search. To obtain global convergence of their algorithms, however, they need the assumption that the stepsize is bounded away from zero. In this paper, we slightly modify these methods so that the modified methods retain the sufficient descent property. Without requiring a positive lower bound on the stepsize, we prove that the proposed methods are globally convergent. Some numerical results are also reported.
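The class of methods the abstract refers to can be sketched as follows. This is a minimal illustration in the style of the Zhang–Zhou–Li three-term modified PRP direction (reference [15] below), which the methods under discussion build on; the function name `mprp_cg`, the Armijo backtracking constants, and the test problem are illustrative choices, not the paper's actual algorithm or parameter settings.

```python
import numpy as np

def mprp_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Three-term modified PRP conjugate gradient sketch.

    The direction d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k with
    beta = g_{k+1}^T y_k / ||g_k||^2 and theta = g_{k+1}^T d_k / ||g_k||^2
    satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 by construction,
    i.e. sufficient descent holds independently of the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking Armijo line search (illustrative constants).
        alpha, fx, gd = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * gd and alpha > 1e-16:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                  # gradient difference y_k
        gnorm2 = g @ g                 # ||g_k||^2 (nonzero: tol check above)
        beta = (g_new @ y) / gnorm2    # classical PRP parameter
        theta = (g_new @ d) / gnorm2
        # Three-term direction guaranteeing sufficient descent.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

On a small strictly convex quadratic the iteration drives the gradient to zero; the point of the construction is that the descent guarantee comes from the direction formula alone, not from properties of the stepsize.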

Keywords

Unconstrained optimization · Conjugate gradient method · Line search · Sufficient descent property · Global convergence


References

  1. Hestenes M.R., Stiefel E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. Sect. B 49, 409–432 (1952)
  2. Fletcher R., Reeves C.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  3. Polak E., Ribière G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Fr. Inform. Rech. Opér. 3e Année 16, 35–43 (1969)
  4. Polyak B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
  5. Dai Y., Yuan Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (2000)
  6. Al-Baali M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5, 121–124 (1985)
  7. Dai Y., Yuan Y.: Convergence properties of the Fletcher–Reeves method. IMA J. Numer. Anal. 16(2), 155–164 (1996)
  8. Gilbert J.C., Nocedal J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21–42 (1992)
  9. Grippo L., Lucidi S.: A globally convergent version of the Polak–Ribière gradient method. Math. Program. 78, 375–391 (1997)
  10. Hager W.W., Zhang H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
  11. Hager W.W., Zhang H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
  12. Yu G.H.: Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems. Doctoral thesis, Sun Yat-Sen University (2007)
  13. Yuan G.L.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Lett. 3, 11–21 (2009)
  14. Zhang L., Zhou W., Li D.H.: Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104, 561–572 (2006)
  15. Zhang L., Zhou W., Li D.H.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26, 629–640 (2006)
  16. Andrei N.: A Dai–Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization. Appl. Math. Lett. 21, 165–171 (2008)
  17. Andrei N.: A hybrid conjugate gradient algorithm for unconstrained optimization as a convex combination of Hestenes–Stiefel and Dai–Yuan. Stud. Inf. Control 17, 55–70 (2008)
  18. Zhang J., Xiao Y., Wei Z.: Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization. Math. Probl. Eng., Article ID 243290, 16 p. (2009). doi:10.1155/2009/243290
  19. Bongartz I., Conn A.R., Gould N.I.M., Toint P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)
  20. Dolan E.D., Moré J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  21. Zoutendijk G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)

Copyright information

© Springer-Verlag 2010

Authors and Affiliations

  1. College of Mathematics and Econometrics, Hunan University, Changsha, China
  2. College of Mathematics and Computational Science, Changsha University of Science and Technology, Changsha, China
