Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

In this article, based on a modified secant equation, we propose a modified Hestenes-Stiefel (HS) conjugate gradient method whose form is similar to that of the CG-DESCENT method proposed by Hager and Zhang (SIAM J Optim 16:170–192, 2005). The proposed method generates sufficient descent directions without relying on any line search. Under some mild conditions, we show that it is globally convergent with the Armijo line search. Moreover, we establish the R-linear convergence rate of the modified HS method. Preliminary numerical results show that the proposed method is promising and competitive with the well-known CG-DESCENT method.
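The abstract describes a Hestenes-Stiefel-type nonlinear conjugate gradient iteration paired with an Armijo backtracking line search. As a hedged illustration of that general framework only (this sketch uses the classic, unmodified HS coefficient with a simple steepest-descent safeguard, not the authors' modified formula, which appears in the paywalled full text), the scheme might look like:

```python
import numpy as np

def armijo_cg(f, grad, x0, c1=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the classic Hestenes-Stiefel beta and an Armijo
    backtracking line search.  A generic textbook sketch, NOT the
    modified beta proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gTd = g @ d
        if gTd >= 0:            # safeguard: restart if not a descent direction
            d, gTd = -g, -(g @ g)
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + c1 * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # classic Hestenes-Stiefel coefficient beta = (g_{k+1}^T y_k)/(d_k^T y_k)
        dTy = d @ y
        beta = (g_new @ y) / dTy if abs(dTy) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize the simple quadratic f(x) = ||x||^2 from (3, -4)
x_star = armijo_cg(lambda x: x @ x, lambda x: 2 * x, np.array([3.0, -4.0]))
```

The safeguarded restart is one common way to obtain the descent property that the paper instead builds into its modified beta itself.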


References

  1. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)

  2. Broyden, C.G., Dennis, J.E., Moré, J.J.: On the local and superlinear convergence of quasi-Newton methods. J. Inst. Math. Appl. 12, 223–246 (1973)

  3. Byrd, R., Nocedal, J.: A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J. Numer. Anal. 26, 727–739 (1989)

  4. Byrd, R., Nocedal, J., Yuan, Y.: Global convergence of a class of quasi-Newton methods on convex problems. SIAM J. Numer. Anal. 24, 1171–1189 (1987)

  5. Bongartz, K.E., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)

  6. Cheng, W.Y.: A two-term PRP-based descent method. Numer. Funct. Anal. Optim. 28, 1217–1230 (2007)

  7. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  8. Dai, Y.H.: Convergence properties of the BFGS algorithm. SIAM J. Optim. 13, 693–701 (2003)

  9. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (2000)

  10. Dai, Z.F., Tian, B.S.: Global convergence of some modified PRP nonlinear conjugate gradient methods. Optim. Lett. (2010). doi:10.1007/s11590-010-0224-8

  11. Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  12. Fletcher, R.: Practical Methods of Optimization, vol. I: Unconstrained Optimization. Wiley, New York (1987)

  13. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. B 49, 409–432 (1952)

  14. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21–42 (1992)

  15. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

  16. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)

  17. Liu, Y.L., Storey, C.S.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69, 129–137 (1991)

  18. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)

  19. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle 3(16), 35–43 (1969)

  20. Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)

  21. Shi, Z.J., Shen, J.: Convergence of the Polak-Ribière-Polyak conjugate gradient method. Nonlinear Anal. 66, 1428–1441 (2007)

  22. Yu, G.H., Zhao, Y.L., Wei, Z.X.: A descent nonlinear conjugate gradient method for large-scale unconstrained optimization. Appl. Math. Comput. 187, 636–643 (2007)

  23. Yu, G.H., Guan, L.T., Chen, W.: Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization. Optim. Methods Softw. 23, 275–293 (2008)

  24. Yu, G.H., Huang, J.H., Zhou, Y.: A descent spectral conjugate gradient method for impulse noise removal. Appl. Math. Lett. 23, 555–560 (2010)

  25. Yuan, Y.: Numerical Methods for Nonlinear Programming. Shanghai Scientific & Technical Publishers, Shanghai (1993)

  26. Yin, K., Xiao, Y.H., Zhang, M.L.: Nonlinear conjugate gradient method for l1-norm regularization problems in compressive sensing. J. Comput. Inf. Syst. 7, 880–885 (2011)

  27. Wen, F.H., Yang, X.G.: Skewness of return distribution and coefficient of risk premium. J. Syst. Sci. Complex. 22, 360–371 (2009)

  28. Wen, F.H., Liu, Z.F.: A copula-based correlation measure and its application in Chinese stock market. Int. J. Inf. Technol. Decis. Mak. 8, 1–15 (2009)

  29. Zhang, L., Zhou, W., Li, D.: A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26, 629–640 (2006)

  30. Zhang, L., Zhou, W., Li, D.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22, 697–711 (2007)

  31. Zoutendijk, G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)


Author information

Corresponding author: Zhifeng Dai.

Additional information

This work was supported by the NSF of China grants (11071087 and 70971013), Hunan Natural Science Foundation (09JJ1010), and the Open Fund Project of Key Research Institute of Philosophies and Social Sciences in Hunan Universities.


About this article

Cite this article

Dai, Z., Wen, F. Global convergence of a modified Hestenes-Stiefel nonlinear conjugate gradient method with Armijo line search. Numer Algor 59, 79–93 (2012). https://doi.org/10.1007/s11075-011-9477-2
