
Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence

  • Original Paper

Abstract

In this paper, by slightly modifying the search direction of the MPRP method, we propose a variant PRP conjugate gradient method that satisfies the sufficient descent condition independently of any line search. We also propose a general form of the conjugate gradient parameter \(\beta _k\), and the corresponding method always generates a sufficient descent direction regardless of the line search employed. We establish the global convergence of our methods without assuming that the steplength is bounded away from zero. Numerical results illustrate that our methods solve the test problems efficiently and are therefore promising.
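
For context, the quantities involved are standard in the conjugate gradient literature. With gradient \(g_k=\nabla f(x_k)\), \(y_{k-1}=g_k-g_{k-1}\) and previous direction \(d_{k-1}\), the PRP parameter and the three-term MPRP direction of Zhang et al. [9] read

\[
\beta _k^{PRP}=\frac{g_k^{\top } y_{k-1}}{\Vert g_{k-1}\Vert ^2},\qquad d_k=-g_k+\beta _k^{PRP} d_{k-1}-\frac{g_k^{\top } d_{k-1}}{\Vert g_{k-1}\Vert ^2}\, y_{k-1},
\]

which gives \(g_k^{\top } d_k=-\Vert g_k\Vert ^2\) for every \(k\), i.e. the sufficient descent condition \(g_k^{\top } d_k\le -c\Vert g_k\Vert ^2\) holds with \(c=1\) independently of the line search. This display only restates the standard definitions for the reader's orientation; the modified direction and the general parameter proposed in the paper are given in the full text.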


References

  1. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)


  2. Liu, Y.L., Storey, C.S.: Efficient generalized conjugate gradient algorithms, Part 1: Theory. J. Optim. Theory Appl. 69(1), 129–137 (1991)


  3. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)


  4. Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)


  5. Fletcher, R.: Practical methods of optimization. Wiley (2013)

  6. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)


  7. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)


  8. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)


  9. Zhang, L., Zhou, W.J., Li, D.H.: A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)


  10. Cheng, W.Y., Li, D.H.: An active set modified Polak-Ribière-Polyak method for large-scale nonlinear bound constrained optimization. J. Optim. Theory Appl. 155(3), 1084–1094 (2012)


  11. Narushima, Y., Yabe, H., Ford, J.A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21(1), 212–230 (2011)


  12. Andrei, N.: Another conjugate gradient algorithm with guaranteed descent and the conjugacy conditions for large-scale unconstrained optimization. J. Optim. Theory Appl. 159(3), 159–182 (2013)


  13. Nakamura, W., Narushima, Y., Yabe, H.: Nonlinear conjugate gradient methods with sufficient descent properties for unconstrained optimization. J. Ind. Manag. Optim. 9, 595–619 (2013)


  14. Yu, G., Guan, L., Li, G.: Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property. J. Ind. Manag. Optim. 4, 565–579 (2008)


  15. Zhang, L., Li, J.: A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization. Appl. Math. Comput. 217, 10295–10304 (2011)


  16. Dong, X., Liu, H., He, Y., Yang, X.: A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. (2015). doi:10.1016/j.cam.2014.11.058

  17. Dong, X., Liu, H., He, Y.: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. (2015). doi:10.1007/s10957-014-0601-z

  18. Yuan, G.L.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Lett. 3(1), 11–21 (2009)


  19. Dai, Z.F., Tian, B.S.: Global convergence of some modified PRP nonlinear conjugate gradient methods. Optim. Lett. 5(4), 615–630 (2011)


  20. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)


  21. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)


  22. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)



Acknowledgments

We would like to thank Professors W. W. Hager and H. Zhang for their CG_DESCENT code, which we used for numerical comparison. This work was supported by the Natural Science Foundation of China (11361001), the Doctoral Foundation of Beifang University of Nationalities (2014XBZ09), and the Fundamental Research Funds for the Central Universities (K50513100007).

Author information

Corresponding author

Correspondence to Xiao Liang Dong.


About this article


Cite this article

Dong, X.L., Liu, H., Xu, Y.L. et al. Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence. Optim Lett 9, 1421–1432 (2015). https://doi.org/10.1007/s11590-014-0836-5
