Two Adaptive Dai–Liao Nonlinear Conjugate Gradient Methods

Abstract

Following recent attempts to find appropriate choices for the parameter of the nonlinear conjugate gradient method proposed by Dai and Liao, two adaptive versions of the method are proposed based on a matrix analysis and the memoryless BFGS updating formula. Under proper conditions, the methods are shown to be globally convergent. Numerical experiments on a set of CUTEr unconstrained optimization test problems demonstrate the efficiency of the proposed methods in the sense of the Dolan–Moré performance profile.
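
As background for readers without full-text access: in the Dai–Liao family, the search direction is d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), where s_k is the last step, y_k = g_{k+1} - g_k, and t >= 0 is the parameter the paper chooses adaptively. The minimal Python sketch below implements only this generic update; the paper's two adaptive choices of t are in the full text and are not reproduced here, and all names in the sketch are illustrative.

    import numpy as np

    def dai_liao_direction(g_new, g_old, d_old, s_old, t):
        """Generic Dai-Liao direction update (a background sketch; the
        paper's adaptive choices of the parameter t are not shown here)."""
        y_old = g_new - g_old          # y_k = g_{k+1} - g_k
        denom = d_old @ y_old          # d_k^T y_k, positive under Wolfe line searches
        beta = g_new @ (y_old - t * s_old) / denom
        return -g_new + beta * d_old   # d_{k+1} = -g_{k+1} + beta_k d_k

    # Illustrative use on the quadratic f(x) = 0.5 x^T A x, whose gradient is A x
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    x = np.array([1.0, 1.0])
    g = A @ x
    d = -g                             # first direction: steepest descent
    alpha = 0.1                        # placeholder step; a Wolfe line search belongs here
    x_new = x + alpha * d
    d_new = dai_liao_direction(A @ x_new, g, d, x_new - x, t=0.1)

Setting t = 0 recovers the Hestenes–Stiefel method, which is one way to see the role of the parameter.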

References

  1. Andrei N (2007) Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud Inform Control 16(4):333–352

  2. Andrei N (2011) Open problems in conjugate gradient algorithms for unconstrained optimization. Bull Malays Math Sci Soc 34(2):319–330

  3. Babaie-Kafaki S (2014) An adaptive conjugacy condition and related nonlinear conjugate gradient methods. Int J Comput Methods 11(4):1350092

  4. Babaie-Kafaki S (2016) On optimality of two adaptive choices for the parameter of Dai–Liao method. Optim Lett 10:1789–1797

  5. Babaie-Kafaki S, Ghanbari R (2014a) The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur J Oper Res 234(3):625–630

  6. Babaie-Kafaki S, Ghanbari R (2014b) A descent extension of the Polak–Ribière–Polyak conjugate gradient method. Comput Math Appl 68(12):2005–2011

  7. Babaie-Kafaki S, Ghanbari R (2014c) A descent family of Dai–Liao conjugate gradient methods. Optim Methods Softw 29(3):583–591

  8. Babaie-Kafaki S, Ghanbari R (2015) Two optimal Dai–Liao conjugate gradient methods. Optimization 64(11):2277–2287

  9. Dai YH, Han JY, Liu GH, Sun DF, Yin HX, Yuan YX (1999) Convergence properties of nonlinear conjugate gradient methods. SIAM J Optim 10(2):348–358

  10. Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320

  11. Dai YH, Liao LZ (2001) New conjugacy conditions and related nonlinear conjugate gradient methods. Appl Math Optim 43(1):87–101

  12. Dai Z, Chen X, Wen F (2015) A modified Perry’s conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations. Appl Math Comput 270(7):378–386

  13. Dai Z, Li D, Wen F (2016) Worse-case conditional value-at-risk for asymmetrically distributed asset scenarios returns. J Comput Anal Appl 20(2):237–251

  14. Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2, Ser. A):201–213

  15. Dong XL, Liu H, He Y (2015) A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J Optim Theory Appl 165(1):225–241

  16. Gould NIM, Orban D, Toint PhL (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394

  17. Hager WW, Zhang H (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16(1):170–192

  18. Hager WW, Zhang H (2006a) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137

  19. Hager WW, Zhang H (2006b) A survey of nonlinear conjugate gradient methods. Pac J Optim 2(1):35–58

  20. Perry A (1978) A modified conjugate gradient algorithm. Oper Res 26(6):1073–1078

  21. Powell MJD (1984) Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths DF (ed) Numerical analysis (Dundee, 1983) volume 1066 of Lecture Notes in Math. Springer, Berlin, pp 122–141

  22. Sun W, Yuan YX (2006) Optimization theory and methods: nonlinear programming. Springer, New York

  23. Watkins DS (2002) Fundamentals of matrix computations. Wiley, New York

  24. Wen F, He Z, Dai Z, Yang X (2014) Characteristics of investors’ risk preference for stock markets. Econ Comput Econ Cybern 48(3):235–254

  25. Xu C, Zhang JZ (2001) A survey of quasi-Newton equations and quasi-Newton methods for optimization. Ann Oper Res 103(1–4):213–234

Acknowledgements

The authors are grateful to Professor William W. Hager for providing the line search code. They also thank the anonymous reviewer for valuable suggestions that helped to improve the presentation.

Author information

Corresponding author

Correspondence to Saman Babaie-Kafaki.

About this article

Cite this article

Babaie-Kafaki, S., Ghanbari, R. Two Adaptive Dai–Liao Nonlinear Conjugate Gradient Methods. Iran J Sci Technol Trans Sci 42, 1505–1509 (2018). https://doi.org/10.1007/s40995-017-0271-4

Keywords

  • Unconstrained optimization
  • Conjugate gradient method
  • BFGS update
  • Line search
  • Global convergence

Mathematics Subject Classification

  • 90C53
  • 49M37
  • 65K05