A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update

Research Paper · Published in 4OR

Abstract

By minimizing the Frobenius-norm distance between the search direction matrix of the Dai–Liao method and the scaled memoryless BFGS update, and imposing Powell’s nonnegative restriction on the conjugate gradient parameter, a one-parameter class of nonlinear conjugate gradient methods is proposed. A brief global convergence analysis is then carried out, both with and without a convexity assumption on the objective function. Preliminary numerical results are reported; they demonstrate that a proper choice of the parameter of the proposed class may lead to promising numerical performance.
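To make the setting concrete, the sketch below shows a Dai–Liao-type conjugate gradient iteration with Powell’s nonnegative restriction applied to the parameter β_k. This is a minimal illustration, not the method of the paper: the adaptive choice of the Dai–Liao parameter t_k used here (the scaling s_kᵀy_k/‖s_k‖², in the spirit of the Oren–Luenberger self-scaling parameter) and the simple backtracking Armijo line search (standing in for the Wolfe conditions on which Dai–Liao-type analyses typically rely) are illustrative assumptions, as are the function names dai_liao_direction and dl_cg.

```python
import numpy as np

def dai_liao_direction(g_new, g_old, d_old, s_old):
    """One Dai--Liao-type direction update with Powell's restriction.

    g_new, g_old : gradients at x_{k+1} and x_k
    d_old        : previous search direction d_k
    s_old        : previous step s_k = x_{k+1} - x_k
    """
    y = g_new - g_old                    # gradient difference y_k
    dy = d_old @ y                       # curvature term d_k^T y_k
    if abs(dy) < 1e-12:                  # degenerate step: restart
        return -g_new
    t = (s_old @ y) / (s_old @ s_old)    # hypothetical adaptive parameter t_k
    beta = (g_new @ y - t * (g_new @ s_old)) / dy  # Dai--Liao beta_k(t)
    beta = max(beta, 0.0)                # Powell's nonnegative restriction
    return -g_new + beta * d_old

def dl_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f with the Dai--Liao-type iteration sketched above."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:                 # safeguard: keep d a descent direction
            d = -g
        alpha = 1.0                      # backtracking Armijo line search
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * d
        x_new, g_new = x + s, grad(x + s)
        d = dai_liao_direction(g_new, g, d, s)
        x, g = x_new, g_new
    return x

# Usage: a convex quadratic f(x) = 0.5 x^T A x - b^T x with minimizer A^{-1} b.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, -1.0])
x_star = dl_cg(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(2))
print(x_star, np.linalg.solve(A, b))    # the two should agree closely
```

The restart when d_kᵀy_k ≈ 0 and the descent safeguard are pragmatic guards for this sketch; convergence analyses of Dai–Liao-type methods instead obtain such properties from the line search conditions.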

References

  • Andrei N (2010) Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur J Oper Res 204(3):410–420

  • Andrei N (2011) Open problems in conjugate gradient algorithms for unconstrained optimization. Bull Malays Math Sci Soc 34(2):319–330

  • Babaie-Kafaki S, Ghanbari R (2014) The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur J Oper Res 234(3):625–630

  • Babaie-Kafaki S, Ghanbari R (2015) A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim Methods Softw 30(4):673–681

  • Babaie-Kafaki S, Ghanbari R (2015) Two optimal Dai–Liao conjugate gradient methods. Optimization 64(11):2277–2287

  • Dai YH, Liao LZ (2001) New conjugacy conditions and related nonlinear conjugate gradient methods. Appl Math Optim 43(1):87–101

  • Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2, Ser. A):201–213

  • Gould NIM, Orban D, Toint PhL (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394

  • Hager WW, Zhang H (2006) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137

  • Hager WW, Zhang H (2006) A survey of nonlinear conjugate gradient methods. Pac J Optim 2(1):35–58

  • Oren SS, Luenberger DG (1974) Self-scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms. Manag Sci 20(5):845–862

  • Powell MJD (1986) Convergence properties of algorithms for nonlinear optimization. SIAM Rev 28(4):487–500

  • Sun W, Yuan YX (2006) Optimization theory and methods: nonlinear programming. Springer, New York

Acknowledgments

This research was supported by the Research Councils of Semnan University and Ferdowsi University of Mashhad. The first author was also supported in part by grant 95813776 from the Iran National Science Foundation (INSF). The authors are grateful to Professor William W. Hager for providing the C++ code of CG_DESCENT. They also thank the anonymous reviewers and the Associate Editor for their valuable comments and suggestions, which helped improve the quality of this work.

Author information

Corresponding author

Correspondence to Saman Babaie-Kafaki.

About this article

Cite this article

Babaie-Kafaki, S., Ghanbari, R. A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR-Q J Oper Res 15, 85–92 (2017). https://doi.org/10.1007/s10288-016-0323-1
