A new conjugate gradient method with an efficient memory structure


Abstract

A new family of conjugate gradient methods for large-scale unconstrained optimization problems is described. The family is based on minimizing a penalty function and uses a limited memory structure to exploit the useful information gathered over the iterations. The penalty function combines the desirable properties of the linear conjugate gradient method by means of penalty parameters. We propose a suitable penalty parameter, obtained by solving an auxiliary linear optimization problem, and show that the proposed parameter is promising. The global convergence of the new method is investigated under mild assumptions. Numerical results show that the new method is efficient and confirm the effectiveness of the memory structure.
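As a rough illustration of the framework such methods build on (not the paper's memory-based scheme itself), a generic nonlinear conjugate gradient iteration with a Hestenes–Stiefel update and an Armijo backtracking line search might be sketched as follows; the function names and safeguards here are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic nonlinear conjugate gradient sketch with a
    Hestenes-Stiefel beta and an Armijo backtracking line search.
    Illustrative only -- not the memory-based method of this paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: shrink alpha until sufficient decrease holds.
        alpha, c1, rho, fx = 1.0, 1e-4, 0.5, f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference y_k
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d              # Hestenes-Stiefel direction
        if g_new.dot(d) >= 0:              # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic f(x) = 0.5 x^T A x (minimizer at 0).
A = np.diag([1.0, 10.0])
x_min = nonlinear_cg(lambda v: 0.5 * v @ A @ v, lambda v: A @ v,
                     np.array([3.0, -2.0]))
```

The limited memory structure described in the abstract would replace the single scalar update above with information retained from several previous iterations, in the spirit of L-BFGS-type methods.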






Acknowledgements

The author thanks the anonymous reviewers for their valuable comments and suggestions, which improved the quality of this work, and the Research Council of K. N. Toosi University of Technology for its support.

Author information



Corresponding author

Correspondence to Masoud Fatemi.

Additional information


Communicated by Andreas Fischer.


Cite this article

Fatemi, M. A new conjugate gradient method with an efficient memory structure. Comp. Appl. Math. 38, 59 (2019). https://doi.org/10.1007/s40314-019-0834-4



Keywords

  • Conjugate gradient method
  • Dai–Liao family
  • Limited memory

Mathematics Subject Classification

  • 90C06
  • 90C26
  • 65Y20