
A new conjugate gradient method with an efficient memory structure

Published in: Computational and Applied Mathematics

Abstract

A new family of conjugate gradient methods for large-scale unconstrained optimization problems is described. It is based on minimizing a penalty function and uses a limited-memory structure to exploit the useful information provided by previous iterations. Our penalty function combines the good properties of the linear conjugate gradient method by means of some penalty parameters. We propose a suitable penalty parameter, obtained by solving an auxiliary linear optimization problem, and show that the proposed parameter is promising. The global convergence of the new method is investigated under mild assumptions. Numerical results show that the new method is efficient and confirm the effectiveness of the memory structure.
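To fix ideas, the following is a minimal, illustrative sketch of a generic nonlinear conjugate gradient loop of the kind the abstract describes. It is not the paper's method: the penalty-based search direction and its optimal parameter are not reproduced here. Instead, a standard Polak–Ribière-plus update with a backtracking Armijo line search stands in for the direction update (the paper uses Wolfe-type conditions), and a bounded buffer of recent step/gradient-difference pairs stands in for the limited-memory structure; the names `cg_minimize`, `mem`, and the buffer itself are assumptions of this sketch.

```python
import numpy as np
from collections import deque

def cg_minimize(f, grad, x0, mem=5, tol=1e-6, max_iter=500):
    """Generic nonlinear conjugate gradient loop (illustrative sketch only).

    Uses a Polak-Ribiere-plus beta and keeps a bounded deque of recent
    (s_k, y_k) pairs as a stand-in for a limited-memory structure.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    memory = deque(maxlen=mem)      # recent (s_k, y_k) pairs, bounded size
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (simplification of Wolfe conditions)
        alpha, c1 = 1.0, 1e-4
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        memory.append((s, y))       # limited-memory buffer of iteration data
        beta = max(0.0, (g_new @ y) / (g @ g))  # Polak-Ribiere-plus formula
        d = -g_new + beta * d
        if g_new @ d >= 0:          # restart if direction is not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_minimize(f, grad, np.zeros(2))
```

On a convex quadratic this loop recovers the solution of `A x = b`; for linear conjugate gradients with exact line searches, convergence would occur in at most `n` steps, which is the classical behavior the paper's penalty function is designed to retain in the nonlinear setting.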



Acknowledgements

The author thanks the anonymous reviewers for their valuable comments and suggestions leading to an improvement in the quality of this work, and the Research Council of K. N. Toosi University of Technology for supporting this work.


Corresponding author

Correspondence to Masoud Fatemi.

Additional information

Communicated by Andreas Fischer.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Fatemi, M. A new conjugate gradient method with an efficient memory structure. Comp. Appl. Math. 38, 59 (2019). https://doi.org/10.1007/s40314-019-0834-4

