Abstract
A new family of conjugate gradient methods for large-scale unconstrained optimization problems is described. It is based on minimizing a penalty function and uses a limited-memory structure to exploit useful information gathered over previous iterations. Our penalty function combines desirable properties of the linear conjugate gradient method through a set of penalty parameters. We propose a suitable penalty parameter by solving an auxiliary linear optimization problem and show that the proposed parameter is promising. The global convergence of the new method is investigated under mild assumptions. Numerical results show that the new method is efficient and confirm the effectiveness of the memory structure.
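The abstract does not give the update formulas of the proposed method. For orientation only, the sketch below shows a generic nonlinear conjugate gradient loop with a Polak–Ribière-type parameter and Armijo backtracking; it is not the penalty-based, limited-memory method of the paper, and the objective `f`, gradient `grad_f`, and line search used here are illustrative assumptions.

```python
# Minimal sketch of a generic nonlinear conjugate gradient loop.
# NOTE: this is NOT the penalty-based, limited-memory method of the paper;
# the objective, gradient, and backtracking line search are assumptions
# chosen only to make the example self-contained.
import numpy as np

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking line search along a descent direction d."""
    while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
        alpha *= rho
    return alpha

def nonlinear_cg(f, grad_f, x0, tol=1e-6, max_iter=1000):
    x = x0.copy()
    g = grad_f(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha = backtracking(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad_f(x_new)
        # Polak-Ribiere parameter, truncated at zero (acts as an automatic restart)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Usage: minimize the simple quadratic f(x) = 0.5 * ||x||^2
    f = lambda x: 0.5 * x.dot(x)
    grad_f = lambda x: x
    x_star = nonlinear_cg(f, grad_f, np.ones(100))
    print(np.linalg.norm(x_star))  # close to 0 at the minimizer
```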
Acknowledgements
The author thanks the anonymous reviewers for their valuable comments and suggestions, which led to an improvement in the quality of this work, and the Research Council of K. N. Toosi University of Technology for supporting this work.
Additional information
Communicated by Andreas Fischer.
Cite this article
Fatemi, M. A new conjugate gradient method with an efficient memory structure. Comp. Appl. Math. 38, 59 (2019). https://doi.org/10.1007/s40314-019-0834-4