4OR – A Quarterly Journal of Operations Research, Volume 15, Issue 1, pp 85–92

A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update

Research Paper

DOI: 10.1007/s10288-016-0323-1

Cite this article as:
Babaie-Kafaki, S. & Ghanbari, R. 4OR-Q J Oper Res (2017) 15: 85. doi:10.1007/s10288-016-0323-1


Abstract

By minimizing the distance, in the Frobenius norm, between the search direction matrix of the Dai–Liao method and the scaled memoryless BFGS update, and applying Powell's nonnegative restriction of the conjugate gradient parameter, a one-parameter class of nonlinear conjugate gradient methods is proposed. A brief global convergence analysis is then conducted, both with and without a convexity assumption on the objective function. Preliminary numerical results are reported; they demonstrate that a proper choice of the parameter of the proposed class may lead to promising numerical performance.
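To make the method described in the abstract concrete, the following is a minimal sketch of a Dai–Liao conjugate gradient iteration with Powell's nonnegative restriction applied to the conjugate gradient parameter. The fixed value of the Dai–Liao parameter `t` and the Armijo backtracking line search are illustrative assumptions for this sketch; the paper's contribution is precisely an adaptive, BFGS-based choice of this parameter, which is not reproduced here.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Sketch of a Dai-Liao CG method with Powell's nonnegative
    restriction. The fixed parameter t and the backtracking line
    search are assumptions, not the paper's adaptive choice."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (an assumption for the sketch).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dy = d @ y
        if abs(dy) < 1e-16:
            d = -g_new  # restart with steepest descent
        else:
            # Dai-Liao parameter with Powell's restriction: the
            # Hestenes-Stiefel term is truncated at zero.
            beta = max(g_new @ y / dy, 0.0) - t * (g_new @ s) / dy
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x'Ax/2 - b'x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))
```

For t = 0 the parameter reduces to the (truncated) Hestenes–Stiefel choice; the class studied in the paper corresponds to parameter values derived from the scaled memoryless BFGS update.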


Keywords

Unconstrained optimization · Conjugate gradient method · Scaled memoryless BFGS update · Frobenius norm · Global convergence

Mathematics Subject Classification

90C53 · 49M37 · 65K05

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. Department of Mathematics, Faculty of Mathematics, Statistics and Computer Science, Semnan University, Semnan, Iran
  2. Faculty of Mathematical Sciences, Ferdowsi University of Mashhad, Mashhad, Iran
