
New properties of a nonlinear conjugate gradient method

  • Original article
  • Published in Numerische Mathematik


Abstract

This paper establishes several new properties of the nonlinear conjugate gradient method proposed in [5]. First, the method is shown to possess a certain self-adjusting property that holds independently of the line search and of the convexity of the objective function. Second, under mild assumptions on the objective function, the method is proved to be globally convergent with a variety of line searches. Third, we find that the search direction defined by the method in [5], rather than the negative gradient direction, can be used to restart any optimization method while still guaranteeing its global convergence. Some numerical results are also presented.
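For readers who want to experiment, the sketch below implements a generic nonlinear conjugate gradient iteration, assuming the method of [5] uses the Dai–Yuan update beta_k = ||g_k||^2 / (d_{k-1}^T (g_k - g_{k-1})). The backtracking Armijo line search and the steepest-descent safeguard are illustrative choices of the author of this sketch, not the line searches analyzed in the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear CG sketch with the (assumed) Dai-Yuan beta:
        beta_k = ||g_k||^2 / (d_{k-1}^T (g_k - g_{k-1})).
    Uses a simple backtracking Armijo line search as a stand-in for
    the line searches studied in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative, not from the paper).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d             # slope < 0 since d is a descent direction
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference
        denom = d @ y
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d               # new search direction
        if d @ g_new >= 0:                  # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: minimize 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On this strongly convex quadratic the iteration drives the gradient A x - b to (near) zero; the safeguard line mirrors the abstract's observation that the search direction of [5] can serve as a restart direction in place of the negative gradient.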



Additional information

Received March 12, 1999 / Revised version received April 25, 2000 / Published online February 5, 2001


Cite this article

Dai, YH. New properties of a nonlinear conjugate gradient method. Numer. Math. 89, 83–98 (2001).
