Conjugate gradient versus steepest descent

  • Technical Note
  • Published: 1976

Journal of Optimization Theory and Applications

Abstract

It is known that the conjugate-gradient algorithm is at least as good as the steepest-descent algorithm for minimizing quadratic functions. It is shown here that the conjugate-gradient algorithm is actually superior: in the generic case, it yields a lower cost at each iteration than the steepest-descent algorithm does, when both start at the same point.
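
The claim can be checked numerically. The NumPy sketch below is illustrative and not taken from the paper: it minimizes a quadratic f(x) = (1/2)xᵀAx − bᵀx by exact-line-search steepest descent and by the standard conjugate-gradient recursion, both from the same starting point; the matrix A, the vector b, the dimension, and the iteration count are arbitrary choices made for the example. With exact line searches the two methods produce the same first iterate (both move along the negative gradient), so the strict cost gap shows up from the second iteration onward.

```python
# Illustrative sketch (not from the paper): compare exact-line-search steepest
# descent with conjugate gradient on f(x) = 0.5*x'Ax - b'x, with A symmetric
# positive definite. A, b, x0, and the iteration count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n = 10
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)            # symmetric positive definite
b = rng.standard_normal(n)
x0 = np.zeros(n)

def f(x):
    return 0.5 * x @ A @ x - b @ x

def steepest_descent(x, iters):
    costs = [f(x)]
    for _ in range(iters):
        g = A @ x - b                  # gradient of f
        if np.sqrt(g @ g) < 1e-12:     # already at the minimizer
            break
        alpha = (g @ g) / (g @ A @ g)  # exact minimizer of f along -g
        x = x - alpha * g
        costs.append(f(x))
    return costs

def conjugate_gradient(x, iters):
    costs = [f(x)]
    r = b - A @ x                      # residual = negative gradient
    p = r.copy()                       # first search direction
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)     # exact line search along p
        x = x + alpha * p
        costs.append(f(x))
        r_new = r - alpha * Ap
        if np.sqrt(r_new @ r_new) < 1e-12:  # converged; avoid 0/0 in beta
            break
        beta = (r_new @ r_new) / (r @ r)    # standard linear-CG beta
        p = r_new + beta * p
        r = r_new
    return costs

for k, (c_sd, c_cg) in enumerate(zip(steepest_descent(x0, 8),
                                     conjugate_gradient(x0, 8))):
    print(f"iter {k}: SD cost = {c_sd:12.6f}   CG cost = {c_cg:12.6f}")
```

On a typical run the two costs agree at iterations 0 and 1, and the conjugate-gradient cost is strictly lower at every later iteration, consistent with the generic-case comparison the note establishes.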

Author information

J. C. Allwright

Additional information

Communicated by D. Q. Mayne

Thanks are due to Professor R. W. Sargent, Imperial College, London, England, for suggestions concerning presentation.

About this article

Cite this article

Allwright, J.C. Conjugate gradient versus steepest descent. J Optim Theory Appl 20, 129–134 (1976). https://doi.org/10.1007/BF00933351
