Nonconvex minimization calculations and the conjugate gradient method

  • M. J. D. Powell
Conference paper

DOI: 10.1007/BFb0099521

Part of the Lecture Notes in Mathematics book series (LNM, volume 1066)
Cite this paper as:
Powell M.J.D. (1984) Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths D.F. (ed.) Numerical Analysis. Lecture Notes in Mathematics, vol 1066. Springer, Berlin, Heidelberg

Abstract

We consider the global convergence of conjugate gradient methods without restarts, assuming exact arithmetic and exact line searches, when the objective function is twice continuously differentiable and has bounded level sets. Most of our attention is given to the Polak-Ribière algorithm, and unfortunately we find examples showing that the calculated gradients can remain bounded away from zero. The examples that have only two variables also show that some variable metric algorithms for unconstrained optimization need not converge. However, a global convergence theorem is proved for the Fletcher-Reeves version of the conjugate gradient method.
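For reference, the two methods named in the abstract generate search directions by the same update rule and differ only in the choice of the scalar \beta_k; the formulas below are the standard textbook definitions, not quoted from the paper itself. With g_k = \nabla f(x_k) and d_1 = -g_1, the update is

    d_{k+1} = -g_{k+1} + \beta_k d_k,

where the Fletcher-Reeves method, for which the global convergence theorem is proved, takes

    \beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2},

and the Polak-Ribière method, for which the counterexamples are constructed, takes

    \beta_k^{PR} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{\|g_k\|^2}.

An exact line search, assumed throughout, means that g_{k+1}^T d_k = 0 at every iteration.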


Copyright information

© Springer-Verlag 1984

Authors and Affiliations

  • M. J. D. Powell
