Lecture Notes in Mathematics, Volume 1066, 1984, pp. 122-141

Nonconvex minimization calculations and the conjugate gradient method

Abstract

We consider the global convergence of conjugate gradient methods without restarts, assuming exact arithmetic and exact line searches, when the objective function is twice continuously differentiable and has bounded level sets. Most of our attention is given to the Polak-Ribière algorithm, and unfortunately we find examples showing that the calculated gradients can remain bounded away from zero. The examples that have only two variables also show that some variable metric algorithms for unconstrained optimization need not converge. However, a global convergence theorem is proved for the Fletcher-Reeves version of the conjugate gradient method.
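
For context, both methods generate search directions by d_{k+1} = -g_{k+1} + beta_k d_k, where g_k is the gradient at the k-th iterate; the variants differ only in the choice beta_k = (g_{k+1}^T g_{k+1}) / (g_k^T g_k) for Fletcher-Reeves versus beta_k = g_{k+1}^T (g_{k+1} - g_k) / (g_k^T g_k) for Polak-Ribière. The sketch below is not from the paper: it is a minimal Python implementation of both variants, with all names illustrative, in which scipy's one-dimensional minimizer stands in for the exact line searches assumed in the analysis.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def conjugate_gradient(f, grad, x0, variant="PR", max_iter=500, tol=1e-8):
        """Nonlinear conjugate gradient without restarts (illustrative sketch).

        No safeguard such as beta = max(beta, 0) is applied, so the
        Polak-Ribiere branch matches the pure "without restarts" setting
        studied in the paper.
        """
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Numerical stand-in for the exact line search assumed in the
            # analysis: minimize f along the ray x + alpha * d.
            alpha = minimize_scalar(lambda a: f(x + a * d)).x
            x_new = x + alpha * d
            g_new = grad(x_new)
            if variant == "FR":   # Fletcher-Reeves
                beta = (g_new @ g_new) / (g @ g)
            else:                 # Polak-Ribiere
                beta = g_new @ (g_new - g) / (g @ g)
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Illustrative usage: the Rosenbrock function in two variables.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    x_star = conjugate_gradient(f, grad, [-1.2, 1.0], variant="FR")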