Journal of Optimization Theory and Applications, Volume 71, Issue 2, pp 399-405

Global convergence result for conjugate gradient methods


Abstract

Conjugate gradient optimization algorithms depend on the search directions,

$$s^{(1)} = -g^{(1)}, \qquad s^{(k+1)} = -g^{(k+1)} + \beta^{(k)} s^{(k)}, \quad k \geq 1,$$
with different methods arising from different choices of the scalar $\beta^{(k)}$. In this note, conditions on $\beta^{(k)}$ are given that ensure global convergence of the resulting algorithms.
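To make the recurrence concrete, here is a minimal sketch of a nonlinear conjugate gradient loop in Python. The Fletcher-Reeves and Polak-Ribiere formulas are shown as two standard choices of $\beta^{(k)}$ (the note itself treats $\beta^{(k)}$ abstractly, so these are illustrative, not the paper's conditions); the Armijo backtracking line search, function names, and tolerances are likewise assumptions for the sketch.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=1000):
    """Nonlinear conjugate gradient using the direction recurrence
    s^(1) = -g^(1),  s^(k+1) = -g^(k+1) + beta^(k) * s^(k)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                      # s^(1) = -g^(1)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ s >= 0:          # safeguard: restart if s is not a descent direction
            s = -g
        # Simple Armijo backtracking line search -- an illustrative stand-in
        # for the exact or strong-Wolfe searches assumed in convergence analyses.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * s) > f(x) + c * alpha * (g @ s):
            alpha *= rho
        x_new = x + alpha * s
        g_new = grad(x_new)
        if beta_rule == "FR":   # Fletcher-Reeves choice of beta^(k)
            beta = (g_new @ g_new) / (g @ g)
        else:                   # Polak-Ribiere choice of beta^(k)
            beta = (g_new @ (g_new - g)) / (g @ g)
        s = -g_new + beta * s   # s^(k+1) = -g^(k+1) + beta^(k) s^(k)
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = nonlinear_cg(lambda x: 0.5 * x @ (A @ x) - b @ x,
                      lambda x: A @ x - b,
                      x0=np.zeros(2))
print(x_star)  # approximately [0.2, 0.4], the solution of A x = b
```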