Mathematical Programming, Volume 27, Issue 2, pp 155-175

QN-like variable storage conjugate gradients

Abstract

Both conjugate gradient and quasi-Newton methods are quite successful at minimizing smooth nonlinear functions of several variables, and each has its advantages. In particular, conjugate gradient methods require much less storage to implement than a quasi-Newton code and therefore find application when storage is limited. They are, however, slower, so there have recently been attempts to combine CG and QN algorithms so as to obtain an algorithm with good convergence properties and low storage requirements. One such method is the code CONMIN due to Shanno and Phua; it has proven quite successful, but it has one limitation: it has no middle ground. It either operates as a quasi-Newton code using O(n²) storage locations or as a conjugate gradient code using 7n locations, so it cannot take advantage of the not unusual situation where more than 7n locations are available but a quasi-Newton code would require an excessive amount of storage.
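
To make the storage gap concrete, take n = 10 000 variables (an illustrative figure, not one taken from the paper). A quasi-Newton code that stores a symmetric n × n approximation to the Hessian needs roughly

    n(n+1)/2 ≈ 5 × 10^7

locations, whereas a conjugate gradient code of the CONMIN type needs only 7n = 70 000; a storage budget of, say, 10n or 20n locations falls squarely in the gap between the two.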

In this paper we present a way of looking at conjugate gradient algorithms which was in fact given by Shanno and Phua but which we carry further, emphasize and clarify. This applies in particular to Beale's 3-term recurrence relation. Using this point of view, we develop a new combined CG-QN algorithm which can use whatever storage is available; CONMIN occurs as a special case. We present numerical results to demonstrate that the new algorithm is never worse than CONMIN and that it is almost always better if even a small amount of extra storage is provided.
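
As a rough illustration of how a method can adapt to whatever storage is available, the sketch below computes a search direction from m stored update pairs using the standard limited-memory BFGS two-loop recursion; keeping the pairs costs about 2mn locations, so m can be chosen to fit the memory budget. This is only a hedged sketch of the general variable-storage idea under those assumptions, not the algorithm of this paper or of CONMIN, and the function and variable names (lbfgs_direction, s_list, y_list) are hypothetical.

    import numpy as np

    def lbfgs_direction(grad, s_list, y_list):
        """Return a descent direction -H*grad, where H is a limited-memory
        BFGS inverse-Hessian approximation built from the stored pairs
        s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k (most recent last)."""
        q = grad.copy()
        alphas = []
        rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
        # First loop: run from the newest pair back to the oldest.
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * np.dot(s, q)
            alphas.append(alpha)
            q -= alpha * y
        # Initial scaling H0 = gamma * I, a common heuristic choice.
        if s_list:
            gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
        else:
            gamma = 1.0
        r = gamma * q
        # Second loop: run from the oldest pair forward to the newest.
        for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
            beta = rho * np.dot(y, r)
            r += (alpha - beta) * s
        return -r

    # Maintaining the memory after a step from x to x_new with gradients g, g_new:
    #   s_list.append(x_new - x); y_list.append(g_new - g)
    #   if len(s_list) > m: s_list.pop(0); y_list.pop(0)   # keep only the m newest pairs

With m = 0 the recursion reduces to steepest descent, and larger m uses correspondingly more of the available storage to build a better direction.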

The authors wish to express their appreciation for the support of the Natural Sciences and Engineering Research Council through Operating Grant A8962 (Buckley) and of the National Research Council of Canada through a Postgraduate Scholarship (LeNir).