Conjugate Direction Methods in Optimization, pp. 81–149

# Conjugate Direction Methods

## Abstract

In the preceding pages we considered two methods for finding a minimum point of a real-valued function *f* of *n* real variables, namely, Newton’s method and the gradient method. The gradient method is easy to apply, but its convergence is often very slow. Newton’s algorithm, on the other hand, normally converges rapidly but involves considerable computation at each step. Recall that one step of Newton’s method requires the computation of the gradient *f*′(*x*) and the Hessian *f*″(*x*) of *f*, together with the solution of a linear system of equations, usually by inversion of the Hessian. It is a crucial fact that a Newton step can be accomplished instead by a sequence of *n* linear minimizations in *n* suitably chosen directions, called *conjugate directions*. This fact is the central theme in the design of an important class of minimization algorithms, called *conjugate direction algorithms*.
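The claim that *n* line minimizations along conjugate directions reproduce a Newton step can be illustrated on a quadratic *f*(*x*) = ½ *x*ᵀ*Ax* − *b*ᵀ*x* with *A* symmetric positive definite, where Newton’s method reaches the minimizer *A*⁻¹*b* in one step. The sketch below (function and variable names are illustrative, not from the text) builds *A*-conjugate directions by the classical linear conjugate gradient recurrence and performs an exact line minimization along each, arriving at the same point after *n* steps:

```python
import numpy as np

def conjugate_direction_minimize(A, b, x0):
    """Minimize f(x) = 0.5 * x^T A x - b^T x for symmetric positive
    definite A by n exact line minimizations along A-conjugate
    directions (the linear conjugate gradient recurrence)."""
    x = np.asarray(x0, dtype=float).copy()
    r = b - A @ x          # residual = -f'(x), the negative gradient
    d = r.copy()           # first direction: steepest descent
    for _ in range(len(b)):
        rr = r @ r
        if rr == 0.0:      # already at the minimizer
            break
        Ad = A @ d
        alpha = rr / (d @ Ad)        # exact minimizer of f along d
        x = x + alpha * d
        r = r - alpha * Ad           # updated negative gradient
        beta = (r @ r) / rr          # makes the next d A-conjugate
        d = r + beta * d             # to all previous directions
    return x
```

For a 2×2 positive definite system, two such line minimizations already land on the solution of *Ax* = *b*, i.e. the point a single Newton step would produce.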

### Keywords

Stein