Conjugate Direction Methods

  • Magnus Rudolph Hestenes
Chapter
Part of the Applications of Mathematics book series (SMAP, volume 12)

Abstract

In the preceding pages we considered two methods for finding a minimum point of a real-valued function f of n real variables, namely, Newton’s method and the gradient method. The gradient method is easy to apply, but its convergence is often very slow. Newton’s algorithm, on the other hand, normally converges rapidly but requires considerable computation at each step. Recall that one step of Newton’s method involves computing the gradient f′(x) and the Hessian f″(x) of f and solving a linear system of equations, usually by inverting the Hessian f″(x). It is a crucial fact that a Newton step can instead be accomplished by a sequence of n linear minimizations in n suitably chosen directions, called conjugate directions. This fact is the central theme in the design of an important class of minimization algorithms, called conjugate direction algorithms.
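
The claim above, that one Newton step can be replaced by n exact line minimizations along suitably chosen directions, is easiest to verify on a quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite, whose Newton point is x* = A⁻¹b. The Python sketch below is an illustration under these assumptions, not code from the chapter; it uses the conjugate gradient method to build A-conjugate directions from successive gradients, so that at most n exact line searches arrive at x*.

```python
# Minimal sketch: minimizing f(x) = 0.5 x^T A x - b^T x (A symmetric
# positive definite) by n line minimizations along A-conjugate directions.
# Illustrative only; names and tolerances are not from the chapter.
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Conjugate gradient method: the directions p_k are mutually
    A-conjugate, so at most n exact line searches reach the Newton
    point A^{-1} b."""
    x = np.asarray(x0, dtype=float).copy()
    r = b - A @ x                 # residual = -f'(x)
    p = r.copy()                  # first direction: steepest descent
    for _ in range(len(b)):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact minimizer of f along p
        x += alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)  # keeps p_{k+1} A-conjugate to p_k
        p = r_new + beta * p
        r = r_new
    return x

# For n = 2, two line searches recover the Newton point A^{-1} b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b, np.zeros(2)))  # ≈ np.linalg.solve(A, b)
```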

Keywords

Conjugate Gradient · Quadratic Function · Minimum Point · Conjugate Direction · Polar Line

Copyright information

© Springer-Verlag New York Inc. 1980

Authors and Affiliations

  • Magnus Rudolph Hestenes
  1. Department of Mathematics, University of California, Los Angeles, USA