
Optimization, pp. 175–190

Conjugate Gradient and Quasi-Newton

  • Kenneth Lange
Part of the Springer Texts in Statistics book series (STS)

Abstract

Our discussion of Newton’s method has highlighted both its strengths and its weaknesses. Related algorithms such as scoring and Gauss-Newton exploit special features of the objective function f(x) to overcome the defects of Newton’s method. We now consider algorithms that apply to generic functions f(x). These algorithms also operate by locally approximating f(x) by a strictly convex quadratic function. Indeed, the guiding philosophy behind many modern optimization algorithms is to identify techniques that work well for quadratic functions and then adapt the best of them to generic functions.
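To make the quadratic case concrete, the following is a minimal sketch of the linear conjugate gradient method applied to a strictly convex quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite; minimizing f is equivalent to solving Ax = b. The function name and the test matrix below are illustrative choices, not taken from the chapter.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 x'Ax - b'x for symmetric positive
    definite A, i.e. solve Ax = b, by conjugate gradients."""
    n = len(b)
    if max_iter is None:
        max_iter = n          # exact convergence in at most n steps
    x = x0.astype(float)
    r = b - A @ x             # residual = negative gradient of f at x
    d = r.copy()              # first direction: steepest descent
    rs_old = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs_old) < tol:
            break
        Ad = A @ d
        alpha = rs_old / (d @ Ad)   # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        rs_new = r @ r
        beta = rs_new / rs_old      # makes new d A-conjugate to old ones
        d = r + beta * d
        rs_old = rs_new
    return x

# Hypothetical example: a 3 x 3 positive definite system
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = conjugate_gradient(A, b, np.zeros(3))
print(x, A @ x - b)   # the residual should be near zero
```

In exact arithmetic the iteration terminates in at most n steps for an n-dimensional quadratic; the nonlinear extensions discussed in the chapter replace the exact line search and the residual by their generic-function analogues.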

Keywords

Conjugate Gradient · Line Search · Conjugate Gradient Method · Trust Region · Positive Definite Matrix


Copyright information

© Springer Science+Business Media New York 2004

Authors and Affiliations

  • Kenneth Lange
    1. Department of Biomathematics and Human Genetics, UCLA School of Medicine, Los Angeles, USA
