
Optimization, pp. 155-173

Newton’s Method

  • Kenneth Lange
Part of the Springer Texts in Statistics book series (STS)

Abstract

The MM and EM algorithms are hardly the only methods of optimization. Newton’s method is better known and more widely applied. Despite its defects, Newton’s method is the gold standard for speed of convergence and forms the basis of most modern optimization algorithms. Its many variants seek to retain its fast convergence while taming its defects. They all revolve around the core idea of locally approximating the objective function by a strictly convex quadratic function. At each iteration the quadratic approximation is optimized. Safeguards are introduced to keep the iterates from veering toward irrelevant stationary points.
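To make the core idea concrete, here is a minimal Python sketch of a safeguarded Newton iteration in the spirit the abstract describes: the Hessian is shifted toward positive definiteness so the local quadratic model is strictly convex, and a backtracking line search keeps each step a descent step. The function names, tolerances, and the Rosenbrock test problem are illustrative assumptions, not taken from the chapter.

import numpy as np

def safeguarded_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Minimize f with a Newton iteration guarded by (1) a Hessian shift that
    forces the local quadratic model to be strictly convex and (2) a
    backtracking line search that enforces descent."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # Safeguard 1: add tau * I until the shifted Hessian admits a
        # Cholesky factorization, i.e. is positive definite.
        tau = 0.0
        while True:
            try:
                L = np.linalg.cholesky(H + tau * np.eye(x.size))
                break
            except np.linalg.LinAlgError:
                tau = max(2.0 * tau, 1e-6)
        # Newton direction: solve (H + tau I) d = -g using the factor L.
        d = -np.linalg.solve(L.T, np.linalg.solve(L, g))
        # Safeguard 2: halve the step until the Armijo decrease condition holds.
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Illustrative use on the Rosenbrock function, whose Hessian is indefinite
# far from the minimizer at (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * x[1] + 1200 * x[0]**2, -400 * x[0]],
                           [-400 * x[0], 200.0]])
print(safeguarded_newton(f, grad, hess, [-1.2, 1.0]))  # converges toward [1., 1.]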

Keywords

Positive Semidefinite · Exponential Family · Positive Definite Matrix · Descent Direction · Quadratic Rate


Copyright information

© Springer Science+Business Media New York 2004

Authors and Affiliations

  • Kenneth Lange
  1. Department of Biomathematics and Human Genetics, UCLA School of Medicine, Los Angeles, USA
