Abstract
Our discussion of Newton’s method has highlighted both its strengths and its weaknesses. Related algorithms such as scoring and Gauss-Newton exploit special features of the objective function f(x) to overcome the defects of Newton’s method. We now consider algorithms that apply to generic functions f(x). These algorithms also operate by locally approximating f(x) by a strictly convex quadratic function. Indeed, the guiding philosophy behind many modern optimization algorithms is to identify techniques that work well on quadratic functions and then adapt the best of them to generic functions.
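The quadratic-model philosophy described above can be illustrated with the conjugate gradient method, one of the two algorithms named in the chapter title. The sketch below is a minimal illustration (my own, not Lange's presentation): on a strictly convex quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite, conjugate gradient with exact line searches reaches the minimizer, the solution of Ax = b, in at most n steps.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A, i.e. solve A x = b, by the conjugate gradient method."""
    x = x0.astype(float)
    r = b - A @ x        # residual = negative gradient of f at x
    p = r.copy()         # first search direction: steepest descent
    rs_old = r @ r
    for _ in range(len(b)):          # at most n iterations on a quadratic
        Ap = A @ p
        alpha = rs_old / (p @ Ap)    # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # keep directions A-conjugate
        rs_old = rs_new
    return x

# A 2x2 strictly convex quadratic: CG finishes in at most 2 steps.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.zeros(2))
```

On generic functions f(x), the same recursion is applied to a local quadratic model, which is the modification program the abstract alludes to.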
Copyright information
© 2004 Springer Science+Business Media New York
About this chapter
Cite this chapter
Lange, K. (2004). Conjugate Gradient and Quasi-Newton. In: Optimization. Springer Texts in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4757-4182-7_9
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4419-1910-6
Online ISBN: 978-1-4757-4182-7