A Gradient-Based Globalization Strategy for the Newton Method
The Newton method is one of the most powerful methods for solving smooth unconstrained optimization problems. It converges locally at a quadratic rate in a neighborhood of a local minimum where the Hessian is positive definite and Lipschitz continuous. Several strategies have been proposed to achieve global convergence, mainly based either on modifying the Hessian together with a line search or on adopting a restricted-step strategy. We propose a globalization technique that combines the Newton and gradient directions, producing a descent direction along which a backtracking Armijo line search is performed. Our work is motivated by the effectiveness of gradient methods that use suitable spectral step-length selection rules. We prove global convergence of the resulting algorithm and a quadratic rate of convergence under suitable second-order optimality conditions. A numerical comparison with a Newton method globalized via Hessian modifications shows the effectiveness of our approach.
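The abstract does not specify how the Newton and gradient directions are combined; the following is a minimal sketch of the general idea, assuming a simple safeguard: take the Newton step when it is a sufficient descent direction, and otherwise blend it with the negative gradient before applying a backtracking Armijo line search. The blending weight and the descent test are illustrative choices, not the paper's actual rule.

```python
import numpy as np

def armijo_backtrack(f, x, d, g, alpha0=1.0, beta=0.5, sigma=1e-4):
    """Backtracking line search: shrink alpha until the Armijo
    sufficient-decrease condition f(x + a*d) <= f(x) + sigma*a*g'd holds."""
    alpha = alpha0
    fx = f(x)
    while f(x + alpha * d) > fx + sigma * alpha * (g @ d):
        alpha *= beta
    return alpha

def hybrid_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Globalized Newton iteration (illustrative sketch).

    If the Newton direction fails a descent test (or the Hessian is
    singular), it is blended with the negative gradient so that the
    search direction is guaranteed to be a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        try:
            d = np.linalg.solve(hess(x), -g)  # Newton direction
        except np.linalg.LinAlgError:
            d = -g                            # fall back to steepest descent
        # Descent safeguard: require g'd sufficiently negative,
        # otherwise mix in the negative gradient (hypothetical weight 1/2).
        if g @ d > -1e-8 * np.linalg.norm(g) * np.linalg.norm(d):
            d = 0.5 * d - 0.5 * g
        alpha = armijo_backtrack(f, x, d, g)
        x = x + alpha * d
    return x
```

On a strictly convex quadratic such as `f(x) = (x[0]-1)**2 + 10*(x[1]+2)**2`, the unit Newton step satisfies the Armijo condition immediately and the iteration reaches the minimizer `(1, -2)` in one step.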
Keywords: Newton method · Gradient method · Global convergence