Abstract
The Newton method is one of the most powerful methods for the solution of smooth unconstrained optimization problems. It converges locally with quadratic rate in a neighborhood of a local minimum where the Hessian is positive definite and Lipschitz continuous. Several strategies have been proposed to achieve global convergence; they are mainly based either on modifying the Hessian, combined with a line search, or on adopting a restricted-step (trust-region) strategy. We propose a globalization technique that combines the Newton and gradient directions to produce a descent direction, along which a backtracking Armijo line search is performed. Our work is motivated by the effectiveness of gradient methods that use suitable spectral step-length selection rules. We prove global convergence of the resulting algorithm, as well as a quadratic rate of convergence under suitable second-order optimality conditions. A numerical comparison with a Newton method globalized via Hessian modification shows the effectiveness of our approach.
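To make the idea concrete, below is a minimal Python/NumPy sketch of one iteration of a gradient-based globalization of the Newton method. It is not the authors' exact scheme: the blending rule used to recover a descent direction, the fallback when the Newton system cannot be solved, and the parameter values (tau, c1, rho, eps) are illustrative assumptions.

```python
import numpy as np

def globalized_newton_step(f, grad, hess, x, tau=0.5, c1=1e-4, rho=0.5, eps=1e-8):
    """One iteration: Newton direction, gradient-based safeguard, Armijo backtracking.

    All parameters are illustrative choices, not values from the paper."""
    g = grad(x)
    H = hess(x)
    try:
        d = np.linalg.solve(H, -g)   # Newton direction: solve H d = -g
    except np.linalg.LinAlgError:
        d = -g                        # singular Hessian: fall back to steepest descent
    # Safeguard (illustrative blending rule): if d is not a sufficient descent
    # direction, mix in the negative gradient. Repeated blending drives d
    # toward -g, so this loop terminates whenever g is nonzero.
    while g @ d > -eps * np.linalg.norm(g) * np.linalg.norm(d):
        d = tau * d - (1.0 - tau) * g
    # Backtracking Armijo line search along d.
    alpha, fx, slope = 1.0, f(x), g @ d
    while f(x + alpha * d) > fx + c1 * alpha * slope:
        alpha *= rho
    return x + alpha * d

# Example: minimize the Rosenbrock function from the standard starting point.
f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                           200.0 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                           [-400.0 * x[0], 200.0]])
x = np.array([-1.2, 1.0])
for _ in range(50):
    x = globalized_newton_step(f, grad, hess, x)
print(x)  # converges to the minimizer, approximately [1, 1]
```

Note the design rationale: near a minimizer where the Hessian is positive definite, the pure Newton direction already passes the descent test and the unit step is accepted by the Armijo rule, so the fast local convergence of Newton's method is preserved while the gradient safeguard provides descent far from the solution.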
This work was partially supported by Gruppo Nazionale per il Calcolo Scientifico - Istituto Nazionale di Alta Matematica (GNCS-INdAM). Marco Viola was also supported by the MOD_CELL_DEV Project - Programma di finanziamento della Ricerca di Ateneo, University of Naples Federico II.
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
di Serafino, D., Toraldo, G., Viola, M. (2020). A Gradient-Based Globalization Strategy for the Newton Method. In: Sergeyev, Y., Kvasov, D. (eds) Numerical Computations: Theory and Algorithms. NUMTA 2019. Lecture Notes in Computer Science, vol. 11973. Springer, Cham. https://doi.org/10.1007/978-3-030-39081-5_16
DOI: https://doi.org/10.1007/978-3-030-39081-5_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-39080-8
Online ISBN: 978-3-030-39081-5