
A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization

Computational Optimization and Applications

Abstract

A Newton-like method for unconstrained minimization is introduced in the present work. While the best-known implementations may require several factorizations per iteration or rather expensive matrix decompositions, the proposed method uses a single cheap factorization per iteration. Convergence and complexity results are presented, even in the case in which the subproblems’ Hessians are far from being Hessians of the objective function. Moreover, when the Hessian is Lipschitz-continuous, the proposed method enjoys \(O(\varepsilon ^{-3/2})\) evaluation complexity for first-order optimality and \(O(\varepsilon ^{-3})\) for second-order optimality, as do other recently introduced Newton methods for unconstrained optimization based on cubic regularization or special trust-region procedures. Fairly successful and fully reproducible numerical experiments are presented, and the corresponding software is freely available.
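For a concrete picture of the kind of iteration the abstract refers to, the sketch below shows a generic regularized Newton step accepted under a cubic descent test, in the spirit of the cubic-regularization and cubic-descent approaches cited below (e.g., [6, 34]). It is only an illustrative assumption-laden approximation, not the mixed-factorization algorithm introduced in the paper; the function name, the acceptance constant alpha, and the tenfold updates of the regularization parameter sigma are choices made solely for this sketch.

```python
import numpy as np


def regularized_newton_cubic_descent(f, grad, hess, x0, eps=1e-8,
                                     alpha=1e-4, sigma0=1.0, max_iter=500):
    """Sketch of a regularized Newton iteration with a cubic descent test.

    Illustrative only: names and constants are assumptions, and this is not
    the mixed-factorization method introduced in the paper.
    """
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    sigma = sigma0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:            # approximate first-order stationarity
            break
        H = hess(x)
        fx = f(x)
        while sigma <= 1e20:
            try:
                # One Cholesky factorization per trial regularization parameter.
                L = np.linalg.cholesky(H + sigma * np.eye(n))
            except np.linalg.LinAlgError:       # H + sigma*I is not positive definite
                sigma *= 10.0
                continue
            s = -np.linalg.solve(L.T, np.linalg.solve(L, g))
            # Cubic descent test: accept the step only if it decreases f by a
            # multiple of ||s||^3, the condition behind O(eps^{-3/2}) analyses.
            if f(x + s) <= fx - alpha * np.linalg.norm(s) ** 3:
                x = x + s
                sigma = max(sigma / 10.0, 1e-12)  # relax regularization next step
                break
            sigma *= 10.0
        else:
            break                               # regularization grew too large; give up
    return x
```

A typical call would pass analytic `f`, `grad`, and `hess` callables for a smooth test function. Note that this sketch may perform several factorizations per outer iteration, whereas, as stated in the abstract, the proposed method relies on a single cheap (mixed) factorization per iteration.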

Notes

  1. With the exception of problem ARGLINC, for which we set \(f_{\mathrm{target}}\) to its known optimal value times \(1+10^{-15}\).

  2. With the exception of problem STREC, which we were unable to find in the current distribution of CUTEst.

References

  1. Anderson, E., Bai, Z., Bischof, C., Blackford, S., Demmel, J., Dongarra, J., Du Croz, J., Greenbaum, A., Hammarling, S., McKenney, A., Sorensen, D.: LAPACK Users’ Guide, 3rd edn. Society for Industrial and Applied Mathematics, Philadelphia (1999). https://doi.org/10.1137/1.9780898719604

  2. Bergou, E., Diouane, Y., Gratton, S.: On the use of the energy norm in trust-region and adaptive cubic regularization subproblems. Comput. Optim. Appl. 68, 533–554 (2017). https://doi.org/10.1007/s10589-017-9929-2

  3. Bergou, E., Diouane, Y., Gratton, S.: A line-search algorithm inspired by the adaptive cubic regularization framework, with a worst-case complexity \( O(\varepsilon ^{-3/2})\), technical report. http://www.optimization-online.org/DB-HTML/2017/06/6083.html (2017)

  4. Birgin, E.G., Gentil, J.M.: Evaluating bound-constrained minimization software. Comput. Optim. Appl. 53, 347–373 (2012). https://doi.org/10.1007/s10589-012-9466-y

  5. Birgin, E.G., Martínez, J.M.: Practical Augmented Lagrangian Methods for Constrained Optimization, Vol. 10 of Fundamentals of Algorithms. Society for Industrial and Applied Mathematics, Philadelphia (2014). https://doi.org/10.1137/1.9781611973365

  6. Birgin, E.G., Martínez, J.M.: The use of quadratic regularization with a cubic descent condition for unconstrained optimization. SIAM J. Optim. 27, 1049–1074 (2017). https://doi.org/10.1137/16M110280X

  7. Birgin, E.G., Martínez, J.M.: On regularization and active-set methods with complexity for constrained optimization. SIAM J. Optim. 28, 1367–1395 (2018). https://doi.org/10.1137/17M1127107

  8. Cartis, C., Gould, N.I.M., Toint, P.L.: On the complexity of steepest descent, Newton’s and regularized Newton’s methods for nonconvex unconstrained optimization. SIAM J. Optim. 20, 2833–2852 (2010). https://doi.org/10.1137/090774100

  9. Cartis, C., Gould, N.I.M., Toint, P.L.: Adaptive cubic regularization methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Math. Program. 127, 245–295 (2011). https://doi.org/10.1007/s10107-009-0286-5

  10. Cartis, C., Gould, N.I.M., Toint, P.L.: Adaptive cubic regularization methods for unconstrained optimization. Part II: worst-case function and derivative complexity. Math. Program. 130, 295–319 (2011). https://doi.org/10.1007/s10107-009-0337-y

  11. Cartis, C., Gould, N.I.M., Toint, P.L.: Universal regularization methods: varying the power, the smoothness and the accuracy. SIAM. J. Optim. 29, 595–615 (2019)

  12. Conn, A.R., Gould, N.I.M., Toint, P.L.: Trust Region Methods. Society for Industrial and Applied Mathematics, Philadelphia (2000). https://doi.org/10.1137/1.9780898719857

  13. Courant, R., John, F.: Introduction to Calculus and Analysis. Wiley, New York (1974)

  14. Curtis, F.E., Robinson, D.P., Samadi, M.: A trust-region algorithm with a worst-case iteration complexity of \(O(\varepsilon ^{-3/2})\). Math. Program. 162, 1–32 (2017). https://doi.org/10.1007/s10107-016-1026-2

  15. Dennis Jr., J.E., Echebest, N., Guardarucci, M.T., Martínez, J.M., Scolnik, H.D., Vacchino, C.: A curvilinear search using tridiagonal secant updates for unconstrained optimization. SIAM J. Optim. 1, 333–357 (1991). https://doi.org/10.1137/0801022

  16. Dennis Jr., J.E., Schnabel, R.B.: Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Society for Industrial and Applied Mathematics, Philadelphia (1996). https://doi.org/10.1137/1.9781611971200

  17. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002). https://doi.org/10.1007/s101070100263

  18. Dussault, J.P.: ARC\(_q\): a new adaptive regularization by cubics. Optim. Methods Softw. (2017). https://doi.org/10.1080/10556788.2017.1322080

  19. Gay, D.M.: Computing optimal locally constrained steps. SIAM J. Sci. Stat. Comput. 2, 186–197 (1981). https://doi.org/10.1137/0902016

  20. Golub, G.H., Van Loan, C.F.: Matrix Computations, 3rd edn. The Johns Hopkins University Press, Baltimore (1996)

  21. Gould, N.I.M., Nocedal, J.: The modified absolute-value factorization norm for trust-region minimization. In: De Leone, R., Murli, A., Pardalos, P.M., Toraldo, G. (eds.) High Performance Algorithms and Software in Nonlinear Optimization, pp. 225–241. Kluwer Academic Publishers, Dordrecht (1998). https://doi.org/10.1007/978-1-4613-3279-4_15

  22. Gould, N.I.M., Orban, D., Toint, P.L.: CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60, 545–557 (2014). https://doi.org/10.1007/s10589-014-9687-3

  23. Gould, N.I.M., Porcelli, M., Toint, P.L.: Updating the regularization parameter in the adaptive cubic regularization algorithm. Comput. Optim. Appl. 53, 1–22 (2012). https://doi.org/10.1007/s10589-011-9446-7

  24. Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM J. Optim. 27, 478–506 (2017). https://doi.org/10.1137/16M1087801

  25. Griewank, A.: The modification of Newton’s method for unconstrained optimization by bounding cubic terms, technical report NA/12. Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge, England (1981)

  26. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23, 707–716 (1986). https://doi.org/10.1137/0723046

  27. Higham, N.J.: Accuracy and Stability of Numerical Algorithms, 2nd edn. Society for Industrial and Applied Mathematics, Philadelphia (2002)

  28. HSL. A collection of Fortran codes for large scale scientific computation. http://www.hsl.rl.ac.uk/

  29. Karas, E.W., Santos, S.A., Svaiter, B.F.: Algebraic rules for quadratic regularization of Newton’s method. Comput. Optim. Appl. 60, 343–376 (2015). https://doi.org/10.1007/s10589-014-9671-y

  30. Lu, S., Wei, Z., Li, L.: A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization. Comput. Optim. Appl. 51, 551–573 (2012). https://doi.org/10.1007/s10589-010-9363-1

  31. Martínez, J.M.: On high-order model regularization for constrained optimization. SIAM J. Optim. 27, 2447–2458 (2017). https://doi.org/10.1137/17M1115472

  32. Martínez, J.M., Raydan, M.: Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization. J. Glob. Optim. 68, 367–385 (2017). https://doi.org/10.1007/s10898-016-0475-8

  33. Moré, J.J., Sorensen, D.C.: Computing a trust region step. SIAM J. Sci. Stat. Comput. 4, 553–572 (1983). https://doi.org/10.1137/0904038

  34. Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton’s method and its global performance. Math. Program. 108, 177–205 (2006). https://doi.org/10.1007/s10107-006-0706-8

  35. Royer, C.W., Wright, S.J.: Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization, technical report. arXiv:1706.03131v2 [math.OC] (2017)

  36. Wang, Z.-H., Yuan, Y.-X.: A subspace implementation of quasi-Newton trust region methods for unconstrained optimization. Numer. Math. 104, 241–269 (2006). https://doi.org/10.1007/s00211-006-0021-6

Acknowledgements

The authors would like to thank H. D. Scolnik for providing an updated version of the method introduced in [15]. We would also like to thank N. I. M. Gould for pointing out the work [21] and the use of subroutine MA57_get_factors in the sparse implementation of the BPK-based Mixed Factorization. Finally, the authors would like to thank the reviewers for their helpful comments.

Author information

Corresponding author

Correspondence to E. G. Birgin.

Additional information

This work was supported by FAPESP (Grants 2013/05475-7, 2013/07375-0, 2016/01860-1, and 2018/24293-0) and CNPq (Grants 309517/2014-1 and 303750/2014-6).

About this article

Cite this article

Birgin, E.G., Martínez, J.M. A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization. Comput Optim Appl 73, 707–753 (2019). https://doi.org/10.1007/s10589-019-00089-7
