A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

The memory-less SR1 and memory-less BFGS methods are presented, together with their numerical performance on a set of 800 unconstrained optimization problems with the number of variables in the range [1000, 10,000]. By memory-less quasi-Newton methods we mean quasi-Newton methods in which the approximation is reinitialized to the identity matrix at every iteration. In these algorithms the stepsize is computed by a line search satisfying the Wolfe conditions. The convergence of the memory-less SR1 method is proved under the classical assumptions. A comparison between the memory-less SR1 and memory-less BFGS methods shows that memory-less BFGS is more efficient and more robust than memory-less SR1. A comparison between memory-less SR1 and the BFGS method from CONMIN, in the implementation of Shanno and Phua, shows that the memory-less SR1 method is more efficient and more robust than the BFGS method from CONMIN, one of the best implementations of BFGS. Additionally, a comparison of memory-less SR1 and memory-less BFGS against steepest descent shows that both memory-less algorithms are more efficient and more robust. The performance of these algorithms on five applications from the MINPACK-2 collection, each with 40,000 variables, is also presented. For these applications, memory-less BFGS is more efficient than memory-less SR1. It seems that the accuracy of the Hessian approximations along the iterations of quasi-Newton methods is not as crucial as is commonly believed.
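
As a concrete illustration (a minimal sketch based on the standard SR1 and BFGS update formulas, not the authors' implementation): when the inverse-Hessian approximation is reset to the identity matrix at every iteration, a single SR1 or BFGS update involves only the most recent step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k, so the search direction d = -H g can be assembled from a few inner products without forming or storing any n-by-n matrix. The safeguard thresholds below are illustrative assumptions; the algorithms in the paper combine such directions with a Wolfe line search and further details given in the full text.

    import numpy as np

    def memoryless_bfgs_direction(g, s, y, eps=1e-12):
        # d = -H g, where H is the BFGS update of the identity matrix:
        # H = I - (s y' + y s')/(y's) + (1 + y'y/(y's)) s s'/(y's)
        ys = y @ s
        if abs(ys) < eps:          # illustrative safeguard, not from the paper
            return -g              # fall back to steepest descent
        sg, yg = s @ g, y @ g
        return -(g - (s * yg + y * sg) / ys
                   + (1.0 + (y @ y) / ys) * (sg / ys) * s)

    def memoryless_sr1_direction(g, s, y, eps=1e-12):
        # d = -H g, where H is the SR1 update of the identity matrix:
        # H = I + (s - y)(s - y)'/((s - y)'y)
        u = s - y
        uy = u @ y
        if abs(uy) < eps:          # illustrative safeguard against a tiny denominator
            return -g
        return -(g + ((u @ g) / uy) * u)

With g, s, and y stored as NumPy arrays of length n, either function returns a direction in O(n) time and memory; memory-less BFGS yields a descent direction whenever y's > 0, which the Wolfe conditions ensure, while SR1 requires extra safeguards since its update need not remain positive definite.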

References

  1. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)

  2. Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13, 185–188 (1971)

  3. Broyden, C.G.: Quasi-Newton methods and their applications to function minimization. Math. Comput. 21(99), 368–381 (1967)

  4. Davidon, W.C.: Variable metric method for minimization. Research and Development Report ANL-5990, Argonne National Laboratory (1959)

  5. Fiacco, A.V., McCormick, G.P.: Nonlinear Programming: Sequential Unconstrained Minimization Techniques. Research Analysis Corporation, McLean, Virginia. Republished in 1990 by SIAM, Philadelphia (1968)

  6. Davidon, W.C.: Variable metric method for minimization. SIAM J. Optim. 1(1), 1–17 (1991)

  7. Fletcher, R., Powell, M.J.D.: A rapidly convergent descent method for minimization. Comput. J., 163–168 (1963)

  8. Powell, M.J.D.: A new algorithm for unconstrained optimization. In: Rosen, J.B., Mangasarian, O.L., Ritter, K. (eds.) Nonlinear Programming, pp. 31–66. Academic Press, New York (1970)

  9. Broyden, C.G.: The convergence of a class of double-rank minimization algorithms. I. General considerations. J. Inst. Math. Appl. 6, 76–90 (1970)

  10. Fletcher, R.: A new approach to variable metric algorithms. Comput. J. 13, 317–322 (1970)

  11. Goldfarb, D.: A family of variable metric methods derived by variational means. Math. Comput. 24, 23–26 (1970)

  12. Shanno, D.F.: Conditioning of quasi-Newton methods for function minimization. Math. Comput. 24, 647–656 (1970)

  13. Huang, H.Y.: Unified approach to quadratically convergent algorithms for function minimization. J. Optim. Theory Appl. 5(6), 405–423 (1970)

  14. Andrei, N.: Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update. J. Comput. Appl. Math. 325, 149–164 (2017)

  15. Babaie-Kafaki, S.: A modified scaling parameter for the memory-less BFGS updating formula. Numer. Algorithms 72, 425–433 (2016)

  16. Leong, W.J., Hassan, M.A.: Scaled memory-less symmetric rank one method for large-scale optimization. Appl. Math. Comput. 218, 413–418 (2011)

  17. Modarres, F., Hassan, M.A., Leong, W.J.: Memory-less modified symmetric rank one method for large-scale unconstrained optimization. Am. J. Appl. Sci. 6, 2054–2059 (2009)

  18. Moyi, A.U., Leong, W.J.: A sufficient descent three-term conjugate gradient method via symmetric rank-one update for large-scale optimization. Optimization 65, 121–143 (2016)

  19. Nakayama, S., Narushima, Y., Yabe, H.: A memoryless symmetric rank-one method with sufficient descent property for unconstrained optimization. J. Oper. Res. Soc. Jpn. 61, 53–70 (2018)

  20. Nakayama, S., Narushima, Y., Yabe, H.: Memory-less quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization. J. Ind. Manag. Optim. 15, 1773–1793 (2019)

  21. Yao, S., Ning, L.: An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix. J. Comput. Appl. Math. 332, 72–85 (2018)

  22. Shanno, D.F.: CONMIN – a Fortran subroutine for minimizing an unconstrained nonlinear scalar valued function of a vector variable x either by the BFGS variable metric algorithm or by a Beale restarted conjugate gradient algorithm. Private communication, October 17 (1983)

  23. Shanno, D.F., Phua, K.H.: Algorithm 500. Minimization of unconstrained multivariable functions. ACM Trans. Math. Softw. 2, 87–94 (1976)

  24. Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection. Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692 (1992)

  25. Conn, A.R., Gould, N.I.M., Toint, P.L.: Convergence of quasi-Newton matrices generated by the symmetric rank one update. Math. Program. 50(1–3), 177–195 (1991)

  26. Khalfan, H.F., Byrd, R.H., Schnabel, R.B.: A theoretical and experimental study of the symmetric rank-one update. SIAM J. Optim. 3(1), 1–24 (1993)

  27. Kelley, C.T., Sachs, E.W.: Local convergence of the symmetric rank one iteration. Comput. Optim. Appl. 9, 43–63 (1998)

  28. Benson, H.Y., Shanno, D.F.: Cubic regularization in symmetric rank-1 quasi-Newton methods. Math. Program. Comput. 10, 457–486 (2018)

  29. Chen, H., Lam, W.H., Chan, S.C.: On the convergence analysis of cubic regularized symmetric rank-1 quasi-Newton method and the incremental version in the application of large-scale problems. IEEE Access 7, 114042–114059 (2019)

  30. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)

  31. Nocedal, J.: Theory of algorithms for unconstrained optimization. Acta Numer. 1, 199–242 (1992)

  32. Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research. Springer Science+Business Media, New York (2006)

  33. Shanno, D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3, 244–256 (1978)

  34. Shanno, D.F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247–1257 (1978)

  35. Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42(1), 63–73 (2006)

  36. Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)

  37. Andrei, N.: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer Optimization and Its Applications, vol. 158. Springer (2020)

  38. Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)

  39. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  40. Glowinski, R.: Numerical Methods for Nonlinear Variational Problems. Springer-Verlag, Berlin (1984)

  41. Cimatti, G.: On a problem of the theory of lubrication governed by a variational inequality. Appl. Math. Optim. 3, 227–242 (1977)

  42. Goodman, J., Kohn, R., Reyna, L.: Numerical study of a relaxed variational problem from optimal design. Comput. Methods Appl. Mech. Eng. 57, 107–127 (1986)

  43. Aris, R.: The Mathematical Theory of Diffusion and Reaction in Permeable Catalysts. Oxford (1975)

  44. Bebernes, J., Eberly, D.: Mathematical Problems from Combustion Theory. Applied Mathematical Sciences, vol. 83. Springer-Verlag (1989)

  45. Nitsche, J.C.C.: Lectures on Minimal Surfaces, Vol. 1. Cambridge University Press (1989)

Author information

Corresponding author

Correspondence to Neculai Andrei.

About this article

Cite this article

Andrei, N. A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization. Numer Algor 90, 223–240 (2022). https://doi.org/10.1007/s11075-021-01185-8
