Abstract
The memory-less SR1 method with generalized secant equation (MM-SR1gen) is presented and developed, together with its numerical performance on a collection of 800 unconstrained optimization problems with between 1000 and 10,000 variables. Convergence of the MM-SR1gen method is proved under the classical assumptions. Comparisons of MM-SR1gen with the memory-less SR1 method, the memory-less BFGS method, and the BFGS method in the implementation of Shanno and Phua from CONMIN show that MM-SR1gen is more efficient and more robust than these algorithms. By solving five applications from the MINPACK-2 collection, each with 40,000 variables, we obtain computational evidence that MM-SR1gen is more efficient than both memory-less SR1 and memory-less BFGS. The conclusion of this study is that the memory-less SR1 method with generalized secant equation is a fast and reliable method for solving large-scale minimization problems. Moreover, it is shown that the accuracy of the Hessian approximations along the iterations of quasi-Newton methods is not as crucial as is commonly believed.
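To make the underlying idea concrete, the following is a minimal sketch of a plain memory-less SR1 iteration, not the author's MM-SR1gen (which uses a generalized secant equation and acceleration). It assumes the standard memory-less construction: the inverse-Hessian approximation is reset to the identity at every step, so the SR1 update H⁺ = H + (s − Hy)(s − Hy)ᵀ/((s − Hy)ᵀy) reduces to I + (s − y)(s − y)ᵀ/((s − y)ᵀy), and the search direction is d = −H⁺g. The Armijo backtracking line search here is a simplification of the Wolfe line search used in the paper.

```python
import numpy as np

def memoryless_sr1_direction(g, s, y, eps=1e-8):
    """Direction -H g with H = I + (s-y)(s-y)^T / ((s-y)^T y),
    the memory-less SR1 inverse-Hessian approximation (H reset to I
    each iteration). Falls back to steepest descent when the SR1
    denominator is too small to be numerically safe."""
    u = s - y
    denom = u @ y
    if abs(denom) < eps * np.linalg.norm(u) * np.linalg.norm(y):
        return -g
    return -(g + u * (u @ g) / denom)

def minimize_mlsr1(f, grad, x0, max_iter=500, tol=1e-6):
    """Memory-less SR1 sketch with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx = 1.0, f(x)                # backtracking (Armijo) search
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d                        # step s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference y_k
        d = memoryless_sr1_direction(g_new, s, y)
        if d @ g_new >= 0:               # guard: keep d a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

For instance, on the convex quadratic f(x) = ½xᵀAx with A = diag(1,…,5), `minimize_mlsr1(f, grad, np.ones(5))` drives the gradient norm below the tolerance in a handful of iterations.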
References
Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42(1), 63–73 (2006)
Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)
Andrei, N.: Nonlinear conjugate gradient methods for unconstrained optimization. Springer Optimization and Its Applications (2020)
Aris, R.: The mathematical theory of diffusion and reaction in permeable catalysts. Clarendon Press, Oxford (1975)
Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection, Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692 (1992)
Bebernes, J., Eberly, D.: Mathematical problems from combustion theory. In: Applied Mathematical Sciences, vol. 83. Springer-Verlag (1989)
Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, Ph.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)
Cimatti, G.: On a problem of the theory of lubrication governed by a variational inequality. Appl. Math. Optim. 3, 227–242 (1977)
Conn, A.R., Gould, N.I.M., Toint, Ph.L.: Convergence of quasi-Newton matrices generated by the symmetric rank one update. Math. Program. 50(1–3), 177–195 (1991)
Dai, Y.H., Liao, L.Z.: New conjugate conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Fiacco, A.V., McCormick, G.P.: Nonlinear programming: sequential unconstrained minimization techniques. Research Analysis Corporation, McLean Virginia (1968). Republished in 1990 by SIAM, Philadelphia
Glowinski, R.: Numerical methods for nonlinear variational problems. Springer-Verlag, Berlin (1984)
Goodman, J., Kohn, R., Reyna, L.: Numerical study of a relaxed variational problem from optimal design. Comput. Methods Appl. Mech. Eng. 57, 107–127 (1986)
Kelley, C.T., Sachs, E.W.: Local convergence of the symmetric rank one iteration. Comput. Optim. Appl. 9, 43–63 (1998)
Khalfan, H.F., Byrd, R.H., Schnabel, R.B.: A theoretical and experimental study of the symmetric rank-one update. SIAM J. Optim. 3(1), 1–24 (1993)
Nitsche, J.C.C.: Lectures on minimal surfaces, vol. 1. Cambridge University Press, Cambridge (1989)
Nocedal, J.: Theory of algorithms for unconstrained optimization. Acta Numer. 1, 199–242 (1992)
Shanno, D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3, 244–256 (1978)
Shanno, D.F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247–1257 (1978)
Shanno, D.F.: CONMIN—A Fortran subroutine for minimizing an unconstrained nonlinear scalar valued function of a vector variable x either by the BFGS variable metric algorithm or by a Beale restarted conjugate gradient algorithm. Private Commun. (1983)
Shanno, D.F., Phua, K.H.: Algorithm 500. Minimization of unconstrained multivariable functions. ACM Trans. Math. Softw. 2, 87–94 (1976)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)
Wolfe, P.: Convergence conditions for ascent methods. II: some corrections. SIAM Rev. 13, 185–188 (1971)
Cite this article
Andrei, N. Accelerated memory-less SR1 method with generalized secant equation for unconstrained optimization. Calcolo 59, 16 (2022). https://doi.org/10.1007/s10092-022-00460-x
Keywords
- Symmetric rank-one (SR1)
- Quasi-Newton BFGS
- Wolfe line search
- Numerical experiments
- Secant equation
- Memory-less SR1
- Dolan and Moré performance profile