Abstract
Limited-memory variable metric methods based on the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) update are widely used for large-scale optimization. The block version of this update, derived for general objective functions in Vlček and Lukšan (Numerical Algorithms 2019), satisfies the secant conditions with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in some sense; however, the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the secant conditions as little as possible in some sense, two methods based on the block BFGS update are proposed. They can be used to advantage together with methods based on vector corrections for conjugacy. Here we combine two types of these corrections to satisfy the secant conditions with both the corrected and uncorrected (original) latest difference vectors. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
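As background for the methods discussed in the abstract, the following is a minimal sketch of the classical limited-memory BFGS two-loop recursion (Nocedal 1980), not the paper's block update: it computes the direction d = -Hg from the stored difference pairs s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, where H satisfies the secant condition H y = s only for the latest pair. The function name and the initial-scaling choice are illustrative assumptions.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: return d = -H g, where H is the L-BFGS
    approximation of the inverse Hessian built from the stored pairs
    (s_k, y_k). Assumes s_k . y_k > 0 for every stored pair."""
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # backward pass over the stored pairs, newest first
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # initial scaling H0 = (s'y / y'y) I, a common heuristic choice
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)
    # forward pass, oldest first
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q
```

Since H is symmetric positive definite whenever every s_k . y_k > 0, the resulting d satisfies g . d < 0, the descent property that the proposed block-based methods are designed to preserve while satisfying the secant conditions with all stored difference vectors as closely as possible.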
References
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Opt. 10, 147–161 (2008)
Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21, 123–160 (1995)
Byrd, R.H., Nocedal, J., Schnabel, R.B.: Representation of quasi-Newton matrices and their use in limited memory methods. Math. Prog. 63, 129–156 (1994)
Dennis, J.E., Jr., Schnabel, R.B.: Least change secant updates for quasi-Newton methods. SIAM Rev. 21, 443–459 (1979)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Prog. 91, 201–213 (2002)
Fiedler, M.: Special matrices and their applications in numerical mathematics, 2nd edn. Dover Publications, Mineola (2008)
Fletcher, R.: Practical Methods of Optimization. Wiley, Chichester (1987)
Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)
Hu, Y.F., Storey, C.: Motivating Quasi-Newton Updates by Preconditioned Conjugate Gradient Methods, Math. Report A 150, Department of Mathematical Sciences, Loughborough University of Technology, England (1991)
Li, D., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)
Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Prog. 45, 503–528 (1989)
Lukšan, L., Matonoha, C., Vlček, J.: Algorithm 896: LSA - Algorithms for Large-Scale Optimization, ACM Trans. Math. Softw. 36, 16:1-16:29 (2009)
Lukšan, L., Matonoha, C., Vlček, J.: Sparse Test Problems for Unconstrained Optimization, Report V-1064. ICS AS CR, Prague (2010). (http://hdl.handle.net/11104/0181697)
Lukšan, L., Matonoha, C., Vlček, J.: Modified CUTE Problems for Sparse Unconstrained Optimization, Report V-1081. ICS AS CR, Prague (2010). (http://hdl.handle.net/11104/0189238)
Lukšan, L., Spedicato, E.: Variable metric methods for unconstrained optimization and nonlinear least squares. J. Comput. Appl. Math. 124, 61–95 (2000)
Lukšan, L., Tůma, M., Matonoha, C., Vlček, J., Ramešová, N., Šiška, M., Hartman, J.: UFO 2017. Interactive System for Universal Functional Optimization, Report V-1252, ICS AS CR, Prague (2017). http://www.cs.cas.cz/luksan/ufo.pdf
Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comp. 35, 773–782 (1980)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)
Schnabel, R.B.: Quasi-Newton Methods Using Multiple Secant Equations, Technical Report CU-CS-247-83. University of Colorado at Boulder, USA, Department of Computer Science (1983)
Vlček, J., Lukšan, L.: A conjugate directions approach to improve the limited-memory BFGS method. Appl. Math. Comput. 219, 800–809 (2012)
Vlček, J., Lukšan, L.: A modified limited-memory BNS method for unconstrained minimization based on conjugate directions idea. Optim. Meth. Softw. 30, 616–633 (2015)
Vlček, J., Lukšan, L.: Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization. Num. Alg. 80, 957–987 (2019)
Vlček, J., Lukšan, L.: A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions. J. Comput. Appl. Math. 351, 14–28 (2019)
Yuan, G., Sheng, Z., Wang, B., Hu, W., Li, Ch.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
Acknowledgements
We thank the two anonymous referees for careful reading of the paper and for constructive suggestions.
Funding
Supported by the Institute of Computer Science of the Czech Academy of Sciences (RVO: 67985807).
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Vlček, J., Lukšan, L. Two limited-memory optimization methods with minimum violation of the previous secant conditions. Comput Optim Appl 80, 755–780 (2021). https://doi.org/10.1007/s10589-021-00318-y
Keywords
- Unconstrained minimization
- Variable metric methods
- Limited-memory methods
- Variationally derived methods
- Global convergence
- Numerical results