
Two limited-memory optimization methods with minimum violation of the previous secant conditions

Published in: Computational Optimization and Applications

Abstract

Limited-memory variable metric methods based on the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) update are widely used for large-scale optimization. The block version of this update, derived for general objective functions in Vlček and Lukšan (Numer. Algorithms, 2019), satisfies the secant conditions with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in a certain sense; however, the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the secant conditions as little as possible, two methods based on the block BFGS update are proposed. They can be used advantageously together with methods based on vector corrections for conjugacy. Here we combine two types of these corrections to satisfy the secant conditions with both the corrected and the uncorrected (original) latest difference vectors. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
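As background for the secant conditions discussed in the abstract, the classical single-pair inverse BFGS update can be written as a rank-two modification that satisfies the secant condition exactly. The NumPy sketch below is illustrative standard material only, not the paper's block update; the function name and the test data are ours:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Standard inverse BFGS update of the approximation H.

    The returned matrix H_new satisfies the secant condition
    H_new @ y == s; the block update studied in the paper extends
    this property to several stored difference-vector pairs at once.
    """
    rho = 1.0 / (y @ s)                      # requires curvature y @ s > 0
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

rng = np.random.default_rng(0)
n = 5
H = np.eye(n)                                # initial inverse-Hessian guess
s = rng.standard_normal(n)                   # step difference x_{k+1} - x_k
y = s + 0.1 * rng.standard_normal(n)         # gradient difference, y @ s > 0
H_new = bfgs_inverse_update(H, s, y)
print(np.allclose(H_new @ y, s))             # secant condition holds
```

A limited-memory method stores only the recent pairs (s, y) and reconstructs the action of such updates implicitly; the methods proposed here additionally control how far older secant conditions may be violated.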


References

  1. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)

  2. Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21, 123–160 (1995)

  3. Byrd, R.H., Nocedal, J., Schnabel, R.B.: Representations of quasi-Newton matrices and their use in limited memory methods. Math. Prog. 63, 129–156 (1994)

  4. Dennis, J.E., Jr., Schnabel, R.B.: Least change secant updates for quasi-Newton methods. SIAM Rev. 21, 443–459 (1979)

  5. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Prog. 91, 201–213 (2002)

  6. Fiedler, M.: Special Matrices and Their Applications in Numerical Mathematics, 2nd edn. Dover Publications, Mineola (2008)

  7. Fletcher, R.: Practical Methods of Optimization. Wiley, Chichester (1987)

  8. Hager, W.W., Zhang, H.: Algorithm 851: CG-DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)

  9. Hu, Y.F., Storey, C.: Motivating quasi-Newton updates by preconditioned conjugate gradient methods. Math. Report A 150, Department of Mathematical Sciences, Loughborough University of Technology, England (1991)

  10. Li, D., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)

  11. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Prog. 45, 503–528 (1989)

  12. Lukšan, L., Matonoha, C., Vlček, J.: Algorithm 896: LSA - algorithms for large-scale optimization. ACM Trans. Math. Softw. 36, 16:1–16:29 (2009)

  13. Lukšan, L., Matonoha, C., Vlček, J.: Sparse test problems for unconstrained optimization. Report V-1064, ICS AS CR, Prague (2010). http://hdl.handle.net/11104/0181697

  14. Lukšan, L., Matonoha, C., Vlček, J.: Modified CUTE problems for sparse unconstrained optimization. Report V-1081, ICS AS CR, Prague (2010). http://hdl.handle.net/11104/0189238

  15. Lukšan, L., Spedicato, E.: Variable metric methods for unconstrained optimization and nonlinear least squares. J. Comput. Appl. Math. 124, 61–95 (2000)

  16. Lukšan, L., Tůma, M., Matonoha, C., Vlček, J., Ramešová, N., Šiška, M., Hartman, J.: UFO 2017. Interactive system for universal functional optimization. Report V-1252, ICS AS CR, Prague (2017). http://www.cs.cas.cz/luksan/ufo.pdf

  17. Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comp. 35, 773–782 (1980)

  18. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)

  19. Schnabel, R.B.: Quasi-Newton methods using multiple secant equations. Technical Report CU-CS-247-83, Department of Computer Science, University of Colorado at Boulder, USA (1983)

  20. Vlček, J., Lukšan, L.: A conjugate directions approach to improve the limited-memory BFGS method. Appl. Math. Comput. 219, 800–809 (2012)

  21. Vlček, J., Lukšan, L.: A modified limited-memory BNS method for unconstrained minimization based on the conjugate directions idea. Optim. Methods Softw. 30, 616–633 (2015)

  22. Vlček, J., Lukšan, L.: Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization. Numer. Algorithms 80, 957–987 (2019)

  23. Vlček, J., Lukšan, L.: A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions. J. Comput. Appl. Math. 351, 14–28 (2019)

  24. Yuan, G., Sheng, Z., Wang, B., Hu, W., Li, C.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
Acknowledgements

We thank the two anonymous referees for careful reading of the paper and for constructive suggestions.

Funding

Supported by the Institute of Computer Science of the Czech Academy of Sciences (RVO: 67985807).

Author information

Corresponding author

Correspondence to Jan Vlček.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Vlček, J., Lukšan, L. Two limited-memory optimization methods with minimum violation of the previous secant conditions. Comput Optim Appl 80, 755–780 (2021). https://doi.org/10.1007/s10589-021-00318-y
