A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

In this paper, based on limited memory techniques and subspace minimization conjugate gradient (SMCG) methods, a regularized limited memory subspace minimization conjugate gradient method is proposed, which contains two types of iterations. In the SMCG iteration, the search direction is obtained by minimizing an approximate quadratic model or an approximate regularization model over a low-dimensional subspace. In the RQN (regularized quasi-Newton) iteration, a modified regularized quasi-Newton method, which combines a regularization technique with the BFGS method, is used in the subspace to improve the orthogonality. Moreover, some simple acceleration criteria and an improved strategy for selecting the initial stepsize are designed to enhance the efficiency of the algorithm. Additionally, a generalized nonmonotone line search is utilized, and the global convergence of the proposed algorithm is established under mild conditions. Finally, numerical results on the CUTEr library show that the proposed algorithm improves significantly on ASMCG_PR and outperforms the well-known limited memory conjugate gradient software packages CG_DESCENT (6.8) and CGOPT (2.0).
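
Two ingredients of the abstract are concrete enough to sketch: the subspace minimization direction and the nonmonotone line search. In SMCG methods in the tradition of Yuan and Stoer [39], the quadratic model m(d) = g_k^T d + (1/2) d^T B_k d is minimized over the two-dimensional subspace span{g_k, s_{k-1}}, where s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}. The secant condition B_k s_{k-1} = y_{k-1} supplies two of the three curvature terms exactly; the third is estimated below with a Barzilai-Borwein scaling [2]. The following Python sketch is a minimal illustration under those assumptions, not the authors' implementation; all names are hypothetical.

```python
import numpy as np

def smcg_direction(g, s, y, tol=1e-12):
    """Minimize g^T d + 0.5 d^T B d over d = mu*g + nu*s (a sketch, not the
    paper's exact rule). The secant condition B s = y gives s^T B s = s^T y
    and g^T B s = g^T y exactly; g^T B g is estimated with the
    Barzilai-Borwein scaling rho = (y^T y)/(s^T y)."""
    sy = s @ y
    if sy <= tol * np.linalg.norm(s) * np.linalg.norm(y):
        return -g                      # curvature unusable: steepest descent
    rho = (y @ y) / sy                 # BB estimate of curvature along g
    # Stationarity of the 2D model in the coefficients c = (mu, nu) gives
    # the 2x2 linear system A c = -b, with A the curvature matrix and
    # b the linear (gradient) terms.
    A = np.array([[rho * (g @ g), g @ y],
                  [g @ y,         sy]])
    b = np.array([g @ g, g @ s])
    try:
        mu, nu = np.linalg.solve(A, -b)
    except np.linalg.LinAlgError:
        return -g                      # degenerate subspace: fall back
    return mu * g + nu * s
```

The generalized nonmonotone line search builds on the Zhang-Hager technique [41], in which the Armijo reference value f(x_k) is replaced by a weighted average C_k of past function values, so that occasional increases in f are tolerated. A hedged sketch, with a plain backtracking loop standing in for the paper's generalized acceptance rule:

```python
def nonmonotone_search(f, x, g, d, C, Q, eta=0.85, delta=1e-4, alpha=1.0):
    """Zhang-Hager-style nonmonotone Armijo search [41] (parameter names and
    the halving rule are illustrative): accept the first alpha with
    f(x + alpha d) <= C + delta * alpha * g^T d, then update the running
    average C and its weight Q."""
    gd = g @ d                         # directional derivative; must be < 0
    fx_new = f(x + alpha * d)
    while fx_new > C + delta * alpha * gd:
        alpha *= 0.5                   # simple backtracking
        fx_new = f(x + alpha * d)
    Q_new = eta * Q + 1.0              # Q_{k+1} = eta*Q_k + 1
    C_new = (eta * Q * C + fx_new) / Q_new
    return alpha, fx_new, C_new, Q_new
```

Initializing with C_0 = f(x_0) and Q_0 = 1, the parameter eta interpolates between the monotone Armijo rule (eta = 0) and comparison against the average of all past function values (eta = 1).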

Availability of supporting data

Data sharing is not applicable to this article, as no datasets were generated or analyzed during the current study.

References

  1. Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algor. 65, 859–874 (2014)

  2. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  3. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22(1), 103–109 (2002)

  4. Dai, Y.H.: Nonlinear conjugate gradient methods. Wiley Encyclopedia of Operations Research and Management Science (2011). https://doi.org/10.1002/9780470400531.eorms0183

  5. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  6. Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)

  7. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  8. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  9. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  10. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)

  11. Gu, G.Z., Li, D.H., Qi, L.Q., Zhou, S.Z.: Descent directions of quasi-Newton methods for symmetric nonlinear equations. SIAM J. Numer. Anal. 40, 1763–1774 (2003)

  12. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  13. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  14. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)

  15. Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)

  16. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  17. Huang, S., Wan, Z., Chen, X.H.: A new nonmonotone line search technique for unconstrained optimization. Numer. Algor. 68(4), 671–689 (2015)

  18. Li, D.H., Fukushima, M.: A globally and superlinearly convergent Gauss-Newton-based BFGS method for symmetric nonlinear equations. SIAM J. Numer. Anal. 37, 152–172 (1999)

  19. Li, D.H., Fukushima, M.: On the global convergence of the BFGS method for nonconvex unconstrained optimization problems. SIAM J. Optim. 11(4), 1054–1064 (2001)

  20. Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algor. 79, 195–219 (2018)

  21. Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38(1) (2019)

  22. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)

  23. Liu, T.W.: A regularized limited memory BFGS method for nonconvex unconstrained minimization. Numer. Algor. 65, 305–323 (2014)

  24. Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algor. 78(1), 21–39 (2018)

  25. Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)

  26. Liu, H.W., Liu, Z.X.: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180, 879–906 (2019)

  27. Liu, Z.X., Liu, H.W., Dai, Y.H.: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization. Comput. Optim. Appl. 75(1), 145–167 (2020)

  28. Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comput. 35, 773–782 (1980)

  29. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)

  30. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle 3(16), 35–43 (1969)

  31. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  32. Sun, W., Liu, H., Liu, Z.: A class of accelerated subspace minimization conjugate gradient methods. J. Optim. Theory Appl. 190(3), 811–840 (2021)

  33. Tarzangh, D.A., Peyghami, M.R.: A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems. J. Global Optim. 63, 709–728 (2015)

  34. Tankaria, H., Sugimoto, S., Yamashita, N.: A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations. Comput. Optim. Appl. 82, 61–88 (2022)

  35. Ueda, K., Yamashita, N.: Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization. Appl. Math. Optim. 62, 27–46 (2010)

  36. Wang, T., Liu, Z.X., Liu, H.W.: A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. Int. J. Comput. Math. 96(10), 1924–1942 (2019)

  37. Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algor. 76, 813–828 (2017)

  38. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)

  39. Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995)

  40. Yuan, Y.X., Sun, W.Y.: Theory and Methods of Optimization. Science Press of China (1999)

  41. Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14(4), 1043–1056 (2004)

  42. Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algor. 87, 1501–1534 (2021)

Acknowledgements

The authors would like to thank the editor and the anonymous referees for their valuable suggestions and comments which have greatly improved the presentation of this paper.

Funding

This research was supported by the National Natural Science Foundation of China (Nos. 12261019 and 12161053), the Guizhou Provincial Science and Technology Projects (No. QHKJC-ZK[2022]YB084), and the Natural Science Basic Research Program of Shaanxi (No. 2021JM-396).

Author information

Contributions

Wumei Sun wrote the main manuscript text. Hongwei Liu and Zexian Liu reviewed and revised the manuscript.

Corresponding author

Correspondence to Hongwei Liu.

Ethics declarations

Ethical approval

Not applicable

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Sun, W., Liu, H. & Liu, Z.: A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization. Numer. Algor. 94, 1919–1948 (2023). https://doi.org/10.1007/s11075-023-01559-0
