Abstract
In this paper, based on limited memory techniques and subspace minimization conjugate gradient (SMCG) methods, we propose a regularized limited memory subspace minimization conjugate gradient method that contains two types of iterations. In the SMCG iteration, the search direction is obtained by minimizing an approximate quadratic model or an approximate regularization model. In the RQN (regularized quasi-Newton) iteration, a modified regularized quasi-Newton method, which combines a regularization technique with the BFGS method, is used in the subspace to improve orthogonality. Moreover, simple acceleration criteria and an improved strategy for selecting the initial stepsize are designed to enhance the efficiency of the algorithm. Additionally, a generalized nonmonotone line search is utilized, and the global convergence of the proposed algorithm is established under mild conditions. Finally, numerical results show that the proposed algorithm improves significantly on ASMCG_PR and outperforms the well-known limited memory conjugate gradient software packages CG_DESCENT (6.8) and CGOPT (2.0) on the CUTEr library.
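The generalized nonmonotone line search used in the paper builds on the Zhang–Hager framework (Zhang and Hager, 2004), in which the Armijo condition is tested against a moving average of past function values rather than the latest one. The following is a minimal sketch of one step of that classic scheme, not the paper's generalized variant; all function and parameter names here are illustrative.

```python
import numpy as np

def nonmonotone_armijo(f, grad, x, d, C, Q,
                       eta=0.85, delta=1e-4, rho=0.5,
                       alpha0=1.0, max_back=50):
    """One Zhang-Hager nonmonotone Armijo backtracking step.

    f    : objective function
    grad : gradient of f at x (vector)
    d    : descent direction at x
    C, Q : moving-average reference value and its weight carried
           over from the previous iteration (C0 = f(x0), Q0 = 1)
    """
    gTd = float(grad @ d)
    assert gTd < 0, "d must be a descent direction"
    # backtrack until f(x + alpha*d) <= C + delta*alpha*g'd
    alpha = alpha0
    for _ in range(max_back):
        if f(x + alpha * d) <= C + delta * alpha * gTd:
            break
        alpha *= rho
    x_new = x + alpha * d
    # update the nonmonotone reference value as a weighted average
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f(x_new)) / Q_new
    return x_new, alpha, C_new, Q_new
```

Because the acceptance test compares against the averaged value C rather than f(x), an occasional increase in f is tolerated, which is what allows nonmonotone methods to take longer steps on ill-conditioned problems.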
Availability of supporting data
Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.
References
Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algor. 65, 859–874 (2014)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22(1), 103–109 (2002)
Dai, Y.H.: Nonlinear Conjugate Gradient Methods. Wiley Encyclopedia of Operations Research and Management Science (2011). https://doi.org/10.1002/9780470400531.eorms0183
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)
Gu, G.Z., Li, D.H., Qi, L.Q., Zhou, S.Z.: Descent directions of quasi-Newton methods for symmetric nonlinear equations. SIAM J. Numer. Anal. 40, 1763–1774 (2003)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Software 32(1), 113–137 (2006)
Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Huang, S., Wan, Z., Chen, X.H.: A new nonmonotone line search technique for unconstrained optimization. Numer. Algor. 68(4), 671–689 (2015)
Li, D.H., Fukushima, M.: A globally and superlinearly convergent Gauss-Newton-based BFGS method for symmetric nonlinear equations. SIAM J. Numer. Anal. 37, 152–172 (1999)
Li, D.H., Fukushima, M.: On the global convergence of BFGS method for nonconvex unconstrained optimization problems. SIAM J. Optim. 11(4), 1054–1064 (2001)
Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algor. 79, 195–219 (2018)
Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38(1) (2019)
Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)
Liu, T.W.: A regularized limited memory BFGS method for nonconvex unconstrained minimization. Numer. Algor. 65, 305–323 (2014)
Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)
Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)
Liu, H.W., Liu, Z.X.: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180, 879–906 (2019)
Liu, Z.X., Liu, H.W., Dai, Y.H.: An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization. Comput. Optim. Appl. 75(1), 145–167 (2020)
Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comput. 35, 773–782 (1980)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle 3(16), 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Sun, W., Liu, H., Liu, Z.: A class of accelerated subspace minimization conjugate gradient methods. J. Optim. Theory Appl. 190(3), 811–840 (2021)
Tarzangh, D.A., Peyghami, M.R.: A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems. J. Global Optim. 63, 709–728 (2015)
Tankaria, H., Sugimoto, S., Yamashita, N.: A regularized limited memory BFGS method for large-scale unconstrained optimization and its efficient implementations. Comput. Optim. Appl. 82, 61–88 (2022)
Ueda, K., Yamashita, N.: Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization. Appl. Math. Optim. 62, 27–46 (2010)
Wang, T., Liu, Z.X., Liu, H.W.: A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. Int. J. Comput. Math. 96(10), 1924–1942 (2019)
Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algor. 76, 813–828 (2017)
Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)
Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995)
Yuan, Y.X., Sun, W.Y.: Theory and Methods of Optimization. Science Press of China (1999)
Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14(4), 1043–1056 (2004)
Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algor. 87, 1501–1534 (2021)
Acknowledgements
The authors would like to thank the editor and the anonymous referees for their valuable suggestions and comments which have greatly improved the presentation of this paper.
Funding
This research was supported by the National Natural Science Foundation of China (No. 12261019, 12161053), Guizhou Provincial Science and Technology Projects (No. QHKJC-ZK[2022]YB084) and the Natural Science Basic Research Program of Shaanxi (No. 2021JM-396).
Author information
Contributions
Wumei Sun wrote the main manuscript text. Hongwei Liu and Zexian Liu reviewed and revised the manuscript.
Ethics declarations
Ethical approval
Not applicable
Competing interests
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Sun, W., Liu, H. & Liu, Z. A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization. Numer Algor 94, 1919–1948 (2023). https://doi.org/10.1007/s11075-023-01559-0
Keywords
- Limited memory
- Subspace minimization conjugate gradient method
- Orthogonality
- Regularization model
- Quasi-Newton method