
A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization

  • Original Paper
  • Journal of Optimization Theory and Applications

Abstract

Subspace minimization conjugate gradient (SMCG) methods are a class of highly efficient iterative methods for unconstrained optimization that have received increasing attention in recent years. Their search directions are generated by minimizing an approximate model, with approximation matrix \(B_k\), over the two-dimensional subspace spanned by the current gradient \(g_k\) and the latest step. The main drawback of SMCG methods is that the parameter \(g_k^TB_kg_k\) must be evaluated when computing the search direction; this parameter is crucial to SMCG methods and is difficult to determine properly. One way around this drawback is to derive SMCG methods in a manner that is independent of \(g_k^TB_kg_k\). The projection technique has been used successfully to derive conjugate gradient directions, such as the Dai–Kou direction (Dai and Kou in SIAM J Optim 23(1):296–320, 2013). Motivated by these two observations, we use a projection technique to derive a new SMCG method that is independent of \(g_k^TB_kg_k\). More specifically, we project the search direction of the memoryless quasi-Newton method onto the above two-dimensional subspace and obtain a new search direction, which is proved to be a descent direction. Remarkably, the proposed method, without any line search, enjoys the finite termination property for two-dimensional strictly convex quadratic functions, and an adaptive scaling factor in the search direction is derived from this property. The proposed method does not need to determine the parameter \(g_k^TB_kg_k\) and can be regarded as an extension of the Dai–Kou conjugate gradient method. Its global convergence is established under suitable assumptions. Numerical comparisons on 147 test functions from the CUTEst library indicate that the proposed method is very promising.
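
To see where the troublesome parameter enters, recall the Yuan–Stoer subproblem [28] that defines SMCG directions: with \(s_{k-1} = x_k - x_{k-1}\) and \(y_{k-1} = g_k - g_{k-1}\), the search direction solves

\[ \min_{d \in \operatorname{span}\{g_k,\, s_{k-1}\}} \; g_k^T d + \frac{1}{2}\, d^T B_k d. \]

Writing \(d = u\, g_k + v\, s_{k-1}\) and using the secant condition \(B_k s_{k-1} = y_{k-1}\) reduces this to the \(2\times 2\) linear system

\[ \begin{pmatrix} g_k^T B_k g_k & g_k^T y_{k-1} \\ g_k^T y_{k-1} & s_{k-1}^T y_{k-1} \end{pmatrix} \begin{pmatrix} u \\ v \end{pmatrix} = - \begin{pmatrix} \|g_k\|^2 \\ g_k^T s_{k-1} \end{pmatrix}, \]

in which every entry except \(g_k^T B_k g_k\) is available from gradients and steps alone; that single entry is what the projection approach avoids.

The following sketch illustrates the projection idea under stated assumptions: form the memoryless BFGS direction of Perry–Shanno type [17, 22] with \(H_0 = I\), then project it orthogonally onto \(\operatorname{span}\{g_k, s_{k-1}\}\). All names are illustrative, and the paper's adaptive scaling factor and descent safeguards are not reproduced here.

```python
import numpy as np

def projected_direction(g, s, y):
    """Project the memoryless BFGS direction onto span{g, s}.

    g: current gradient g_k
    s: latest step s_{k-1} = x_k - x_{k-1}
    y: gradient change y_{k-1} = g_k - g_{k-1}

    Illustrative reconstruction only, not the paper's exact formula.
    """
    sy = s @ y  # curvature s_{k-1}^T y_{k-1}; assumed positive
    # Memoryless BFGS matrix applied to g (Perry-Shanno form, H_0 = I):
    # H g = g - [s (y^T g) + y (s^T g)]/(s^T y)
    #         + (1 + y^T y / s^T y) * s (s^T g)/(s^T y)
    Hg = (g
          - (s * (y @ g) + y * (s @ g)) / sy
          + (1.0 + (y @ y) / sy) * (s @ g) / sy * s)
    d_qn = -Hg  # memoryless quasi-Newton direction
    # Orthogonal projection onto the two-dimensional subspace spanned
    # by g and s, via least squares against the basis [g, s]:
    A = np.column_stack((g, s))
    coef, *_ = np.linalg.lstsq(A, d_qn, rcond=None)
    return A @ coef
```

On a strictly convex quadratic with exact line searches, the memoryless quasi-Newton direction already lies in \(\operatorname{span}\{g_k, s_{k-1}\}\), so the projection changes nothing there; in general, the projection keeps the direction inside the two-dimensional subspace while avoiding \(g_k^T B_k g_k\) altogether.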


Data Availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.

References

  1. Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65(4), 859–874 (2014). https://doi.org/10.1007/s11075-013-9718-7

  2. Andrei, N.: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer, Berlin (2020). https://doi.org/10.1007/978-3-030-42950-8

  3. Dai, Y.H., Kou, C.X.: A Barzilai–Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016). https://doi.org/10.1007/s11425-016-0279-2

  4. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013). https://doi.org/10.1137/100813026

  5. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001). https://doi.org/10.1007/s002450010019

  6. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization. Comput. Optim. Appl. 22(1), 103–109 (2002). https://doi.org/10.1023/A:1014838419611

  7. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999). https://doi.org/10.1137/S1052623497318992

  8. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002). https://doi.org/10.1007/s101070100263

  9. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964). https://doi.org/10.1093/comjnl/7.2.149

  10. Gould, N.I.M., Orban, D., Toint, P.L.: CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60, 545–557 (2015). https://doi.org/10.1007/s10589-014-9687-3

  11. Hager, W.W., Zhang, H.C.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006). https://doi.org/10.1145/1132973.1132979

  12. Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005). https://doi.org/10.1137/030601880

  13. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952). https://doi.org/10.6028/JRES.049.044

  14. Huang, Y.K., Dai, Y.H., Liu, X.W.: Equipping Barzilai–Borwein method with two dimensional quadratic termination property. SIAM J. Optim. 31(4), 3068–3096 (2021). https://doi.org/10.1137/21M1390785

  15. Liu, H.W., Liu, Z.X.: An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180(3), 879–906 (2019). https://doi.org/10.1007/s10957-018-1393-3

  16. Liu, Z.X., Liu, H.W., Dai, Y.H.: An improved Dai–Kou conjugate gradient algorithm for unconstrained optimization. Comput. Optim. Appl. 75(1), 145–167 (2020). https://doi.org/10.1007/s10589-019-00143-4

  17. Perry, J.M.: A class of conjugate gradient algorithms with a two-step variable-metric memory. Discussion Paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University, Evanston (1977). https://EconPapers.repec.org/RePEc:nwu:cmsems:269

  18. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle 3, 35–43 (1969)

  19. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969). https://doi.org/10.1016/0041-5553(69)90035-4

  20. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis. Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984). https://doi.org/10.1007/BFb0099521

  21. Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12, 241–254 (1977). https://doi.org/10.1007/BF01593790

  22. Shanno, D.F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247–1257 (1978). https://doi.org/10.1137/0715085

  23. Sun, W.M., Liu, H.W., Liu, Z.X.: A class of accelerated subspace minimization conjugate gradient methods. J. Optim. Theory Appl. 190(3), 811–840 (2021). https://doi.org/10.1007/s10957-021-01897-w

  24. Wang, X.M., Li, C., Wang, J.H., et al.: Linear convergence of subgradient algorithm for convex feasibility on Riemannian manifolds. SIAM J. Optim. 25, 2334–2358 (2015). https://doi.org/10.1137/14099961X

  25. Wang, X.M.: Subgradient algorithms on Riemannian manifolds of lower bounded curvatures. Optimization 67, 179–194 (2018). https://doi.org/10.1080/02331934.2017.1387548

  26. Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76(3), 813–828 (2017). https://doi.org/10.1007/s11075-017-0284-2

  27. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991). https://doi.org/10.1093/imanum/11.3.325

  28. Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995). https://doi.org/10.1002/zamm.19950750118

  29. Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algorithms 87(4), 1501–1534 (2021). https://doi.org/10.1007/s11075-020-01017-1


Acknowledgements

We would like to thank the associate editor and the anonymous referees for their valuable comments and suggestions. We would also like to thank Professor Yu-Hong Dai of the Chinese Academy of Sciences for his valuable and insightful comments on this manuscript. This research is supported by the National Natural Science Foundation of China (No. 12261019) and the Guizhou Provincial Science and Technology Projects (No. QHKJC-ZK[2022]YB084).

Author information

Corresponding author

Correspondence to Yan Ni.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Communicated by Stefan Ulbrich.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liu, Z., Ni, Y., Liu, H. et al. A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization. J Optim Theory Appl 200, 820–851 (2024). https://doi.org/10.1007/s10957-023-02325-x


