
New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

In this paper, two new subspace minimization conjugate gradient methods based on p-regularization models are proposed, in which a special scaled norm in the p-regularization model is analyzed. Different choices of this scaled norm lead to different solutions of the p-regularized subproblem. Based on analyses of these solutions in a two-dimensional subspace, we derive new search directions satisfying the sufficient descent condition. With a modified nonmonotone line search, we establish the global convergence of the proposed methods under mild assumptions; R-linear convergence of the proposed methods is also analyzed. Numerical results on the CUTEr library show that the proposed methods are superior to four conjugate gradient methods, namely those of Hager and Zhang (SIAM J. Optim. 16(1):170–192, 2005), Dai and Kou (SIAM J. Optim. 23(1):296–320, 2013), Liu and Liu (J. Optim. Theory Appl. 180(3):879–906, 2019), and Li et al. (Comput. Appl. Math. 38(1), 2019).
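
To fix ideas, the display below sketches the general shape of such a subproblem. It is a minimal reconstruction assembled from the p-regularized subproblem of Hsia, Sheu and Yuan [45] and the two-dimensional subspace of Yuan and Stoer [47], not the authors' exact model; the symbols B_k, sigma_k and M_k are assumed notation introduced only for illustration.

% Sketch of a p-regularized subproblem restricted to a two-dimensional
% subspace, following [45] and [47]; B_k, sigma_k and the scaled norm M_k
% are assumed notation, not taken from this paper.
\[
  \min_{d \in \Omega_k} \; m_k(d)
    = g_k^{\top} d
    + \tfrac{1}{2}\, d^{\top} B_k d
    + \tfrac{\sigma_k}{p}\, \| d \|_{M_k}^{p},
  \qquad p > 2,
\]
\[
  \Omega_k = \operatorname{span}\{ g_k,\; d_{k-1} \},
  \qquad
  \| d \|_{M_k} = \sqrt{ d^{\top} M_k\, d },
  \quad M_k \succ 0,
\]

where g_k is the current gradient, d_{k-1} the previous search direction, B_k a Hessian approximation, and sigma_k > 0 the regularization parameter. Restricting d to the two-dimensional subspace Omega_k reduces the p-regularized subproblem to two scalar unknowns, which is what permits the closed-form analysis of its solutions mentioned above.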

References

  1. Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)

  2. Cristofari, A., Dehghan Niri, T., Lucidi, S.: On global minimizers of quadratic functions with cubic regularization. Optim. Lett. 13, 1269–1283 (2019)

  3. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  4. Bellavia, S., Cartis, C., Gould, N.I.M., Morini, B., Toint, P.L.: Convergence of a regularized Euclidean residual algorithm for nonlinear least-squares. SIAM J. Numer. Anal. 48, 1–29 (2010)

  5. Bellavia, S., Morini, B.: Strong local convergence properties of adaptive regularized methods for nonlinear least squares. IMA J. Numer. Anal. 35, 947–968 (2014)

  6. Benson, H.Y., Shanno, D.F.: Interior-point methods for nonconvex nonlinear programming: cubic regularization. Comput. Optim. Appl. 58, 323–346 (2014)

  7. Bianconcini, T., Liuzzi, G., Morini, B., Sciandrone, M.: On the use of iterative methods in cubic regularization for unconstrained optimization. Comput. Optim. Appl. 60, 35–57 (2015)

  8. Bianconcini, T., Sciandrone, M.: A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques. Optim. Methods Softw. 31, 1008–1035 (2016)

  9. Birgin, E.G., Gardenghi, J.L., Martínez, J.M., Santos, S.A., Toint, P.L.: Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models. Math. Program. 163, 359–368 (2017)

  10. Cartis, C., Gould, N.I.M., Toint, P.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Math. Program. 127, 245–295 (2011)

  11. Cartis, C., Gould, N.I.M., Toint, P.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity. Math. Program. 130, 295–319 (2011)

  12. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)

  13. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)

  14. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23, 296–320 (2013)

  15. Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59, 1511–1524 (2016)

  16. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  17. Dussault, J.P.: Simple unified convergence proofs for the trust-region and a new ARC variant. Tech. Rep., University of Sherbrooke, Sherbrooke, Canada (2015)

  18. Fatemi, M.: A new efficient conjugate gradient method for unconstrained optimization. J. Comput. Appl. Math. 300, 207–216 (2016)

  19. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  20. Gould, N.I.M., Orban, D.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)

  21. Gould, N.I.M., Porcelli, M.: Updating the regularization parameter in the adaptive cubic regularization algorithm. Comput. Optim. Appl. 53, 1–22 (2012)

  22. Gould, N.I.M., Robinson, D.P., Thorne, H.S.: On solving trust-region and other regularised subproblems in optimization. Math. Program. Comput. 2, 21–57 (2010)

  23. Griewank, A.: The modification of Newton’s method for unconstrained optimization by bounding cubic terms. Technical Report NA/12, Department of Applied Mathematics and Theoretical Physics, University of Cambridge (1981)

  24. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

  25. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)

  26. Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)

  27. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  28. Li, L.B.: A new algorithm for solving large scale trust region subproblem. Oper. Res. Manag. Sci. 16, 48–52 (2007)

  29. Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 79, 195–219 (2018)

  30. Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38 (2019)

  31. Liu, H.W., Liu, Z.X.: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180, 879–906 (2019)

  32. Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)

  33. Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78, 21–39 (2018)

  34. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)

  35. Necoara, I., Nesterov, Yu., Glineur, F.: Linear convergence of first order methods for non-strongly convex optimization. Math. Program., Ser. A 175, 69–107 (2018)

  36. Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton’s method and its global performance. Math. Program. 108, 177–205 (2006)

  37. Collier, N., Kaw, A., Paul, J., Keteltas, M.: Newton-Raphson Method: Graphical Simulation of the Method. University of South Florida. http://numericalmethods.eng.usf.edu/mws (2003)

  38. Nesterov, Y.: Accelerating the cubic regularization of Newton’s method on convex problems. Math. Program. 112, 159–181 (2008)

  39. Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)

  40. Pytlak, R.: Conjugate Gradient Algorithms in Nonconvex Optimization. Springer, Berlin (2009)

  41. Rivaie, M., Mamat, M., Abashar, A.: A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches. Appl. Math. Comput. 268, 1152–1163 (2015)

  42. Sun, W.Y.: On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 13, 43–63 (1996)

  43. Weiser, M., Deuflhard, P., Erdmann, B.: Affine conjugate adaptive Newton methods for nonlinear elastomechanics. Optim. Methods Softw. 22, 413–431 (2007)

  44. Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76, 813–828 (2017)

  45. Hsia, Y., Sheu, R.L., Yuan, Y.X.: Theory and application of p-regularized subproblems for p > 2. Optim. Methods Softw. 32, 1059–1077 (2017)

  46. Yuan, Y.X., Sun, W.Y.: Optimization Theory and Methods. Science Press, Beijing (1997)

  47. Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75, 69–77 (1995)

  48. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11, 325–332 (1991)

  49. Yuan, Y.X.: A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians 2014, Seoul, Korea, pp. 807–827 (2014)

  50. Yuan, Y.X.: Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim. Eng. 10, 207–218 (2009)

  51. Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)

  52. Zhou, B., Gao, L., Dai, Y.H.: Gradient methods with adaptive stepsizes. Comput. Optim. Appl. 35, 69–86 (2006)

Acknowledgments

We would like to thank the anonymous referees for their useful comments. We also thank Professors W.W. Hager and H. Zhang for their CG_DESCENT (5.3) code, and Professor Y.H. Dai and Dr. C.X. Kou for their CGOPT code.

Funding

This research is supported by the National Natural Science Foundation of China (No. 11901561), the Guangxi Natural Science Foundation (No. 2018GXNSFBA281180), and the China Postdoctoral Science Foundation (No. 2019M660833).

Author information

Corresponding author

Correspondence to Hongwei Liu.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zhao, T., Liu, H. & Liu, Z. New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer Algor 87, 1501–1534 (2021). https://doi.org/10.1007/s11075-020-01017-1

