
A modified conjugate gradient method for general convex functions

  • Original Paper
  • Numerical Algorithms

Abstract

This paper introduces a new non-smooth conjugate gradient method for solving unconstrained minimization problems. The proposed method belongs to a descent family of Dai-Liao conjugate gradient methods and is suitable for minimizing convex non-smooth objective functions. Under mild assumptions, the method is globally convergent, and all of its search directions satisfy the sufficient descent condition. Comparative numerical results indicate that the new algorithm is efficient and outperforms some existing algorithms in this field.
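The full text is not shown on this page, so the paper's specific modification is not reproduced here. For orientation only, the sketch below implements the classical Dai-Liao iteration that the abstract's method family builds on: d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The function name, the backtracking line search, and the test problem are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Minimal sketch of the classical Dai-Liao conjugate gradient iteration.

    f and grad evaluate the objective and its gradient. For a non-smooth
    convex objective, grad would be replaced by a subgradient oracle or by
    the gradient of a smoothed surrogate (the setting the paper addresses).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Crude backtracking line search; a real implementation would use a
        # Wolfe-type search, which also helps keep d_k^T y_k > 0.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * d                        # s_k = x_{k+1} - x_k
        g_new = grad(x + s)
        y = g_new - g                        # y_k = g_{k+1} - g_k
        denom = d @ y
        # Dai-Liao parameter: beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d                # next search direction
        x, g = x + s, g_new
    return x

# Example: an ill-conditioned convex quadratic with minimizer at the origin.
A = np.diag([1.0, 10.0, 100.0])
x_min = dai_liao_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                    x0=[2.0, 2.0, 2.0])
print(x_min)  # approaches the zero vector
```

The sufficient descent condition mentioned in the abstract requires g_{k+1}^T d_{k+1} <= -c ||g_{k+1}||^2 for some fixed c > 0; the plain update above does not enforce it, which is exactly the kind of gap the descent Dai-Liao variants are designed to close.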


Data availability

The data and code that support the findings of this study are available from the corresponding author upon request.


Acknowledgements

The authors thank the Research Council of K.N. Toosi University of Technology for supporting this work. We would also like to thank the anonymous referees for their valuable comments.

Author information


Corresponding author

Correspondence to Masoud Fatemi.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Abdollahi, F., Fatemi, M. A modified conjugate gradient method for general convex functions. Numer Algor 92, 1485–1502 (2023). https://doi.org/10.1007/s11075-022-01349-0
