Abstract
The aim of this paper is to introduce a new non-smooth conjugate gradient method for solving unconstrained minimization problems. The proposed method belongs to a descent family of Dai–Liao conjugate gradient methods and is suitable for minimizing convex non-smooth objective functions. Under mild assumptions, the proposed method is globally convergent, and all of its search directions satisfy the sufficient descent condition. Comparative numerical results indicate that the new algorithm is efficient and outperforms some existing algorithms in this field.
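To make the setting concrete: methods in the Dai–Liao family update the search direction as d_{k+1} = −g_{k+1} + β_k d_k with β_k = (g_{k+1}ᵀy_k − t·g_{k+1}ᵀs_k)/(d_kᵀy_k), where s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. The sketch below is a generic illustration of that family on a smooth quadratic, not the authors' specific method: the backtracking line search, the restart safeguard, and the parameter t = 0.1 are illustrative assumptions, and a non-smooth variant would replace the gradient by, for example, the gradient of a Moreau–Yosida regularization of the objective.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Generic Dai-Liao conjugate gradient iteration (illustrative sketch).

    beta_k = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. A simple Armijo
    backtracking search stands in for the Wolfe-type line searches
    typically used in the CG literature.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: halve the step until sufficient decrease holds.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d.dot(y)
        beta = 0.0 if abs(denom) < 1e-16 else (g_new.dot(y) - t * g_new.dot(s)) / denom
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:
            # Safeguard: restart with steepest descent if d is not a descent direction.
            d = -g_new
        x, g = x_new, g_new
    return x

# Convex quadratic test problem with unique minimizer at (1, -2).
f = lambda x: 0.5 * x[0]**2 + x[1]**2 - x[0] + 4 * x[1]
grad = lambda x: np.array([x[0] - 1.0, 2 * x[1] + 4.0])
x_star = dai_liao_cg(f, grad, np.array([5.0, 5.0]))
```

The safeguard restart is one simple way to keep every direction a descent direction; the paper's contribution is a family in which the sufficient descent condition holds by construction, without such a reset.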
Data availability
The data and code that support the findings of this study are available from the corresponding author upon request.
References
Abdollahi, F., Fatemi, M.: An efficient conjugate gradient method with strong convergence properties for non-smooth optimization. Journal of Mathematical Modeling, pp. 1–16 (2021)
Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numerical Algorithms 77(4), 1273–1282 (2018)
Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optimization Methods and Software 29(3), 583–591 (2014)
Beck, A.: First-order methods in optimization. SIAM (2017)
Chen, X., Fukushima, M.: Proximal quasi-Newton methods for nondifferentiable convex optimization. Math. Program. 85(2), 313–334 (1999)
Conn, A.R., Gould, N.I., Toint, P.L.: Trust Region Methods, vol. 1. SIAM (2000)
Correa, R., Lemaréchal, C.: Convergence of some algorithms for convex minimization. Math. Program. 62(1-3), 261–275 (1993)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001). https://doi.org/10.1007/s002450010019
Fatemi, M.: A new efficient conjugate gradient method for unconstrained optimization. J. Comput. Appl. Math. 300, 207–216 (2016). https://doi.org/10.1016/j.cam.2015.12.035
Fukushima, M.: A descent algorithm for nonsmooth convex optimization. Math. Program. 30(2), 163–175 (1984)
Fukushima, M., Qi, L.: A globally and superlinearly convergent algorithm for nonsmooth convex minimization. SIAM J. Optim. 6(4), 1106–1120 (1996)
Gould, N.I., Orban, D., Toint, P.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Transactions on Mathematical Software (TOMS) 29(4), 373–394 (2003)
Haarala, M., Miettinen, K., Mäkelä, M.M.: New limited memory bundle method for large-scale nonsmooth optimization. Optimization Methods and Software 19(6), 673–692 (2004)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005). https://doi.org/10.1137/030601880
Hiriart-Urruty, J.B., Lemaréchal, C.: Convex Analysis and Minimization Algorithms I: Fundamentals, vol. 305. Springer Science & Business Media (2013)
Hu, Y., Liu, L., Wang, Y.: Wei–Yao–Liu conjugate gradient algorithm for nonsmooth convex optimization problems. Statistics, Optimization & Information Computing 8(2), 403–413 (2020)
Lemaréchal, C., Sagastizábal, C.: Practical aspects of the Moreau–Yosida regularization: theoretical preliminaries. SIAM J. Optim. 7(2), 367–385 (1997)
Li, Q.: A modified Fletcher–Reeves-type method for nonsmooth convex minimization. Statistics, Optimization & Information Computing 2(3), 200–210 (2014)
Lin, H., Mairal, J., Harchaoui, Z.: An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration. SIAM J. Optim. 29(2), 1408–1443 (2019)
Lukšan, L., Vlček, J.: Test problems for nonsmooth unconstrained and linearly constrained optimization. Technical Report 798 (2000)
Mahdavi-Amiri, N., Yousefpour, R.: An effective nonsmooth optimization algorithm for locally Lipschitz functions. J. Optim. Theory Appl. 155(1), 180–195 (2012)
Meng, F., Zhao, G.: On second-order properties of the Moreau–Yosida regularization for constrained nonsmooth convex programs. Numer. Funct. Anal. Optim. 25(5-6), 515–529 (2004)
Momeni, M., Peyghami, M.: A new conjugate gradient algorithm with cubic Barzilai–Borwein stepsize for unconstrained optimization. Optimization Methods and Software 34(3), 650–664 (2019)
Parikh, N., Boyd, S., et al.: Proximal algorithms. Foundations and Trends® in Optimization 1(3), 127–239 (2014)
Sagara, N., Fukushima, M.: A trust region method for nonsmooth convex optimization. Journal of Industrial and Management Optimization 1(2), 171–180 (2005)
Woldu, T.G., Zhang, H., Zhang, X., Fissuh, Y.H.: A modified nonlinear conjugate gradient algorithm for large-scale nonsmooth convex optimization. J. Optim. Theory Appl., pp. 1–16 (2020)
Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28(2), 203–225 (2004). https://doi.org/10.1023/B:COAP.0000026885.81997.88
Yuan, G., Meng, Z., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)
Yuan, G., Sheng, Z., Liu, W.: The modified HZ conjugate gradient algorithm for large-scale nonsmooth optimization. PLoS One 11(10), e0164289 (2016)
Yuan, G., Wei, Z., Li, G.: A modified Polak–Ribière–Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255, 86–96 (2014)
Yuan, G., Wei, Z., Wang, Z.: Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization. Comput. Optim. Appl. 54(1), 45–64 (2013)
Acknowledgements
The authors thank the Research Council of K.N. Toosi University of Technology for supporting this work. We would also like to thank the anonymous referees for their valuable comments.
Ethics declarations
Conflict of interest
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Abdollahi, F., Fatemi, M. A modified conjugate gradient method for general convex functions. Numer Algor 92, 1485–1502 (2023). https://doi.org/10.1007/s11075-022-01349-0