Abstract
Recently, Jian, Han and Jiang proposed a descent hybrid conjugate gradient method that is globally convergent without a convexity assumption on the objective function, while also being promising from a computational point of view. Here, we develop one-parameter descent extensions of the method based on the Dai–Liao approach. We show that one of the proposed methods satisfies the sufficient descent condition when its parameter is chosen properly, and we establish global convergence of the method without a convexity assumption. Finally, the practical merits of the methods are investigated by numerical experiments on a set of CUTEr test functions as well as on signal processing problems. The results demonstrate the computational efficiency of the proposed methods.
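To fix ideas, the Dai–Liao framework the abstract refers to replaces the classical conjugacy condition with d_{k+1}^T y_k = -t g_{k+1}^T s_k, which leads to the parameter β_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k). The sketch below is a generic illustration of this scheme, not the authors' hybrid method: the specific β formula, parameter choice, and line search of the paper differ, and the Armijo backtracking and restart safeguard here are simplifying assumptions.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Nonlinear CG with the Dai-Liao parameter
    beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # simple backtracking Armijo line search (placeholder for the
        # Wolfe-type searches used in the CG literature)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d.dot(y)
        beta = g_new.dot(y - t * s) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:           # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))
```

The restart safeguard enforces that every search direction is a descent direction; the paper's contribution is precisely to guarantee sufficient descent through the choice of the parameter itself rather than by restarting.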
References
Abubakar, A. B., Kumam, P., Awwal, A.M.: Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery. Results Appl. Math. 4, 100069 (2019)
Aminifard, Z., Babaie-Kafaki, S.: A modified descent Polak–Ribière–Polyak conjugate gradient method with global convergence property for nonconvex functions. Calcolo 56(2), 16 (2019)
Aminifard, Z., Babaie-Kafaki, S.: An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix. 4OR 17, 317–330 (2019)
Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)
Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008)
Antoine, X., Levitt, A., Tang, Q.: Efficient spectral computation of the stationary states of rotating Bose–Einstein condensates by preconditioned nonlinear conjugate gradient methods. J. Comput. Phys. 343, 92–109 (2017)
Arazm, M. R., Babaie-Kafaki, S., Ghanbari, R.: An extended Dai–Liao conjugate gradient method with global convergence for nonconvex functions. Glasnik Mat. 52(2), 361–375 (2017)
Awwal, A. M., Kumam, P., Abubakar, A.B.: A modified conjugate gradient method for monotone nonlinear equations with convex constraints. Appl. Numer. Math. 145, 507–520 (2019)
Babaie-Kafaki, S.: On the sufficient descent condition of the Hager–Zhang conjugate gradient methods. 4OR 12(3), 285–292 (2014)
Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30(4), 673–681 (2015)
Birgin, E., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)
Black, M. J., Rangarajan, A.: On the unification of line processes, outlier rejection, and robust statistics with applications in early vision. Int. J. Comput. Vis. 19(1), 57–91 (1996)
Bruckstein, A. M., Donoho, D. L., Elad, M.: From sparse solutions of systems of equations to sparse modeling of signals and images. SIAM Rev. 51 (1), 34–81 (2009)
Cao, J., Wu, J.: A conjugate gradient algorithm and its applications in image restoration. Appl. Numer. Math. 152, 243–252 (2020)
Dai, Y. H., Han, J. Y., Liu, G. H., Sun, D. F., Yin, H. X., Yuan, Y.X.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10(2), 348–358 (1999)
Dai, Y. H., Kou, C. X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y. H., Liao, L. Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Dolan, E. D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Esmaeili, H., Shabani, S., Kimiaei, M.: A new generalized shrinkage conjugate gradient method for sparse recovery. Calcolo 56(1), 1–38 (2019)
Exl, L., Fischbacher, J., Oezelt, H., Gusenbauer, M., Schrefl, T.: Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization. Comput. Phys. Commun. 235, 179–186 (2019)
Faramarzi, P., Amini, K.: A modified spectral conjugate gradient method with global convergence. J. Optim. Theory Appl. 182, 667–690 (2019)
Gilbert, J. C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Gould, N. I. M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Hager, W. W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Hager, W. W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Heravi, A. R., Hodtani, G.A.: A new correntropy-based conjugate gradient backpropagation algorithm for improving training in neural networks. IEEE Trans. Neural Netw. Learn. Syst. 29(12), 6252–6263 (2018)
Jian, J. B., Han, L., Jiang, X.Z.: A hybrid conjugate gradient method with descent property for unconstrained optimization. Appl. Math. Model. 39, 1281–1290 (2015)
Jiang, X. Z., Han, L., Jian, J.B.: A globally convergent mixed conjugate gradient method with Wolfe line search. Math. Numer. Sin. 34, 103–112 (2012)
Li, X., Zhang, W., Dong, X.: A class of modified FR conjugate gradient method and applications to non-negative matrix factorization. Comput. Math. Appl. 73, 270–276 (2017)
Lin, J., Jiang, C.: An improved conjugate gradient parametric detection based on space-time scan. Signal Process. 169, 107412 (2020)
Nesterov, Y.: Excessive gap technique in nonsmooth convex minimization. SIAM J. Optim. 16(1), 235–249 (2005)
Nocedal, J., Wright, S. J.: Numerical Optimization. Springer, New York (2006)
Shengwei, Y., Wei, Z., Huang, H.: A note about WYL’s conjugate gradient method and its applications. Appl. Math. Comput. 191(2), 381–388 (2007)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y. X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Wang, X. Y., Li, S. J., Kou, X. P.: A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints. Calcolo 53, 133–145 (2016)
Wei, Z. X., Yao, S. W., Liu, L. Y.: The convergence properties of some new conjugate gradient methods. Appl. Math. Comput. 183, 1341–1350 (2006)
Yu, G., Huang, J., Zhou, Y.: A descent spectral conjugate gradient method for impulse noise removal. Appl. Math. Lett. 23(5), 555–560 (2010)
Yuan, G., Li, T., Hu, W.: A conjugate gradient algorithm and its application in large-scale optimization problems and image restoration. J. Inequal. Appl. 2019(1), 247 (2019)
Yuan, G., Li, T., Hu, W.: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems. Appl. Numer. Math. 147, 129–141 (2020)
Yuan, G., Lu, J., Wang, Z.: The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems. Appl. Numer. Math. 152, 1–11 (2020)
Yuan, G., Meng, Z., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)
Zhang, L., Zhou, W., Li, D. H.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)
Zhu, H., Xiao, Y., Wu, S.Y.: Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique. Comput. Math. Appl. 66(1), 24–32 (2013)
Acknowledgements
The authors owe a major debt of gratitude to Professor Michael Navon for the line search code. They also thank the anonymous reviewers for their valuable comments and suggestions that helped to improve the quality of this work.
Funding
This research was supported in part by grant no. 97022259 from the Iran National Science Foundation (INSF), and in part by the Research Council of Semnan University (grant no. 31.99.21870).
Cite this article
Aminifard, Z., Babaie-Kafaki, S. Dai–Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer Algor 89, 1369–1387 (2022). https://doi.org/10.1007/s11075-021-01157-y
Keywords
- Unconstrained optimization
- Conjugate gradient method
- Sufficient descent condition
- Global convergence
- Image restoration
- Compressed sensing