A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

Al-Baali et al. (Comput. Optim. Appl. 60:89–110, 2015) proposed a three-term conjugate gradient method that satisfies a sufficient descent condition and achieves global convergence. In this paper, we extend this method to a family of three-term conjugate gradient projection methods with a restart procedure, together with their relaxed-inertial versions, for constrained nonlinear pseudo-monotone equations. The accelerated gradient-descent method MSM and a relaxed-inertial strategy are incorporated into the proposed methods to improve computational performance. The global convergence of the extended methods is established without requiring Lipschitz continuity of the underlying mapping. Numerical results on constrained nonlinear equations show that the extended methods are efficient under different settings. The applicability and efficiency of the extended methods are also verified on regularized decentralized logistic regression and sparse signal restoration problems.
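For readers unfamiliar with the projection framework that methods of this kind build on, the sketch below illustrates the classical derivative-free hyperplane-projection scheme of Solodov and Svaiter [17] for constrained monotone equations. This is a minimal illustration only: the steepest-descent-like direction is a placeholder for the paper's three-term conjugate gradient directions with restart and relaxed-inertial steps, and all parameter values are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def hyperplane_projection_solver(F, proj_C, x0, sigma=1e-4, rho=0.5,
                                 tol=1e-6, max_iter=1000):
    """Derivative-free hyperplane-projection framework (Solodov-Svaiter
    style) for F(x) = 0 with x restricted to a closed convex set C.

    `proj_C` is the Euclidean projection onto C.  The direction d = -F(x)
    below is a placeholder; the paper's methods replace it with three-term
    conjugate gradient directions plus restart/inertial modifications."""
    x = proj_C(np.asarray(x0, dtype=float))
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k
        d = -Fx  # placeholder search direction
        # Backtracking line search: shrink t until the trial point
        # z = x + t*d satisfies  -F(z)^T d >= sigma * t * ||d||^2.
        t = 1.0
        z = x + t * d
        Fz = F(z)
        while -(Fz @ d) < sigma * t * (d @ d) and t > 1e-12:
            t *= rho
            z = x + t * d
            Fz = F(z)
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}, which
        # separates x from the solution set, then back onto C.
        lam = (Fz @ (x - z)) / (Fz @ Fz + 1e-16)
        x = proj_C(x - lam * Fz)
    return x, max_iter
```

For example, with the componentwise monotone mapping F(x) = e^x − 1 and C the nonnegative orthant (so `proj_C` is `np.maximum(·, 0)`), the iterates converge to the unique constrained solution x = 0.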


Availability of supporting data

Data used or analyzed in the study are available from the author upon reasonable request.

References

  1. Facchinei, F., Pang, J.S.: Finite-Dimensional Variational Inequalities and Complementarity Problems, Vol. I. Springer, Berlin (2003)

  2. Meintjes, K., Morgan, A.P.: Chemical equilibrium systems as numerical test problems. ACM Trans. Math. Software 16(2), 143–151 (1990)

  3. Dirkse, S.P., Ferris, M.C.: MCPLIB: A collection of nonlinear mixed complementarity problems. Optim. Methods Softw. 5(4), 319–345 (1995)

  4. Xiao, Y.H., Zhu, H.: A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing. J. Math. Anal. Appl. 405(1), 310–319 (2013)

  5. Yin, J.H., Jian, J.B., Jiang, X.Z., Liu, M.X., Wang, L.Z.: A hybrid three-term conjugate gradient projection method for constrained nonlinear monotone equations with applications. Numer. Algorithms 88, 389–418 (2021)

  6. Yin, J.H., Jian, J.B., Jiang, X.Z.: A generalized hybrid CGPM-based algorithm for solving large-scale convex constrained equations with applications to image restoration. J. Comput. Appl. Math. 391, 113423 (2021)

  7. Liu, P.J., Shao, H., Wang, Y., Wu, X.Y.: A three-term CGPM-based algorithm without Lipschitz continuity for constrained nonlinear monotone equations with applications. Appl. Numer. Math. 175, 98–107 (2022)

  8. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)

  9. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  10. Shao, H., Guo, H., Wu, X.Y., Liu, P.J.: Two families of self-adjusting spectral hybrid DL conjugate gradient methods and applications in image denoising. Appl. Math. Model. 118, 393–411 (2023)

  11. Babaie-Kafaki, S., Ghanbari, R.: The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)

  12. Aminifard, Z., Babaie-Kafaki, S.: A restart scheme for the Dai-Liao conjugate gradient method by ignoring a direction of maximum magnification by the search direction matrix. RAIRO-Oper. Res. 54(4), 981–991 (2020)

  13. Aminifard, Z., Babaie-Kafaki, S.: Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer. Algorithms 89, 1369–1387 (2022)

  14. Babaie-Kafaki, S., Aminifard, Z.: Improving the Dai-Liao parameter choices using a fixed point equation. J. Math. Model. 10(1), 11–20 (2022)

  15. Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai-Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)

  16. Babaie-Kafaki, S.: A survey on the Dai-Liao family of nonlinear conjugate gradient methods. RAIRO-Oper. Res. 57(1), 43–58 (2023)

  17. Solodov, M.V., Svaiter, B.F.: A globally convergent inexact Newton method for systems of monotone equations. In: Fukushima, M., Qi, L. (eds.) Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, pp. 355–369. Kluwer, Dordrecht (1998)

  18. Abubakar, A.B., Kumam, P.: A descent Dai-Liao conjugate gradient method for nonlinear equations. Numer. Algorithms 81, 197–210 (2019)

  19. Ivanov, B., Stanimirović, P.S., Milovanović, G.V., Djordjević, S., Brajević, I.: Accelerated multiple step-size methods for solving unconstrained optimization problems. Optim. Methods Softw. 36, 998–1029 (2021)

  20. Ivanov, B., Milovanović, G.V., Stanimirović, P.S.: Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring. J. Glob. Optim. 85, 377–420 (2023)

  21. Liu, J.K., Lu, Z.L., Xu, J.L., Wu, S., Tu, Z.W.: An efficient projection-based algorithm without Lipschitz continuity for large-scale nonlinear pseudo-monotone equations. J. Comput. Appl. Math. 403, 113822 (2022)

  22. Yuan, G.L., Li, T.T., Hu, W.J.: A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems. Appl. Numer. Math. 147, 129–141 (2020)

  23. Yuan, G.L., Wang, B.P., Sheng, Z.: The Hager-Zhang conjugate gradient algorithm for large-scale nonlinear equations. Int. J. Comput. Math. 96(8), 1533–1547 (2019)

  24. Ou, Y.G., Li, L.: A unified convergence analysis of the derivative-free projection-based method for constrained nonlinear monotone equations. Numer. Algorithms (2022). https://doi.org/10.1007/s11075-022-01483-9

  25. Ou, Y.G., Xu, W.J.: A unified derivative-free projection method model for large-scale nonlinear equations with convex constraints. J. Ind. Manag. Optim. 18(5), 3539–3560 (2022)

  26. Liu, P.J., Wu, X.Y., Shao, H., Zhang, Y., Cao, S.H.: Three adaptive hybrid derivative-free projection methods for constrained monotone nonlinear equations and their applications. Numer. Linear Algebra Appl. 30(2), e2471 (2023)

  27. Wu, X.Y., Shao, H., Liu, P.J., Zhang, Y., Zhuo, Y.: An efficient conjugate gradient-based algorithm for unconstrained optimization and its projection extension to large-scale constrained nonlinear equations with application in signal recovery and image denoising problems. J. Comput. Appl. Math. 422, 114879 (2023)

  28. Sun, M., Liu, J.: New hybrid conjugate gradient projection method for the convex constrained equations. Calcolo 53, 399–411 (2018)

  29. Sun, M., Liu, J.: Three derivative-free projection methods for nonlinear equations with convex constraints. J. Appl. Math. Comput. 47(1), 265–276 (2015)

  30. Wang, S., Guan, H.B.: A scaled conjugate gradient method for solving monotone nonlinear equations with convex constraints. J. Appl. Math. 2013, 286486 (2013)

  31. Yu, G.H., Niu, S.Z., Ma, J.H.: Multivariate spectral gradient projection method for nonlinear monotone equations with convex constraints. J. Ind. Manag. Optim. 9(1), 117–129 (2013)

  32. Abubakar, A.B., Kumam, P.: A descent Dai-Liao conjugate gradient method for nonlinear equations. Numer. Algorithms 81(1), 197–210 (2019)

  33. Gao, P.T., He, C.J.: An efficient three-term conjugate gradient method for nonlinear monotone equations with convex constraints. Calcolo 55, 53 (2018)

  34. Liu, J.K., Feng, Y.M.: A derivative-free iterative method for nonlinear monotone equations with convex constraints. Numer. Algorithms 82, 245–262 (2019)

  35. Liu, J.K., Sun, Y., Zhao, Y.X.: A derivative-free projection algorithm for solving pseudo-monotone equations with convex constraints (in Chinese). Math. Numer. Sin. 43(3), 388–400 (2021)

  36. Papp, Z., Rapajić, S.: FR type methods for systems of large-scale nonlinear monotone equations. Appl. Math. Comput. 269, 816–823 (2015)

  37. Polyak, B.T.: Some methods of speeding up the convergence of iteration methods. USSR Comput. Math. Math. Phys. 4(5), 1–17 (1964)

  38. Wang, X.Q., Shao, H., Liu, P.J., Wu, T.: An inertial proximal partially symmetric ADMM-based algorithm for linearly constrained multi-block nonconvex optimization problems with applications. J. Comput. Appl. Math. 420, 114821 (2023)

  39. Chen, C.H., Chan, R.H., Ma, S.Q., Yang, J.F.: Inertial proximal ADMM for linearly constrained separable convex optimization. SIAM J. Imaging Sci. 8(4), 2239–2267 (2015)

  40. Dou, M.Y., Li, H.Y., Liu, X.W.: An inertial proximal Peaceman-Rachford splitting method (in Chinese). Sci. Sin. Math. 47(2), 333–348 (2017)

  41. Gao, X., Cai, X.J., Han, D.R.: A Gauss-Seidel type inertial proximal alternating linearized minimization for a class of nonconvex optimization problems. J. Glob. Optim. 76(4), 863–887 (2020)

  42. Abubakar, A.B., Kumam, P., Ibrahim, A.H.: Inertial derivative-free projection method for nonlinear monotone operator equations with convex constraints. IEEE Access 9, 92157–92167 (2021)

  43. Ma, G.D., Jin, J.C., Jian, J.B., Yin, J.H., Han, D.L.: A modified inertial three-term conjugate gradient projection method for constrained nonlinear equations with applications in compressed sensing. Numer. Algorithms 92(3), 1621–1653 (2023)

  44. Ibrahim, A.H., Kumam, P., Abubakar, A.B., Adamu, A.: Accelerated derivative-free method for nonlinear monotone equations with an application. Numer. Linear Algebra Appl. 29(3), e2424 (2022)

  45. Ibrahim, A.H., Kumam, P., Sun, M., Chaipunya, P.: Projection method with inertial step for nonlinear equations: application to signal recovery. J. Ind. Manag. Optim. 19(1), 30–55 (2023)

  46. Ibrahim, A.H., Kumam, P., Rapajić, S., Papp, Z., Abubakar, A.B.: Approximation methods with inertial term for large-scale nonlinear monotone equations. Appl. Numer. Math. 181, 417–435 (2022)

  47. Jian, J.B., Yin, J.H., Tang, C.M., Han, D.L.: A family of inertial derivative-free projection methods for constrained nonlinear pseudo-monotone equations with applications. Comput. Appl. Math. 41, 309 (2022)

  48. Yin, J.H., Jian, J.B., Jiang, X.Z., Wu, X.D.: A family of inertial-relaxed DFPM-based algorithms for solving large-scale monotone nonlinear equations with application to sparse signal restoration. J. Comput. Appl. Math. 419, 114674 (2023)

  49. Al-Baali, M., Narushima, Y., Yabe, H.: A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization. Comput. Optim. Appl. 60, 89–110 (2015)

  50. Stanimirović, P.S., Miladinović, M.B.: Accelerated gradient descent methods with line search. Numer. Algorithms 54, 503–520 (2010)

  51. Jiang, X.Z., Zhu, Y.H., Jian, J.B.: Two efficient nonlinear conjugate gradient methods with restart procedures and their applications in image restoration. Nonlinear Dyn. 111, 5469–5498 (2023)

  52. Jiang, X.Z., Yang, H.H., Yin, J.H., Liao, W.: A three-term conjugate gradient algorithm with restart procedure to solve image restoration problems. J. Comput. Appl. Math. 424, 115020 (2023)

  53. Yin, J.H., Jian, J.B., Jiang, X.Z.: A spectral gradient projection algorithm for convex constrained nonsmooth equations based on an adaptive line search (in Chinese). Math. Numer. Sin. 42(4), 457–471 (2020)

  54. Alves, M.M., Eckstein, J., Geremia, M., Melo, J.G.: Relative-error inertial-relaxed inexact versions of Douglas-Rachford and ADMM splitting algorithms. Comput. Optim. Appl. 75(2), 389–422 (2020)

  55. Polyak, B.T.: Introduction to Optimization, p. 49. Optimization Software, Inc., Publications Division, New York (1987)

  56. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)

  57. Zhou, W.J., Li, D.H.: Limited memory BFGS method for nonlinear monotone equations. J. Comput. Math. 25, 89–96 (2007)

  58. Cruz, W.L., Raydan, M.: Nonmonotone spectral methods for large-scale nonlinear systems. Optim. Methods Softw. 18(5), 583–599 (2003)

  59. Luo, H.: Accelerated primal-dual methods for linearly constrained convex optimization problems. arXiv:2109.12604 (2022)

  60. Chang, C.C., Lin, C.J.: LIBSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol. 2(3), 1–27 (2011)

  61. Xiao, Y.H., Wang, Q.Y., Hu, Q.J.: Non-smooth equations based method for \(l_1\)-norm problems with applications to compressed sensing. Nonlinear Anal. Theor. 74(11), 3570–3577 (2011)

  62. Figueiredo, M.A.T., Nowak, R.D., Wright, S.J.: Gradient projection for sparse reconstruction, application to compressed sensing and other inverse problems. IEEE J. Sel. Top. Sign. Proces. 1(4), 586–597 (2007)

  63. Pang, J.S.: Inexact Newton methods for the nonlinear complementarity problem. Math. Program. 36(1), 54–71 (1986)

  64. Hoyer, P.O.: Non-negative matrix factorization with sparseness constraints. J. Mach. Learn. Res. 5, 1457–1469 (2004)

Funding

This work was supported by the Fundamental Research Funds for the Central Universities (JSX220005).

Author information

Contributions

All authors read and approved the final manuscript. PL was mainly responsible for algorithm design and theoretical analysis; HS contributed mainly to algorithm design; ZY and TZ contributed mainly to theoretical analysis and numerical experiments; XW contributed mainly to algorithm design and numerical experiments.

Corresponding author

Correspondence to Hu Shao.

Ethics declarations

Ethical approval

Not applicable

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Liu, P., Shao, H., Yuan, Z. et al. A family of three-term conjugate gradient projection methods with a restart procedure and their relaxed-inertial extensions for the constrained nonlinear pseudo-monotone equations with applications. Numer Algor 94, 1055–1083 (2023). https://doi.org/10.1007/s11075-023-01527-8
