
A Globally Convergent Derivative-Free Projection Method for Nonlinear Monotone Equations with Applications

Published in: Bulletin of the Malaysian Mathematical Sciences Society

Abstract

In this paper, motivated by the effectiveness of quasi-Newton methods and the projection strategy, we propose a three-term derivative-free projection method for finding approximate solutions of large-scale nonlinear monotone equations. The method ensures that the descent condition holds, and its global convergence and rate of convergence are established. Owing to its low storage requirements, it is well suited to large-scale problems. Numerical results on a number of benchmark test problems from the literature demonstrate its efficacy.
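To give a concrete picture of the projection framework that methods of this kind build on, the sketch below implements the classical derivative-free line search and hyperplane projection scheme of Solodov and Svaiter for a monotone mapping F. It is a minimal illustration, not the authors' algorithm: the direction d = -F(x), the parameter values, and the function name projection_method are placeholder assumptions, and the paper's actual three-term search direction would replace the direction line.

```python
import numpy as np

def projection_method(F, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Hyperplane projection framework for monotone equations F(x) = 0.

    The direction d = -F(x) is a placeholder; the paper's three-term
    derivative-free direction would be used in its place.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx  # placeholder direction, NOT the paper's three-term d_k
        # Derivative-free backtracking line search: find t = rho**i with
        #   -F(x + t*d)^T d >= sigma * t * ||d||^2   (one common variant)
        t = 1.0
        while True:
            z = x + t * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * t * (d @ d) or t < 1e-12:
                break
            t *= rho
        # Project x onto the half-space {y : F(z)^T (y - z) <= 0},
        # which separates x from the solution set when F is monotone.
        denom = Fz @ Fz
        if denom > 0.0:
            x = x - ((Fz @ (x - z)) / denom) * Fz
        else:          # F(z) = 0 means z itself solves the system
            x = z
        Fx = F(x)
    return x, Fx


# Illustrative use on the simple monotone system F(x) = x + sin(x).
if __name__ == "__main__":
    sol, residual = projection_method(lambda v: v + np.sin(v), np.full(5, 2.0))
    print(np.linalg.norm(residual))
```

Whatever direction is substituted, the hyperplane projection step is what delivers global convergence under monotonicity and continuity of F together with suitable conditions on the directions, since each projection brings the iterate no farther from any solution of the system.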



Author information


Corresponding author

Correspondence to P. Kaelo.

Additional information

Communicated by Anton Abdulbasah Kamil.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Kaelo, P., Koorapetse, M. & Sam, C.R. A Globally Convergent Derivative-Free Projection Method for Nonlinear Monotone Equations with Applications. Bull. Malays. Math. Sci. Soc. 44, 4335–4356 (2021). https://doi.org/10.1007/s40840-021-01171-2


