Abstract
In this paper, motivated by the theoretical effectiveness of quasi-Newton methods and the projection strategy, we propose a three-term derivative-free projection method for finding approximate solutions of large-scale nonlinear monotone equations. The method satisfies the descent condition, and its global convergence and rate of convergence are established. Because of its low storage requirements, the proposed method is well suited to large-scale problems. The method is tested on a number of benchmark problems from the literature, and the numerical results demonstrate its efficacy.
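To illustrate the class of algorithms the abstract describes, the following is a minimal sketch of the generic derivative-free hyperplane-projection framework for monotone equations (in the spirit of Solodov and Svaiter). It is not the paper's method: the search direction below is the simple choice d_k = -F(x_k), whereas the paper's contribution is a specific three-term direction; the parameter names and values are illustrative assumptions.

```python
import numpy as np

def projection_method(F, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=1000):
    """Generic derivative-free projection framework for monotone F(x) = 0.

    Note: d = -F(x) below is a placeholder direction; the paper replaces
    it with a three-term direction guaranteeing the descent condition.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx  # placeholder search direction
        # Derivative-free backtracking line search: find t with
        #   -F(x + t d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= beta
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z
        # Hyperplane projection: by monotonicity, the hyperplane
        # {u : F(z)^T (u - z) = 0} separates x from the solution set,
        # so project x onto the corresponding half-space.
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x
```

For example, applying this sketch to the separable monotone system F(x) = e^x - 1 (componentwise, with unique root x* = 0) drives ||F(x_k)|| below the tolerance; the projection step needs only function values and two inner products, which is what makes such schemes attractive for large-scale problems.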
Communicated by Anton Abdulbasah Kamil.
Cite this article
Kaelo, P., Koorapetse, M. & Sam, C.R. A Globally Convergent Derivative-Free Projection Method for Nonlinear Monotone Equations with Applications. Bull. Malays. Math. Sci. Soc. 44, 4335–4356 (2021). https://doi.org/10.1007/s40840-021-01171-2