A Scaled Derivative-Free Projection Method for Solving Nonlinear Monotone Equations

Original Paper, Bulletin of the Iranian Mathematical Society

Abstract

Derivative-free projection methods are effective tools for solving large-scale nonlinear monotone equations, and considerable research continues to be devoted to improving existing methods and to developing new ones. In this paper, we propose a scaled derivative-free projection method for solving large-scale nonlinear monotone equations that combines conjugate gradient and projection techniques. We establish its global convergence under appropriate conditions. The method is compared with other existing methods from the literature, and preliminary numerical results indicate that it is competitive.
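For a concrete picture of the framework, the following is a minimal Python sketch of the generic derivative-free projection scheme of Solodov and Svaiter [12] on which methods of this kind are built. The function name `projection_method`, the placeholder steepest-descent-like direction, and the line-search constants are illustrative assumptions; the paper's scaled conjugate gradient direction is not reproduced here.

```python
import numpy as np

def projection_method(F, x0, project=lambda x: x, beta=1.0, sigma=1e-4,
                      r=0.5, tol=1e-6, max_iter=1000):
    """Hyperplane projection framework for a monotone map F: R^n -> R^n.

    project : Euclidean projection onto the feasible set Omega
              (the identity when Omega = R^n).
    d = -F(x) is a placeholder direction; a scaled conjugate gradient
    direction would be used in its place by the method of this paper.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx                                   # placeholder direction
        a, z, Fz = beta, x, Fx
        for _ in range(60):                       # safeguarded backtracking
            z = x + a * d
            Fz = F(z)
            # Accept a once -F(z)^T d >= sigma * a * ||F(z)|| * ||d||^2.
            if -Fz @ d >= sigma * a * np.linalg.norm(Fz) * (d @ d):
                break
            a *= r
        if np.linalg.norm(Fz) <= tol:             # trial point already solves F = 0
            x, Fx = z, Fz
            break
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}, then onto Omega.
        xi = (Fz @ (x - z)) / (Fz @ Fz)
        x = project(x - xi * Fz)
        Fx = F(x)
    return x
```

The key feature is that only values of F are used: the line search locates a point z_k with F(z_k)^T(x_k - z_k) > 0, and monotonicity guarantees that the hyperplane through z_k separates x_k from the solution set, so the projection step decreases the distance to any solution.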


References

  1. Dai, Z., Chen, X., Wen, F.: A modified Perry’s conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations. Appl. Math. Comput. 270, 378–386 (2015)

  2. Ding, Y., Xiao, Y., Li, J.: A class of conjugate gradient methods for convex constrained monotone equations. Optimization 66(12), 2309–2328 (2017)

  3. Dirkse, S.P., Ferris, M.C.: MCPLIB: a collection of nonlinear mixed complementarity problems. Optim. Methods Softw. 5, 319–345 (1995)

  4. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  5. Du, X., Zhang, P., Ma, W.: Some modified conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 305, 92–114 (2016)

  6. Hu, Y., Wei, Z.: A modified Liu–Storey conjugate gradient projection algorithm for nonlinear monotone equations. Int. Math. Forum 9, 1767–1777 (2014)

  7. Liu, J.K., Li, S.J.: A projection method for convex constrained monotone nonlinear equations with applications. Comput. Math. Appl. 70(10), 2442–2453 (2015)

  8. Liu, J.K., Li, S.J.: Spectral DY-type projection method for nonlinear monotone system of equations. J. Comput. Math. 33, 341–355 (2015)

  9. Li, M.: A Liu–Storey-type method for solving large-scale nonlinear monotone equations. Numer. Funct. Anal. Optim. 35, 310–322 (2014)

  10. Liu, S.Y., Huang, Y.Y., Jiao, H.W.: Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations. Abstr. Appl. Anal. 2014, 305643 (2014)

  11. Meintjes, K., Morgan, A.P.: A methodology for solving chemical equilibrium systems. Appl. Math. Comput. 22, 333–361 (1987)

  12. Solodov, M.V., Svaiter, B.F.: A globally convergent inexact Newton method for systems of monotone equations. In: Fukushima, M., Qi, L. (eds.) Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, pp. 355–369. Springer, Boston (1998)

  13. Sun, M., Liu, J.: A modified Hestenes–Stiefel projection method for constrained nonlinear equations and its linear convergence rate. J. Appl. Math. Comput. 49(1–2), 145–156 (2015)

  14. Sun, M., Liu, J.: New hybrid conjugate gradient projection method for the convex constrained equations. Calcolo 53(3), 399–411 (2016)

  15. Sun, M., Liu, J.: Three derivative-free projection methods for nonlinear equations with convex constraints. J. Appl. Math. Comput. 47, 265–276 (2015)

  16. Wang, S., Guan, H.: A scaled conjugate gradient method for solving monotone nonlinear equations with convex constraints. J. Appl. Math. 2013, 286486 (2013)

  17. Wang, X.Y., Li, S.J., Kou, X.P.: A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints. Calcolo 53(2), 133–145 (2016)

  18. Xiao, Y., Zhu, H.: A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing. J. Math. Anal. Appl. 405, 310–319 (2013)

  19. Yan, Q.-R., Peng, X.-Z., Li, D.-H.: A globally convergent derivative-free method for solving large-scale nonlinear monotone equations. J. Comput. Appl. Math. 234, 649–657 (2010)

  20. Yuan, G., Zhang, M.: A three-terms Polak–Ribière–Polyak conjugate gradient algorithm for large-scale nonlinear equations. J. Comput. Appl. Math. 286, 186–195 (2015)

  21. Zheng, L.: A modified PRP projection method for nonlinear equations with convex constraints. Int. J. Pure Appl. Math. 79, 87–96 (2012)


Acknowledgements

The authors are grateful to the Editor and the two anonymous referees for their constructive comments which greatly improved this paper’s presentation.

Author information


Corresponding author

Correspondence to P. Kaelo.

Additional information

Communicated by Davod Khojasteh Salkuyeh.

Appendix

Here, we present the twelve test problems that were used in the numerical experiments. They are as follows.

Problem 1

[7]

$$\begin{aligned} F_{1}(x)&= x_{1} - e^{\cos (\frac{x_{1}+x_{2}}{n+1})},\\ F_{i}(x)&= x_{i} - e^{\cos (\frac{x_{i-1}+x_{i}+x_{i+1}}{n+1})}, \,\, i=2,3,\ldots ,n-1,\\ F_{n}(x)&= x_{n} - e^{\cos (\frac{x_{n-1}+x_{n}}{n+1})}, \end{aligned}$$

where \(\Omega =\mathbb {R}_{+}^{n}\). Initial guess \(x_{0}=(1, 1, \ldots , 1)^{\text {T}}\).
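For illustration, this residual can be evaluated in vectorised form as follows; the helper name `F_problem1` is our own.

```python
import numpy as np

def F_problem1(x):
    """Problem 1 residual: F_i(x) = x_i - exp(cos(s_i / (n + 1))), where
    s_i sums x over the neighbouring indices i-1, i, i+1 (clipped at the ends)."""
    n = x.size
    s = x.copy()
    s[:-1] += x[1:]   # contribution of the right neighbour x_{i+1}
    s[1:] += x[:-1]   # contribution of the left neighbour x_{i-1}
    return x - np.exp(np.cos(s / (n + 1)))
```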

Problem 2

[9]

$$\begin{aligned} F_{1}(x)&= 2.5x_{1}+ x_{2}-1,\\ F_{i}(x)&= x_{i-1}+2.5x_{i}+x_{i+1}-1, \,\text {for}\quad i=2, 3, \ldots , n-1,\\ F_{n}(x)&= x_{n-1}+2.5x_{n}-1, \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(0, 0, \ldots , 0)^{\text {T}}\).

Problem 3

 [7]

$$\begin{aligned} F_{1}(x)&= 3x_{1}^{3}+ 2x_{2}-5+\sin (x_{1}-x_{2})\sin (x_{1}+x_{2}),\\ F_{i}(x)&= -x_{i-1}e^{x_{i-1}-x_{i}}+x_{i}(4+3x_{i}^{2})+2x_{i+1}\\&+\sin (x_{i}-x_{i+1})\sin (x_{i}+x_{i+1})-8, \,\, i=2, 3,\ldots , n-1,\\ F_{n}(x)&= -x_{n-1}e^{x_{n-1}-x_{n}}+4x_{n}-3, \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(0, 0, \ldots , 0)^{\text {T}}\).

Problem 4

 [7]

$$\begin{aligned} F_{i}(x) = 2x_{i} - \sin ( x_{i}), \,\,\, i=1,2,\ldots ,n, \end{aligned}$$

where \( \Omega =\mathbb {R}^{n}.\) Initial guess \(x_{0}=(1, 1, \ldots , 1)^{\text {T}}\).

Problem 5

[19]

$$\begin{aligned} F_{i}(x) = x_{i} - \frac{1}{n}x_{i}^{2}+\frac{1}{n}\sum _{k=1}^{n}x_{k}+i, \,\,\, i=1,2,\ldots ,n, \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(1, 1, \ldots , 1)^{\text {T}}\).

Problem 6

[2]

$$\begin{aligned} F_{i}(x) = x_{i} - \sin (| x_{i}-1|), \,\,\, i=1,2,\ldots ,n, \end{aligned}$$

where \(\Omega =\{x\in \mathbb {R}^{n}|\sum _{i=1}^{n}x_{i}\le n, x_{i}\ge 0, i=1,2,\ldots ,n\}\). Initial guess \(x_{0}=(-1, -1, \ldots , -1)^{\text {T}}\).
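Problem 6 is the only test problem whose feasible set is neither the whole space nor the nonnegative orthant, so its feasibility projection deserves a comment. Below is a sketch of the Euclidean projection onto \(\Omega =\{x : x\ge 0, \sum _{i}x_{i}\le n\}\), using the classical sort-and-threshold simplex projection when the sum constraint is active; the helper name `project_capped_simplex` is ours.

```python
import numpy as np

def project_capped_simplex(x, cap):
    """Euclidean projection onto {z : z >= 0, sum(z) <= cap}."""
    y = np.maximum(x, 0.0)
    if y.sum() <= cap:
        return y  # the sum constraint is inactive, so max(x, 0) is optimal
    # Otherwise sum(z) = cap holds at the projection; reduce to the
    # classical sort-and-threshold projection onto a scaled simplex.
    u = np.sort(x)[::-1]                  # entries in decreasing order
    css = np.cumsum(u)
    j = np.arange(1, x.size + 1)
    rho = np.nonzero(u - (css - cap) / j > 0)[0][-1] + 1
    theta = (css[rho - 1] - cap) / rho    # optimal shift (Lagrange multiplier)
    return np.maximum(x - theta, 0.0)
```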

Problem 7

[20]

$$\begin{aligned} F_{1}(x)&= 2x_{1}+0.5h^{2}(x_{1}+h)^{3}-x_{2},\\ F_{i}(x)&= 2x_{i}+0.5h^{2}(x_{i}+hi)^{3}-x_{i-1}-x_{i+1}, \,\,\, i=2, 3, \ldots , n-1,\\ F_{n}(x)&= 2x_{n}+0.5h^{2}(x_{n}+hn)^{3}-x_{n-1}, \end{aligned}$$

where \(h=\frac{1}{n+1}\) and \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(\frac{1}{2}, \frac{1}{2}, \ldots , \frac{1}{2})^{\text {T}}\).
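Problem 7 is the familiar finite-difference discretisation \(F(x)=Ax+0.5h^{2}(x+h\,i)^{3}\) with the tridiagonal matrix \(A=\text {tridiag}(-1,2,-1)\). A vectorised sketch under that reading (the name `F_problem7` is ours):

```python
import numpy as np

def F_problem7(x):
    """Problem 7 residual: tridiag(-1, 2, -1) applied to x plus the
    componentwise nonlinearity 0.5 * h^2 * (x_i + h*i)^3, h = 1/(n+1)."""
    n = x.size
    h = 1.0 / (n + 1)
    i = np.arange(1, n + 1)
    Fx = 2.0 * x + 0.5 * h**2 * (x + h * i)**3
    Fx[:-1] -= x[1:]   # -x_{i+1} for i < n
    Fx[1:] -= x[:-1]   # -x_{i-1} for i > 1
    return Fx
```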

Problem 8

[19]

$$\begin{aligned} F_{i}(x) = 2x_{i} - \sin (| x_{i}|), \,\,\, i=1,2,\ldots ,n, \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(10, 10, \ldots , 10)^{\text {T}}\).

Problem 9

[2]

$$\begin{aligned} F_{i}(x)&= 4x_{i}(x_{i}^{2}+x_{n}^{2})-4, \,\,\, i=1, 2, \ldots , n-1,\\ F_{n}(x)&= 4x_{n}\sum _{i=1}^{n-1}(x_{i}^{2}+x_{n}^{2}), \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(0, 0, \ldots , 0)^{\text {T}}\).

Problem 10

[2]

$$\begin{aligned} t&=\sum _{k=1}^{n}x_{k}^{2}, \quad c=10^{-5},\\ F_{i}(x)&= 2c(x_{i}-1)+4(t-0.25)x_{i}, \,\,\, i=1, 2, \ldots , n,\\ \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(-1, -1, \ldots , -1)^{\text {T}}\).
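Note that the scalar \(t\) is shared by all components, so it should be computed once per evaluation of F. An illustrative vectorised form (the name `F_problem10` is ours):

```python
import numpy as np

def F_problem10(x, c=1e-5):
    """Problem 10 residual: 2c(x_i - 1) + 4(t - 0.25) x_i with t = sum(x**2)."""
    t = np.dot(x, x)  # shared by all components, computed once
    return 2.0 * c * (x - 1.0) + 4.0 * (t - 0.25) * x
```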

Problem 11

[2]

$$\begin{aligned} F_{1}(x)&= 4x_{1}(x_{1}^{2}+x_{2}^{2})-4,\\ F_{i}(x)&= 4x_{i}(x_{i-1}^{2}+x_{i}^{2})+4x_{i}(x_{i}^{2}+x_{i+1}^{2})-4, \,\,\, i=2, 3, \ldots , n-1,\\ F_{n}(x)&= 4x_{n}(x_{n-1}^{2}+x_{n}^{2}), \end{aligned}$$

where \(\Omega =\mathbb {R}^{n}\). Initial guess \(x_{0}=(2, 2, \ldots , 2)^{\text {T}}\).

Problem 12

[7]

$$\begin{aligned} F_i(x)&= e^{x_{i}}-1, \,\,\, i=1, 2, \ldots , n, \end{aligned}$$

where \(\Omega =\mathbb {R}_{+}^{n}\). Initial guess \(x_{0}=(1, 1, \ldots , 1)^{\text {T}}\).
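As a usage illustration, Problem 12 is easy to run through the generic framework sketched after the abstract: here \(\Omega =\mathbb {R}_{+}^{n}\), so the feasibility projection is simply the componentwise maximum with zero. This snippet assumes the illustrative `projection_method` helper defined there.

```python
import numpy as np

n = 1000
F = lambda x: np.exp(x) - 1.0                     # Problem 12 residual
x = projection_method(F, x0=np.ones(n),
                      project=lambda v: np.maximum(v, 0.0))
print(np.linalg.norm(F(x)))                        # final residual norm
```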


Cite this article

Koorapetse, M., Kaelo, P. & Offen, E.R. A Scaled Derivative-Free Projection Method for Solving Nonlinear Monotone Equations. Bull. Iran. Math. Soc. 45, 755–770 (2019). https://doi.org/10.1007/s41980-018-0163-1

