
Self-adaptive inertial extragradient algorithms for solving variational inequality problems


Abstract

In this paper, we study the strong convergence of two Mann-type inertial extragradient algorithms, equipped with a new step size, for solving variational inequality problems with a monotone and Lipschitz continuous operator in real Hilbert spaces. Strong convergence theorems for the suggested algorithms are proved without prior knowledge of the Lipschitz constant of the operator. Finally, we report numerical experiments that illustrate the performance of the proposed algorithms and compare them with related methods.
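The algorithms themselves are given only in the full text, but the ingredients named in the abstract (an inertial extrapolation step, two extragradient-type projections, a Mann/Halpern-type anchoring step for strong convergence, and a self-adaptive step size that avoids any prior knowledge of the Lipschitz constant) can be sketched generically. The Python/NumPy snippet below is a minimal sketch under stated assumptions: the operator F, the feasible set C, the parameters theta, mu, lam0, and the anchor weight alpha_n = 1/(n + 1) are illustrative choices, not the authors' exact scheme.

```python
import numpy as np

def inertial_extragradient(F, project_C, x0, x1, theta=0.3, mu=0.5,
                           lam0=1.0, max_iter=500, tol=1e-8):
    """Generic sketch of a Mann/Halpern-type inertial extragradient method
    with a self-adaptive step size for VI(C, F):
    find x* in C such that <F(x*), x - x*> >= 0 for all x in C.
    All parameter choices here are illustrative assumptions."""
    x_prev = np.asarray(x0, dtype=float)
    x = np.asarray(x1, dtype=float)
    lam = lam0
    for n in range(1, max_iter + 1):
        alpha = 1.0 / (n + 1)            # anchoring weight, alpha_n -> 0
        w = x + theta * (x - x_prev)     # inertial extrapolation
        y = project_C(w - lam * F(w))    # first extragradient projection
        z = project_C(w - lam * F(y))    # second projection
        x_next = (1.0 - alpha) * z       # Halpern-type anchoring toward the origin
        # Self-adaptive step size: shrink lam only when the local ratio demands
        # it, so the Lipschitz constant of F is never needed explicitly.
        dF = np.linalg.norm(F(w) - F(y))
        dx = np.linalg.norm(w - y)
        if dF > 0.0:
            lam = min(lam, mu * dx / dF)
        if np.linalg.norm(x_next - x) <= tol:
            return x_next
        x_prev, x = x, x_next
    return x

# Toy usage: an affine monotone operator F(x) = A x + b over the box [-1, 1]^d.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    M = rng.standard_normal((d, d))
    A = M @ M.T + np.eye(d)              # positive definite, hence F is monotone
    b = rng.standard_normal(d)
    F = lambda x: A @ x + b
    project_C = lambda x: np.clip(x, -1.0, 1.0)
    print(inertial_extragradient(F, project_C, np.zeros(d), np.ones(d)))
```

The step-size update above mirrors the common self-adaptive rule lam_{n+1} = min(lam_n, mu ||w_n - y_n|| / ||F(w_n) - F(y_n)||); the precise rule and anchoring used in the paper may differ.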





Acknowledgements

The authors are very grateful to the editor and the anonymous referee for their constructive comments, which significantly improved the original manuscript. We would also like to thank Professor Xiaolong Qin for reading the initial manuscript and giving us many useful suggestions.

Author information


Corresponding author

Correspondence to Songxiao Li.

Additional information

Communicated by Baisheng Yan.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Tan, B., Fan, J. & Li, S. Self-adaptive inertial extragradient algorithms for solving variational inequality problems. Comp. Appl. Math. 40, 19 (2021). https://doi.org/10.1007/s40314-020-01393-3


Keywords

Mathematics Subject Classification
