
A Version of the Mirror Descent Method to Solve Variational Inequalities*


Abstract

Nemirovski and Yudin proposed the mirror descent algorithm in the late 1970s to solve convex optimization problems. The method is well suited to huge-scale optimization problems. In this paper, we describe a new version of the mirror descent method for solving variational inequalities with pseudomonotone operators. The method can be interpreted as a modification of Popov’s two-step algorithm that uses Bregman projections onto the feasible set. We prove the convergence of the sequences generated by the proposed method.
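The construction described above can be illustrated with a small numerical sketch. The snippet below is not the paper’s algorithm, only a plausible reading of it under simple assumptions: the feasible set is taken to be the probability simplex, the Bregman distance is the Kullback–Leibler divergence (so each Bregman projection reduces to a multiplicative, entropic update), and the two-step structure follows Popov’s scheme, which reuses a single operator evaluation per iteration. The operator, step size, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

def entropic_prox(x, g, lam):
    """Bregman (KL) prox step on the simplex: argmin_y lam*<g, y> + KL(y, x).
    With the entropy as the distance-generating function this is a
    multiplicative update followed by renormalization."""
    w = x * np.exp(-lam * g)
    return w / w.sum()

def popov_mirror_descent(A, x0, lam=0.1, iters=5000):
    """Popov-style two-step method with entropic Bregman projections.

    x_{k+1} = Prox_{x_k}(lam * A(y_k)),  y_{k+1} = Prox_{x_{k+1}}(lam * A(y_k)):
    only one evaluation of the operator A per iteration, unlike the
    extragradient method, which needs two. Returns the ergodic average.
    """
    x, y = x0.copy(), x0.copy()
    avg = np.zeros_like(x0)
    for _ in range(iters):
        g = A(y)                        # single operator evaluation
        x = entropic_prox(x, g, lam)    # main iterate update
        y = entropic_prox(x, g, lam)    # leading point reuses the same g
        avg += x
    return avg / iters

# Toy monotone VI: A(x) = M x with skew-symmetric M (rock-paper-scissors
# game); the unique solution on the simplex is the uniform strategy.
M = np.array([[0., 1., -1.],
              [-1., 0., 1.],
              [1., -1., 0.]])
x_star = popov_mirror_descent(lambda x: M @ x, np.array([0.6, 0.3, 0.1]))
```

For this skew-symmetric example the averaged iterates approach the uniform distribution (1/3, 1/3, 1/3); the pseudomonotone case treated in the paper requires the convergence analysis given there, which this sketch does not reproduce.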


References

  1. I. V. Konnov, Combined Relaxation Methods for Variational Inequalities, Springer-Verlag, Berlin–Heidelberg–New York (2001).

  2. P. I. Stetsyuk, “An approximate method of ellipsoids,” Cybern. Syst. Analysis, Vol. 39, No. 3, 435–439 (2003).

  3. V. V. Semenov, “On the parallel proximal decomposition method for solving the problems of convex optimization,” J. Autom. Inform. Sci., Vol. 42, Iss. 4, 13–18 (2010).

  4. V. V. Semenov, “Strongly convergent algorithms for variational inequality problem over the set of solutions of the equilibrium problems,” in: M. Z. Zgurovsky and V. A. Sadovnichiy (eds.), Continuous and Distributed Systems. Solid Mechanics and its Applications, Vol. 211, Springer Intern. Publ., New York–Heidelberg (2014), pp. 131–146.

  5. G. M. Korpelevich, “The extragradient method to find saddle points and solve other problems,” Ekonomika i Mat. Metody, Vol. 12, No. 4, 747–756 (1976).

  6. E. N. Khobotov, “Modification of the extragradient method to solve variational inequalities and some optimization problems,” Zhurn. Vych. Mat. Mat. Fiz., Vol. 27, No. 10, 1462–1473 (1987).

  7. P. Tseng, “A modified forward-backward splitting method for maximal monotone mappings,” SIAM J. on Control and Optimization, Vol. 38, 431–446 (2000).

  8. Y. Censor, A. Gibali, and S. Reich, “The subgradient extragradient method for solving variational inequalities in Hilbert space,” J. of Optimization Theory and Applications, Vol. 148, 318–335 (2011).

  9. S. I. Lyashko, V. V. Semenov, and T. A. Voitova, “Low-cost modification of Korpelevich’s methods for monotone equilibrium problems,” Cybern. Syst. Analysis, Vol. 47, No. 4, 631–639 (2011).

  10. V. V. Semenov, “A strongly convergent splitting method for systems of operator inclusions with monotone operators,” J. Autom. Inform. Sci., Vol. 46, Iss. 5, 45–56 (2014).

  11. V. V. Semenov, “Hybrid splitting methods for the system of operator inclusions with monotone operators,” Cybern. Syst. Analysis, Vol. 50, No. 5, 741–749 (2014).

  12. D. A. Verlan, V. V. Semenov, and L. M. Chabak, “A strongly convergent modified extragradient method for variational inequalities with non-Lipschitz operators,” J. Autom. Inform. Sci., Vol. 47, Iss. 7, 31–46 (2015).

  13. S. V. Denisov, V. V. Semenov, and L. M. Chabak, “Convergence of the modified extragradient method for variational inequalities with non-Lipschitz operators,” Cybern. Syst. Analysis, Vol. 51, No. 5, 757–765 (2015).

  14. L. D. Popov, “Modification of the Arrow–Hurwicz method of finding saddle points,” Matem. Zametki, Vol. 28, No. 5, 777–784 (1980).

  15. Yu. V. Malitsky and V. V. Semenov, “An extragradient algorithm for monotone variational inequalities,” Cybern. Syst. Analysis, Vol. 50, No. 2, 271–277 (2014).

  16. Yu. V. Malitsky and V. V. Semenov, “A hybrid method without extrapolation step for solving variational inequality problems,” J. of Global Optimization, Vol. 61, Iss. 1, 193–202 (2015).

  17. Ya. I. Vedel and V. V. Semenov, “A new two-stage proximal algorithm to solve the equilibrium problem,” Zhurn. Obch. ta Prykladnoi Matem., No. 1 (118), 15–23 (2015).

  18. L. M. Bregman, “The relaxation method of finding the common point of convex sets and its application to solve convex programming problems,” Zhurn. Vych. Mat. Mat. Fiz., Vol. 7, No. 3, 620–631 (1967).

  19. A. S. Nemirovski and D. B. Yudin, Complexity of Problems and Efficiency of Optimization Methods [in Russian], Nauka, Moscow (1979).

  20. A. Ben-Tal, T. Margalit, and A. Nemirovski, “The ordered subsets mirror descent optimization method with applications to tomography,” SIAM J. on Optimization, Vol. 12, 79–108 (2001).

  21. A. Beck and M. Teboulle, “Mirror descent and nonlinear projected subgradient methods for convex optimization,” Operations Research Letters, Vol. 31, No. 3, 167–175 (2003).

  22. Z. Allen-Zhu and L. Orecchia, “Linear coupling: An ultimate unification of gradient and mirror descent,” e-print (2014), arXiv:1407.1537.

  23. A. S. Anikin, A. V. Gasnikov, and A. Yu. Gornov, “Randomization and sparsity in huge-scale optimization problems on the example of the mirror descent method,” Tr. MFTI, Vol. 8, No. 1, 11–24 (2016).

  24. A. Nemirovski, “Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems,” SIAM J. on Optimization, Vol. 15, 229–251 (2004).

  25. A. Auslender and M. Teboulle, “Interior projection-like methods for monotone variational inequalities,” Mathematical Programming, Vol. 104, Iss. 1, 39–68 (2005).

  26. Yu. Nesterov, “Dual extrapolation and its applications to solving variational inequalities and related problems,” Mathematical Programming, Vol. 109, Iss. 2–3, 319–344 (2007).

  27. A. Juditsky, A. Nemirovski, and C. Tauvel, “Solving variational inequalities with stochastic mirror-prox algorithm,” Stochastic Systems, Vol. 1, No. 1, 17–58 (2011).

  28. M. Baes, M. Burgisser, and A. Nemirovski, “A randomized mirror-prox method for solving structured large-scale matrix saddle-point problems,” SIAM J. on Optimization, Vol. 23, 934–962 (2013).

  29. A. V. Gasnikov, A. A. Logunovskaya, and L. E. Morozova, “Relation of imitative logit dynamics in population game theory and the mirror descent method in online optimization on the example of the shortest route problem,” Tr. MFTI, Vol. 7, No. 4, 104–113 (2015).


Author information

Correspondence to V. V. Semenov.

Additional information

*The study was supported by the Ministry of Education and Science of Ukraine (project “Development of the algorithms for modeling and optimization of dynamic systems for defense, medicine, and ecology,” 0116U004777).

Translated from Kibernetika i Sistemnyi Analiz, No. 2, March–April, 2017, pp. 83–93.


About this article


Cite this article

Semenov, V. V. A Version of the Mirror Descent Method to Solve Variational Inequalities. Cybern Syst Anal 53, 234–243 (2017). https://doi.org/10.1007/s10559-017-9923-9


Keywords

  • variational inequality
  • pseudomonotonicity
  • Bregman distance
  • Kullback–Leibler distance
  • mirror descent method
  • convergence