Cybernetics and Systems Analysis, Volume 53, Issue 2, pp 234–243

A Version of the Mirror Descent Method to Solve Variational Inequalities*

  • V. V. Semenov
Article

Abstract

Nemirovski and Yudin proposed the mirror descent algorithm in the late 1970s to solve convex optimization problems. The method is well suited to huge-scale optimization problems. In this paper, we describe a new version of the mirror descent method for solving variational inequalities with pseudomonotone operators. The method can be interpreted as a modification of Popov’s two-step algorithm that uses Bregman projections onto the feasible set. We prove the convergence of the sequences generated by the proposed method.
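
As an informal illustration of the scheme described above, here is a minimal numerical sketch of a Popov-type two-step iteration with entropic (Kullback–Leibler) Bregman projections. It is not the paper’s exact formulation: the feasible set is assumed to be a product of probability simplices, the operator is a toy matrix-game (monotone, hence pseudomonotone) example, and the helper names (kl_step, popov_mirror_descent), the fixed step size lam, and the iteration count are illustrative choices rather than the step-size rule analyzed in the paper.

    import numpy as np

    def kl_step(x, grad, lam):
        # Entropic mirror step with a KL (Bregman) projection onto the simplex:
        # argmin_{z in simplex} lam*<grad, z> + KL(z, x) has this closed multiplicative form.
        z = x * np.exp(-lam * grad)
        return z / z.sum()

    def popov_mirror_descent(A, blocks, lam=0.05, iters=5000):
        # Popov-type two-step scheme with Bregman projections:
        #   y_k     = argmin_{z in C} lam*<A(y_{k-1}), z> + D(z, x_k)
        #   x_{k+1} = argmin_{z in C} lam*<A(y_k),     z> + D(z, x_k)
        # The operator is evaluated only once per iteration, at the leading point y_k.
        x = [b.copy() for b in blocks]
        y = [b.copy() for b in blocks]
        g = A(y)                                                # A(y_{k-1})
        for _ in range(iters):
            y = [kl_step(xi, gi, lam) for xi, gi in zip(x, g)]  # leading point y_k
            g = A(y)                                            # A(y_k), reused next round
            x = [kl_step(xi, gi, lam) for xi, gi in zip(x, g)]  # main iterate x_{k+1}
        return x, y

    if __name__ == "__main__":
        # Toy example: the matrix game min_p max_q <p, M q> written as a variational
        # inequality on the product of two probability simplices.
        rng = np.random.default_rng(0)
        M = rng.standard_normal((4, 4))

        def A(z):
            p, q = z
            return [M @ q, -M.T @ p]            # skew, hence monotone, saddle operator

        p0 = np.full(4, 0.25)
        q0 = np.full(4, 0.25)
        (p, q), _ = popov_mirror_descent(A, [p0, q0])

        # Duality gap of the approximate saddle point; it should shrink toward zero
        # as the number of iterations grows (for a suitably small step size).
        print("duality gap:", (M.T @ p).max() - (M @ q).min())

With Euclidean projections in place of the entropic ones, the same template reduces to Popov’s original modification of the Arrow–Hurwicz scheme; the point of the Bregman/KL variant is that each projection onto the simplex is available in the closed multiplicative form used in kl_step.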

Keywords

variational inequality, pseudomonotonicity, Bregman distance, Kullback–Leibler distance, mirror descent method, convergence

References

  1. I. V. Konnov, Combined Relaxation Methods for Variational Inequalities, Springer-Verlag, Berlin–Heidelberg–New York (2001).
  2. P. I. Stetsyuk, “An approximate method of ellipsoids,” Cybern. Syst. Analysis, Vol. 39, No. 3, 435–439 (2003).
  3. V. V. Semenov, “On the parallel proximal decomposition method for solving the problems of convex optimization,” J. Autom. Inform. Sci., Vol. 42, Iss. 4, 13–18 (2010).
  4. V. V. Semenov, “Strongly convergent algorithms for variational inequality problem over the set of solutions of the equilibrium problems,” in: M. Z. Zgurovsky and V. A. Sadovnichiy (eds.), Continuous and Distributed Systems. Solid Mechanics and its Applications, Vol. 211, Springer Intern. Publ. (Switzerland), New York–Heidelberg (2014), pp. 131–146.
  5. G. M. Korpelevich, “The extragradient method to find saddle points and solve other problems,” Ekonomika i Mat. Metody, Vol. 12, No. 4, 747–756 (1976).
  6. E. N. Khobotov, “Modification of the extragradient method to solve variational inequalities and some optimization problems,” Zhur. Vych. Mat. Mat. Fiz., Vol. 27, No. 10, 1462–1473 (1987).
  7. P. Tseng, “A modified forward-backward splitting method for maximal monotone mappings,” SIAM J. on Optimization, Vol. 38, 431–446 (2000).
  8. Y. Censor, A. Gibali, and S. Reich, “The subgradient extragradient method for solving variational inequalities in Hilbert space,” J. of Optimization Theory and Applications, Vol. 148, 318–335 (2011).
  9. S. I. Lyashko, V. V. Semenov, and T. A. Voitova, “Low-cost modification of Korpelevich’s methods for monotone equilibrium problems,” Cybern. Syst. Analysis, Vol. 47, No. 4, 631–639 (2011).
  10. V. V. Semenov, “A strongly convergent splitting method for systems of operator inclusions with monotone operators,” J. Autom. Inform. Sci., Vol. 46, Iss. 5, 45–56 (2014).
  11. V. V. Semenov, “Hybrid splitting methods for the system of operator inclusions with monotone operators,” Cybern. Syst. Analysis, Vol. 50, No. 5, 741–749 (2014).
  12. D. A. Verlan, V. V. Semenov, and L. M. Chabak, “A strongly convergent modified extragradient method for variational inequalities with non-Lipschitz operators,” J. Autom. Inform. Sci., Vol. 47, Iss. 7, 31–46 (2015).
  13. S. V. Denisov, V. V. Semenov, and L. M. Chabak, “Convergence of the modified extragradient method for variational inequalities with non-Lipschitz operators,” Cybern. Syst. Analysis, Vol. 51, No. 5, 757–765 (2015).
  14. L. D. Popov, “Modification of the Arrow–Hurwicz method of finding saddle points,” Matem. Zametki, Vol. 28, No. 5, 777–784 (1980).
  15. Yu. V. Malitsky and V. V. Semenov, “An extragradient algorithm for monotone variational inequalities,” Cybern. Syst. Analysis, Vol. 50, No. 2, 271–277 (2014).
  16. Yu. V. Malitsky and V. V. Semenov, “A hybrid method without extrapolation step for solving variational inequality problems,” J. of Global Optimization, Vol. 61, Iss. 1, 193–202 (2015).
  17. Ya. I. Vedel and V. V. Semenov, “A new two-stage proximal algorithm to solve the equilibrium problem,” Zhurn. Obch. ta Prykladnoi Matem., No. 1 (118), 15–23 (2015).
  18. L. M. Bregman, “The relaxation method of finding a common point of convex sets and its application to solve convex programming problems,” Zhurn. Vych. Mat. Mat. Fiz., Vol. 7, No. 3, 620–631 (1967).
  19. A. S. Nemirovski and D. B. Yudin, Complexity of Problems and Efficiency of Optimization Methods [in Russian], Nauka, Moscow (1979).
  20. A. Ben-Tal, T. Margalit, and A. Nemirovski, “The ordered subsets mirror descent optimization method with applications to tomography,” SIAM J. on Optimization, Vol. 12, 79–108 (2001).
  21. A. Beck and M. Teboulle, “Mirror descent and nonlinear projected subgradient methods for convex optimization,” Operations Research Letters, Vol. 31, No. 3, 167–175 (2003).
  22. Z. Allen-Zhu and L. Orecchia, “Linear coupling: An ultimate unification of gradient and mirror descent,” e-print (2014), arXiv:1407.1537.
  23. A. S. Anikin, A. V. Gasnikov, and A. Yu. Gornov, “Randomization and sparsity in huge-scale optimization problems on the example of the mirror descent method,” Tr. MFTI, Vol. 8, No. 1, 11–24 (2016).
  24. A. Nemirovski, “Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems,” SIAM J. on Optimization, Vol. 15, 229–251 (2004).
  25. A. Auslender and M. Teboulle, “Interior projection-like methods for monotone variational inequalities,” Mathematical Programming, Vol. 104, Iss. 1, 39–68 (2005).
  26. Yu. Nesterov, “Dual extrapolation and its applications to solving variational inequalities and related problems,” Mathematical Programming, Vol. 109, Iss. 2–3, 319–344 (2007).
  27. A. Juditsky, A. Nemirovski, and C. Tauvel, “Solving variational inequalities with stochastic mirror-prox algorithm,” Stochastic Systems, Vol. 1, No. 1, 17–58 (2011).
  28. M. Baes, M. Burgisser, and A. Nemirovski, “A randomized mirror-prox method for solving structured large-scale matrix saddle-point problems,” SIAM J. on Optimization, Vol. 23, 934–962 (2013).
  29. A. V. Gasnikov, A. A. Logunovskaya, and L. E. Morozova, “Relation of imitative logit dynamics in population game theory and the mirror descent method in online optimization on the example of the shortest route problem,” Tr. MFTI, Vol. 7, No. 4, 104–113 (2015).

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. Taras Shevchenko National University of Kyiv, Kyiv, Ukraine
