
Computational Mathematics and Mathematical Physics, Volume 58, Issue 11, pp. 1728–1736

Primal–Dual Mirror Descent Method for Constrained Stochastic Optimization Problems

  • A. S. Bayandina
  • A. V. Gasnikov
  • E. V. Gasnikova
  • S. V. Matsievskii

Abstract

The mirror descent method, developed for convex stochastic optimization problems, is extended to constrained convex stochastic optimization problems, i.e., problems subject to functional inequality constraints. The proposed method performs an ordinary mirror descent step if the constraints are violated only insignificantly, and a mirror descent step with respect to the violated constraint if the violation is significant. With an appropriate choice of the method's parameters, a bound on the convergence rate that is optimal for this class of problems is obtained, and sharp bounds on the probability of large deviations are proved. For the deterministic case, the primal–dual property of the proposed method is established: from the sequence of points generated by the method, a solution of the dual problem can be reconstructed to the same accuracy with which the primal problem is solved. The efficiency of the method as applied to problems with a huge number of constraints is discussed. Notably, the bound on the duality gap obtained in this paper does not involve the unknown size of the solution to the dual problem.
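The switching scheme described in the abstract can be illustrated by a minimal sketch. Here the prox-function is taken to be Euclidean (so each mirror step reduces to a plain subgradient step), a single constraint g(x) ≤ 0 is assumed, and the function name, the fixed step size h, and the violation threshold eps are illustrative choices, not the paper's exact parameters or step-size policy.

```python
import numpy as np

def switching_mirror_descent(grad_f, g, grad_g, x0, h, eps, n_steps):
    """Sketch of constraint-switching mirror descent with the Euclidean
    prox-function. grad_f / grad_g return (sub)gradients of the objective
    f and the constraint g; eps is the tolerated constraint violation."""
    x = x0.copy()
    productive = []  # points produced by objective ("productive") steps
    for _ in range(n_steps):
        if g(x) <= eps:
            # constraint violated at most insignificantly:
            # ordinary mirror descent step on the objective f
            x = x - h * grad_f(x)
            productive.append(x.copy())
        else:
            # significant violation: mirror descent step on the constraint g
            x = x - h * grad_g(x)
    # return the average of the productive points, as is typical
    # for mirror descent convergence guarantees
    return np.mean(productive, axis=0) if productive else x
```

For example, minimizing f(x) = (x - 2)^2 subject to g(x) = x - 1 ≤ 0 from x0 = 0 drives the iterates toward the constrained optimum x* = 1: objective steps pull toward 2 until the constraint is significantly violated, after which constraint steps push back below the boundary.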

Keywords: mirror descent method, convex stochastic optimization, constrained optimization, probability of large deviations, randomization


ACKNOWLEDGMENTS

We are grateful to Yu.E. Nesterov and A.S. Nemirovski for discussions of parts of this paper. We are also grateful to the reviewer for valuable remarks.

The work by A.V. Gasnikov was performed at the Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, and supported by the Russian Science Foundation (project no. 14-50-00150). The work by E.V. Gasnikova was supported by the Russian Foundation for Basic Research, project no. 15-31-20571-mol_a_ved.

REFERENCES

  1. A. S. Nemirovski and D. B. Yudin, Problem Complexity and Method Efficiency in Optimization, Interscience Series in Discrete Mathematics, Vol. XV (Nauka, Moscow, 1979; Wiley, 1983).
  2. A. S. Anikin, A. V. Gasnikov, and A. Yu. Gornov, "Randomization and sparseness in huge-scale optimization problems using the mirror descent method as an example," Trudy Mosk. Fiz.-Tekh. Inst. 8 (1), 11–24 (2016). arXiv:1602.00594
  3. K. Kim, Yu. Nesterov, V. Skokov, and B. Cherkasskii, "Efficient differentiation algorithms and extreme problems," Ekon. Mat. Metody 20, 309–318 (1984).
  4. Yu. Nesterov, "Lexicographic differentiation of nonsmooth functions," Math. Program. 104, 669–700 (2005).
  5. A. V. Gasnikov, P. E. Dvurechensky, Yu. V. Dorn, and Yu. V. Maksimov, "Numerical methods for the problem of traffic flow equilibrium in the Beckmann and the stable dynamics models," Mat. Model. 28 (10), 40–64 (2016). arXiv:1506.00293
  6. A. Juditsky, G. Lan, A. Nemirovski, and A. Shapiro, "Stochastic approximation approach to stochastic programming," SIAM J. Optim. 19, 1574–1609 (2009).
  7. S. Boucheron, G. Lugosi, and P. Massart, Concentration Inequalities: A Nonasymptotic Theory of Independence (Oxford Univ. Press, 2013).
  8. Yu. Nesterov and S. Shpirko, "Primal-dual subgradient method for huge-scale linear conic problem," SIAM J. Optim. 24, 1444–1457 (2014). http://www.optimization-online.org/DB_FILE/2012/08/3590.pdf
  9. Yu. Nesterov, "New primal-dual subgradient methods for convex optimization problems with functional constraints," Int. Workshop "Optimization and Statistical Learning," Les Houches, France, 2015. http://lear.inrialpes.fr/workshop/osl2015/program.html
  10. A. S. Anikin, A. V. Gasnikov, P. E. Dvurechensky, A. I. Tyurin, and A. V. Chernov, "Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints," Comput. Math. Math. Phys. 57, 1262–1275 (2017). arXiv:1602.01686
  11. A. Nemirovski, S. Onn, and U. G. Rothblum, "Accuracy certificates for computational problems with convex structure," Math. Oper. Res. 35 (1), 52–78 (2010).
  12. B. Cox, A. Juditsky, and A. Nemirovski, "Decomposition techniques for bilinear saddle point problems and variational inequalities with affine monotone operators on domains given by linear minimization oracles," 2015. arXiv:1506.02444
  13. A. Juditsky and A. Nemirovski, "First order methods for nonsmooth convex large-scale optimization, I, II," in Optimization for Machine Learning, Ed. by S. Sra, S. Nowozin, and S. Wright (MIT Press, 2012).
  14. A. V. Gasnikov, E. A. Krymova, A. A. Lagunovskaya, I. N. Usmanova, and F. A. Fedorenko, "Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case," Autom. Remote Control 78, 224–234 (2017). arXiv:1509.01679
  15. J. C. Duchi, Introductory Lectures on Stochastic Optimization, IAS/Park City Mathematics Series (2016), pp. 1–84. http://stanford.edu/~jduchi/PCMIConvex/Duchi16.pdf
  16. Yu. Nesterov, "Subgradient methods for convex functions with nonstandard growth properties," 2016. http://www.mathnet.ru:8080/PresentFiles/16179/growthbm_nesterov.pdf
  17. J. C. Duchi, S. Shalev-Shwartz, Y. Singer, and A. Tewari, "Composite objective mirror descent," Proc. of COLT, 2010, pp. 14–26.
  18. A. Juditsky and Yu. Nesterov, "Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization," Stoch. Syst. 4 (1), 44–80 (2014).
  19. A. S. Anikin, A. V. Gasnikov, A. Yu. Gornov, D. I. Kamzolov, Yu. V. Maksimov, and Yu. E. Nesterov, "Efficient numerical solution of the PageRank problem for doubly sparse matrices," Trudy Mosk. Fiz.-Tekh. Inst. 7 (4), 74–94 (2015). arXiv:1508.07607
  20. https://github.com/anastasiabayandina/Mirror
  21. A. Beck, A. Ben-Tal, N. Guttmann-Beck, and L. Tetruashvili, "The CoMirror algorithm for solving nonsmooth constrained convex problems," Oper. Res. Lett. 38, 493–498 (2011).
  22. A. Juditsky, A. Nemirovski, and C. Tauvel, "Solving variational inequalities with stochastic mirror-prox algorithm," Stoch. Syst. 1 (1), 17–58 (2011).
  23. G. Lan and Z. Zhou, "Algorithms for stochastic optimization with expectation constraints," 2016. http://pwp.gatech.edu/guanghui-lan/wp-content/uploads/sites/330/2016/08/SPCS8-19-16.pdf

Copyright information

© Pleiades Publishing, Ltd. 2018

Authors and Affiliations

  1. Department of Control and Applied Mathematics, Moscow Institute of Physics and Technology, Dolgoprudnyi, Russia
  2. Chair of Mathematical Foundations of Control, Moscow Institute of Physics and Technology, Dolgoprudnyi, Russia
  3. Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, Russia
  4. Laboratory of Structural Analysis Methods in Predictive Simulation, Moscow Institute of Physics and Technology, Dolgoprudnyi, Russia
  5. Kant Baltic Federal University, Kaliningrad, Russia
  6. Adygeya State University, Maykop, Russia
