
Solving Smooth Min-Min and Min-Max Problems by Mixed Oracle Algorithms

Part of the Communications in Computer and Information Science book series (CCIS, volume 1476)


In this paper, we consider two types of problems that have some similarity in their structure, namely, min-min problems and min-max saddle-point problems. Our approach is based on considering the outer minimization problem as a minimization problem with an inexact oracle. This inexact oracle is calculated via an inexact solution of the inner problem, which is either a minimization or a maximization problem. Our main assumption is that the available oracle is mixed: it is only possible to evaluate the gradient w.r.t. the outer block of variables, which corresponds to the outer minimization problem, whereas for the inner problem only a zeroth-order oracle is available. To solve the inner problem, we use an accelerated gradient-free method with a zeroth-order oracle. To solve the outer problem, we use either an inexact variant of Vaidya’s cutting-plane method or a variant of the accelerated gradient method. As a result, we propose a framework that leads to non-asymptotic complexity bounds for both min-min and min-max problems. Moreover, we estimate separately the number of first- and zeroth-order oracle calls that are sufficient to reach any desired accuracy.
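The mixed-oracle idea for a min-min problem can be sketched as follows: the inner minimization over y is solved approximately using only function evaluations (here, plain gradient descent on finite-difference gradient estimates stands in for the paper's accelerated gradient-free method), and the resulting inexact inner solution is plugged into the analytic gradient w.r.t. the outer variable x. The function names, step sizes, and the quadratic test function are illustrative assumptions, not the paper's actual algorithm or code.

```python
import numpy as np

def inner_argmin_zeroth_order(F, x, y0, h=1e-5, lr=0.1, iters=500):
    """Approximately minimize y -> F(x, y) using only function values
    (central finite differences stand in for the accelerated
    gradient-free inner method)."""
    y = y0.copy()
    for _ in range(iters):
        g = np.zeros_like(y)
        for j in range(len(y)):
            e = np.zeros_like(y)
            e[j] = h
            g[j] = (F(x, y + e) - F(x, y - e)) / (2 * h)
        y -= lr * g
    return y

def outer_gradient_step(F, grad_x, x, y0, lr=0.1):
    """One step of the outer minimization with an inexact oracle:
    the gradient w.r.t. x is evaluated at an inexact inner solution."""
    y_tilde = inner_argmin_zeroth_order(F, x, y0)
    return x - lr * grad_x(x, y_tilde), y_tilde

# Toy example (assumption): F(x, y) = ||x||^2 + ||x - y||^2,
# strongly convex in both blocks, with minimum at x = y = 0.
F = lambda x, y: x @ x + (x - y) @ (x - y)
grad_x = lambda x, y: 2 * x + 2 * (x - y)

x, y = np.ones(3), np.zeros(3)
for _ in range(50):
    x, y = outer_gradient_step(F, grad_x, x, y)
print(np.linalg.norm(x))  # close to 0
```

Note that only the outer variable uses first-order information; every inner step costs a fixed number of function evaluations, which is why the paper counts first- and zeroth-order oracle calls separately.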


  • First-order methods
  • Zeroth-order methods
  • Cutting-plane methods
  • Saddle-point problems

The research of A. Gasnikov and P. Dvurechensky was supported by the Russian Science Foundation (project No. 21-71-30005). The research of E. Gladin, A. Sadiev and A. Beznosikov was partially supported by the Andrei Raigorodskii scholarship.



  1.

    Here and below, instead of ARDDsc we can use accelerated coordinate descent methods [9, 20] with partial derivatives replaced by finite differences. In this case we lose the opportunity to play on the choice of the norm (which could save a \(\sqrt{n_y}\)-factor in the gradient-free oracle complexity estimate [6]), but we gain the possibility to replace the worst-case \(L_{yy}\) with the average one (which could be \(n_y\)-times smaller [20]). In the end, this could also save a \(\sqrt{n_y}\)-factor in the gradient-free oracle complexity estimate [20].
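The substitution mentioned here, a partial derivative replaced by a finite difference, can be sketched as follows; the test function and the step size h are illustrative assumptions:

```python
import numpy as np

def fd_partial(f, y, j, h=1e-6):
    """Central finite-difference estimate of the j-th partial derivative,
    the zeroth-order substitute for a coordinate-descent gradient."""
    e = np.zeros_like(y)
    e[j] = h
    return (f(y + e) - f(y - e)) / (2 * h)

# Illustrative smooth function: f(y) = sum_i (i+1) * y_i^2,
# so the exact partial derivative is df/dy_j = 2*(j+1)*y_j.
f = lambda y: np.sum(np.arange(1, len(y) + 1) * y**2)
y = np.array([1.0, 2.0, 3.0])
approx = fd_partial(f, y, 1)  # exact value is 2*2*2.0 = 8.0
print(approx)
```

For a smooth function, the central difference approximates the partial derivative up to O(h^2) error, so each coordinate update costs two function evaluations instead of one derivative evaluation.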

  2.

    \(\widetilde{O} (\cdot )= O (\cdot )\) up to a small power of a logarithmic factor.


  1. Alkousa, M., Dvinskikh, D., Stonyakin, F., Gasnikov, A., Kovalev, D.: Accelerated methods for composite non-bilinear saddle point problem (2020).

  2. Beznosikov, A., Sadiev, A., Gasnikov, A.: Gradient-free methods for saddle-point problem. arXiv preprint arXiv:2005.05913 (2020).

  3. Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. Journal of Mathematical Imaging and Vision 40(1), 120–145 (2011)


  4. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to derivative-free optimization. Soc. Ind. Appl. Math. (2009).


  5. Devolder, O., Glineur, F., Nesterov, Y.: First-order methods with inexact oracle: the strongly convex case (2013).

  6. Dvurechensky, P., Gorbunov, E., Gasnikov, A.: An accelerated directional derivative method for smooth stochastic convex optimization. Eur. J. Oper. Res. 290(2), 601–621 (2021).


  7. Fu, M.C. (ed.): Handbook of Simulation Optimization. ISORMS, vol. 216. Springer, New York (2015).


  8. Gasnikov, A., et al.: Universal method with inexact oracle and its applications for searching equillibriums in multistage transport problems. arXiv preprint arXiv:1506.00292 (2015)

  9. Gasnikov, A., Dvurechensky, P., Usmanova, I.: On accelerated randomized methods. Proceedings of Moscow Institute of Physics and Technology 8, pp. 67–100 (2016). (in Russian)


  10. Goodfellow, I.J., et al.: Generative adversarial networks (2014)


  11. Goodfellow, I.J., Shlens, J., Szegedy, C.: Explaining and harnessing adversarial examples (2014)


  12. Korpelevich, G.M.: The extragradient method for finding saddle points and other problems. Ekonomika i Matematicheskie Metody 12, 747–756 (1976)


  13. Lin, H., Mairal, J., Harchaoui, Z.: A universal catalyst for first-order optimization. In: Cortes, C., Lawrence, N., Lee, D., Sugiyama, M., Garnett, R. (eds.) Advances in Neural Information Processing Systems, vol. 28. Curran Associates, Inc. (2015).

  14. Liu, S., et al.: Min-max optimization without gradients: convergence and applications to adversarial ML (2019).

  15. Madry, A., Makelov, A., Schmidt, L., Tsipras, D., Vladu, A.: Towards deep learning models resistant to adversarial attacks. In: 6th International Conference on Learning Representations, ICLR 2018, Vancouver, BC, Canada, 30 April–3 May 2018, Conference Track Proceedings (2018)


  16. Narodytska, N., Kasiviswanathan, S.P.: Simple black-box adversarial attacks on deep neural networks. In: CVPR Workshops. pp. 1310–1318. IEEE Computer Society (2017).

  17. Nedić, A., Ozdaglar, A.: Subgradient methods for saddle-point problems. J. Optim. Theory Appl. 142(1), 205–228 (2009)


  18. Nemirovski, A.: Prox-method with rate of convergence O(1/t) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. Optim. 15, 229–251 (2004).


  19. Nesterov, Y.: Lectures on Convex Optimization. SOIA, vol. 137. Springer, Cham (2018).


  20. Nesterov, Y., Stich, S.U.: Efficiency of the accelerated coordinate descent method on structured optimization problems. SIAM J. Optim. 27(1), 110–123 (2017)


  21. Pinto, L., Davidson, J., Sukthankar, R., Gupta, A.: Robust adversarial reinforcement learning. Proceedings of Machine Learning Research, 06–11 August 2017, vol. 70, pp. 2817–2826. PMLR, International Convention Centre, Sydney (2017).

  22. Polyak, B.T.: Introduction to Optimization. Publications Division, Inc., New York (1987)


  23. Sadiev, A., Beznosikov, A., Dvurechensky, P., Gasnikov, A.: Zeroth-order algorithms for smooth saddle-point problems. arXiv:2009.09908 (2020)

  24. Shashaani, S., Hashemi, F.S., Pasupathy, R.: ASTRO-DF: a class of adaptive sampling trust-region algorithms for derivative-free stochastic optimization. SIAM J. Optim. 28(4), 3145–3176 (2018).


  25. Stonyakin, F., et al.: Inexact relative smoothness and strong convexity for optimization and variational inequalities by inexact model (2020).

  26. Tramèr, F., Kurakin, A., Papernot, N., Goodfellow, I., Boneh, D., McDaniel, P.: Ensemble adversarial training: attacks and defenses (2017).

  27. Vaidya, P.M.: A new algorithm for minimizing convex functions over convex sets. In: 30th Annual Symposium on Foundations of Computer Science, pp. 338–343. IEEE Computer Society (1989)


  28. Vaidya, P.M.: A new algorithm for minimizing convex functions over convex sets. Math. Program. 73(3), 291–341 (1996)


  29. Wang, Z., Balasubramanian, K., Ma, S., Razaviyayn, M.: Zeroth-order algorithms for nonconvex minimax problems with improved complexities (2020)



Corresponding author

Correspondence to Abdurakhmon Sadiev.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Gladin, E., Sadiev, A., Gasnikov, A., Dvurechensky, P., Beznosikov, A., Alkousa, M. (2021). Solving Smooth Min-Min and Min-Max Problems by Mixed Oracle Algorithms. In: Strekalovsky, A., Kochetov, Y., Gruzdeva, T., Orlov, A. (eds) Mathematical Optimization Theory and Operations Research: Recent Trends. MOTOR 2021. Communications in Computer and Information Science, vol 1476. Springer, Cham.


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86432-3

  • Online ISBN: 978-3-030-86433-0
