Solving Convex Min-Min Problems with Smoothness and Strong Convexity in One Group of Variables and Low Dimension in the Other

Published in Automation and Remote Control.

Abstract

The article deals with approaches to solving convex min-min problems that are smooth and strongly convex in only one of the two groups of variables. It is shown that the proposed approaches, based on Vaidya's method, the fast gradient method, and the accelerated gradient method with variance reduction, achieve linear convergence. Vaidya's method is used to solve the outer (low-dimensional) problem, and the fast gradient method to solve the inner (smooth and strongly convex) one. Because of its importance for applications in machine learning, the case where the objective function is the sum of a large number of functions is considered separately; there, the accelerated gradient method with variance reduction replaces the fast gradient method. The results of numerical experiments illustrating the advantages of the proposed procedures are presented for a logistic regression problem in which an a priori distribution is available for one of the two groups of variables.
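The inner-outer composition described in the abstract can be sketched on a toy problem. The sketch below is illustrative only: the quadratic objective, the one-dimensional outer variable, and the bisection used in place of Vaidya's cutting-plane method are all assumptions made for brevity, not the authors' implementation. The inner solver is the standard constant-momentum variant of Nesterov's fast gradient method for smooth, strongly convex functions, and the outer gradient is obtained via Danskin's theorem.

```python
import numpy as np

def fgm(grad, y0, L, mu, iters=200):
    """Nesterov's fast gradient method for an L-smooth, mu-strongly convex problem."""
    y = v = y0
    q = np.sqrt(mu / L)
    momentum = (1 - q) / (1 + q)
    for _ in range(iters):
        y_next = v - grad(v) / L            # gradient step from the extrapolated point
        v = y_next + momentum * (y_next - y)  # momentum extrapolation
        y = y_next
    return y

# Toy min-min: f(x, y) = 0.5*(x - 3)^2 + 0.5*||y - b*x||^2,
# with x scalar (low-dimensional outer variable) and y in R^5 (inner variable).
b = np.ones(5)

def inner_argmin(x):
    grad = lambda y: y - b * x              # gradient in y; here L = mu = 1
    return fgm(grad, np.zeros(5), L=1.0, mu=1.0)

def outer_grad(x):
    y_star = inner_argmin(x)
    # Danskin: phi(x) = min_y f(x, y) has derivative d f/d x evaluated at y*(x)
    return (x - 3.0) - b @ (y_star - b * x)

# One-dimensional bisection on the outer derivative, a stand-in for
# Vaidya's method (which handles any low outer dimension).
lo, hi = -10.0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if outer_grad(mid) > 0:
        hi = mid
    else:
        lo = mid
x_star = 0.5 * (lo + hi)
```

Here `phi(x) = 0.5*(x - 3)^2`, so the outer minimizer is `x = 3`; each outer-derivative evaluation requires solving the inner problem to sufficient accuracy, which is exactly the inexact-oracle structure the article's complexity analysis addresses.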

Fig. 1.


Funding

This work was supported by the Ministry of Science and Higher Education of the Russian Federation, state assignment no. 075-00337-20-03, project no. 0714-2020-0005. The work of E.L. Gladin was also supported by an A.M. Raigorodskii scholarship in the field of numerical optimization methods. The work of A.V. Gasnikov was also partly supported by the Russian Foundation for Basic Research, project no. 18-29-03071 mk.

Author information

Corresponding authors

Correspondence to E. Gladin, M. Alkousa or A. Gasnikov.

Additional information

Translated by V. Potapchouck


About this article

Cite this article

Gladin, E., Alkousa, M. & Gasnikov, A. Solving Convex Min-Min Problems with Smoothness and Strong Convexity in One Group of Variables and Low Dimension in the Other. Autom Remote Control 82, 1679–1691 (2021). https://doi.org/10.1134/S0005117921100064
