On Some Methods for Strongly Convex Optimization Problems with One Functional Constraint

  • Fedor S. Stonyakin
  • Mohammad S. Alkousa
  • Alexander A. Titov
  • Victoria V. Piskunova
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11548)


Abstract

We consider the classical optimization problem of minimizing a strongly convex, non-smooth, Lipschitz-continuous function subject to one Lipschitz-continuous functional constraint. Developing the approach of [10], we propose two methods with adaptive stopping rules for this problem. The key idea of both methods is to apply the dichotomy method to an auxiliary one-dimensional problem at each iteration. We obtain theoretical estimates for the proposed methods; in particular, for smooth functions we prove a linear rate of convergence, and we also derive estimates for the non-smooth case. Finally, we report numerical experiments that illustrate the advantages of the proposed methods and compare them with an adaptive optimal method for non-smooth strongly convex functions.
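The dichotomy (bisection) method mentioned in the abstract halves a bracket around the minimizer of a one-dimensional problem at every step. The following is a minimal, illustrative sketch of dichotomy for a generic strongly convex function of one variable; it is not the authors' actual scheme, and the function, interval, and tolerances are assumptions chosen for the example.

```python
def dichotomy_min(f, a, b, eps=1e-8):
    """Minimize a unimodal (e.g. strongly convex) function f on [a, b]
    by the dichotomy method: compare f at two probe points placed
    delta apart around the midpoint and keep the half-interval that
    must contain the minimizer."""
    delta = eps / 4.0  # probe offset, smaller than the target accuracy
    while b - a > eps:
        mid = (a + b) / 2.0
        x1, x2 = mid - delta, mid + delta
        if f(x1) < f(x2):
            b = x2  # minimizer lies in [a, x2]
        else:
            a = x1  # minimizer lies in [x1, b]
    return (a + b) / 2.0

# Hypothetical example: minimize the strongly convex quadratic
# (x - 3)^2 + 1 on the interval [0, 10].
x_star = dichotomy_min(lambda x: (x - 3.0) ** 2 + 1.0, 0.0, 10.0)
```

Each iteration roughly halves the bracket, so reaching accuracy eps on an interval of length `b - a` takes about `log2((b - a) / eps)` iterations, which is what makes the auxiliary one-dimensional subproblem cheap to solve at every outer step.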


Keywords: Optimization with functional constraint · Adaptive method · Lipschitz-continuous function · Lipschitz-continuous gradient · Strongly convex objective function · Dichotomy method


References

  1. Aravkin, A.Y., Burke, J.V., Drusvyatskiy, D.: Convex Analysis and Nonsmooth Optimization (2017)
  2. Bayandina, A., Dvurechensky, P., Gasnikov, A., Stonyakin, F., Titov, A.: Mirror descent and convex optimization problems with non-smooth inequality constraints. In: Giselsson, P., Rantzer, A. (eds.) Large-Scale and Distributed Optimization. LNM, vol. 2227, pp. 181–213. Springer, Cham (2018)
  3. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)
  4. Ben-Tal, A., Nemirovski, A.: Robust truss topology design via semidefinite programming. SIAM J. Optim. 7(4), 991–1016 (1997)
  5. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, New York (2004)
  6. Bubeck, S.: Convex optimization: algorithms and complexity. Found. Trends Mach. Learn. 8(3–4), 231–357 (2015)
  7. Danskin, J.M.: The theory of max-min, with applications. SIAM J. Appl. Math. 14(4) (1966)
  8. Demyanov, V.F., Malozemov, V.N.: Introduction to Minimax. Nauka, Moscow (1972). (in Russian)
  9. Gasnikov, A.V.: Modern Numerical Optimization Methods. The Method of Universal Gradient Descent (2018). (in Russian)
  10. Ivanova, A., Gasnikov, A., Nurminski, E., Vorontsova, E.: Walrasian equilibrium and centralized distributed optimization from the point of view of modern convex optimization methods on the example of resource allocation problem (2019). (in Russian)
  11. Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103(1), 127–152 (2005)
  12. Nesterov, Y.: Gradient methods for minimizing composite functions. Math. Program. 140(1), 125–161 (2013)
  13. Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers, Massachusetts (2004)
  14. Shpirko, S., Nesterov, Y.: Primal-dual subgradient methods for huge-scale linear conic problem. SIAM J. Optim. 24(3), 1444–1457 (2014)
  15. Vasilyev, F.: Optimization Methods. Fizmatlit, Moscow (2002). (in Russian)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. V.I. Vernadsky Crimean Federal University, Simferopol, Russia
  2. Moscow Institute of Physics and Technology, Moscow, Russia
