
Online Optimization Problems with Functional Constraints Under Relative Lipschitz Continuity and Relative Strong Convexity Conditions

  • Conference paper
  • First Online:
Mathematical Optimization Theory and Operations Research: Recent Trends (MOTOR 2023)

Abstract

In this work, we consider the problem of strongly convex online optimization with convex inequality constraints. For these problems, we propose a scheme that switches between productive and non-productive steps. The convergence rate of the proposed scheme is proved for the class of relatively Lipschitz-continuous and relatively strongly convex minimization problems. Moreover, we study extensions of Mirror Descent algorithms that eliminate the need for a priori knowledge of a lower bound on the (relative) strong convexity parameters of the observed functions. Numerical experiments demonstrate the effectiveness of one of the proposed algorithms in comparison with another adaptive algorithm for convex online optimization problems.
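The switching idea described in the abstract can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm (whose step sizes and prox-structure come from the relative Lipschitz continuity and relative strong convexity assumptions); it assumes a Euclidean prox-function, so the mirror step reduces to a projected subgradient step onto a ball, and all names (switching_mirror_descent, loss_grads, g_grad) are hypothetical. At each round the constraint is checked: if it is nearly satisfied, a productive step is taken along a subgradient of the current loss; otherwise, a non-productive step is taken along a subgradient of the constraint.

```python
import numpy as np

def switching_mirror_descent(loss_grads, g, g_grad, x0, radius, eps, steps):
    """Illustrative sketch of online subgradient descent with switching
    between productive and non-productive steps (Euclidean setup, so the
    mirror step is a projected subgradient step onto a ball of the given
    radius). A hypothetical simplification, not the paper's method.

    loss_grads(t, x) -- subgradient of the t-th observed loss at x
    g(x), g_grad(x)  -- constraint value and a subgradient oracle for it
    """
    x = np.array(x0, dtype=float)
    productive = []                       # rounds counted toward the regret
    for t in range(steps):
        if g(x) <= eps:                   # constraint (almost) satisfied: productive step
            direction = loss_grads(t, x)  # move along the loss subgradient
            productive.append(t)
        else:                             # constraint violated: non-productive step
            direction = g_grad(x)         # move along the constraint subgradient
        h = 1.0 / (t + 1)                 # 1/t-type step size, typical under strong convexity
        x = x - h * direction
        nrm = np.linalg.norm(x)           # Euclidean projection back onto the feasible ball
        if nrm > radius:
            x = x * (radius / nrm)
    return x, productive
```

Only the iterates produced at productive steps enter the regret bound; in the general relative setting, the Euclidean step and projection above are replaced by a Bregman (mirror) step with respect to a prox-function compatible with the relative Lipschitz and strong convexity conditions.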

The research was supported by the Russian Science Foundation (project No. 21-71-30005), https://rscf.ru/en/project/21-71-30005/.

Author information

Corresponding author

Correspondence to Fedor Stonyakin.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Savchuk, O., Stonyakin, F., Alkousa, M., Zabirova, R., Titov, A., Gasnikov, A. (2023). Online Optimization Problems with Functional Constraints Under Relative Lipschitz Continuity and Relative Strong Convexity Conditions. In: Khachay, M., Kochetov, Y., Eremeev, A., Khamisov, O., Mazalov, V., Pardalos, P. (eds) Mathematical Optimization Theory and Operations Research: Recent Trends. MOTOR 2023. Communications in Computer and Information Science, vol 1881. Springer, Cham. https://doi.org/10.1007/978-3-031-43257-6_3

  • DOI: https://doi.org/10.1007/978-3-031-43257-6_3

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43256-9

  • Online ISBN: 978-3-031-43257-6

  • eBook Packages: Computer Science, Computer Science (R0)
