Abstract
In this work, we consider strongly convex online optimization problems with convex inequality constraints. For these problems, we propose a scheme that switches between productive and non-productive steps, and we prove its convergence rate for the class of relatively Lipschitz-continuous and relatively strongly convex minimization problems. Moreover, we study extensions of the Mirror Descent algorithm that eliminate the need for a priori knowledge of a lower bound on the (relative) strong convexity parameters of the observed functions. Numerical experiments demonstrate the effectiveness of one of the proposed algorithms in comparison with another adaptive algorithm for convex online optimization problems.
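To illustrate the switching idea described above, here is a minimal sketch of an online scheme with productive and non-productive steps. It uses the Euclidean prox and the classical 1/(mu*k) step size for mu-strongly convex losses; the actual algorithm in the paper works with Bregman divergences under relative Lipschitz continuity and relative strong convexity, so all function names and the simplified step-size rule here are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def switching_subgradient_scheme(subgrad_f, g, subgrad_g, x0, mu, eps, n_steps):
    """Online switching scheme (Euclidean prox, for illustration only).

    At step k the learner receives the loss f_k. If the constraint is
    nearly satisfied (g(x_k) <= eps), it takes a "productive" step along
    a subgradient of f_k; otherwise it takes a "non-productive" step
    along a subgradient of g to reduce the constraint violation.
    """
    x = np.asarray(x0, dtype=float)
    n_productive = 0
    for k in range(1, n_steps + 1):
        h = 1.0 / (mu * k)            # classical step size for mu-strongly convex losses
        if g(x) <= eps:               # productive step: make progress on f_k
            d = subgrad_f(k, x)
            n_productive += 1
        else:                         # non-productive step: restore near-feasibility
            d = subgrad_g(x)
        x = x - h * d
    return x, n_productive
```

For example, with quadratic losses f_k(x) = ||x - c||^2 (so mu = 2) and the linear constraint g(x) = x_1 + x_2 - 1 <= 0, the iterates first drive down the constraint violation via non-productive steps and then converge to the minimizer c once it is (nearly) feasible.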
The research was supported by the Russian Science Foundation (project No. 21-71-30005), https://rscf.ru/en/project/21-71-30005/.
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Savchuk, O., Stonyakin, F., Alkousa, M., Zabirova, R., Titov, A., Gasnikov, A. (2023). Online Optimization Problems with Functional Constraints Under Relative Lipschitz Continuity and Relative Strong Convexity Conditions. In: Khachay, M., Kochetov, Y., Eremeev, A., Khamisov, O., Mazalov, V., Pardalos, P. (eds) Mathematical Optimization Theory and Operations Research: Recent Trends. MOTOR 2023. Communications in Computer and Information Science, vol 1881. Springer, Cham. https://doi.org/10.1007/978-3-031-43257-6_3
Print ISBN: 978-3-031-43256-9
Online ISBN: 978-3-031-43257-6