Chasing Convex Bodies Optimally

Geometric Aspects of Functional Analysis

Part of the book series: Lecture Notes in Mathematics (LNM, volume 2327)

Abstract

In the chasing convex bodies problem, an online player receives a request sequence of N convex sets \(K_1,\dots,K_N\) contained in a normed space X of dimension d. The player starts at \(x_0=0\in X\), and at time n observes the set \(K_n\) and then moves to a new point \(x_n\in K_n\), paying a cost \(||x_n-x_{n-1}||\). The player aims to ensure that the total cost exceeds the minimum possible total cost by at most a bounded factor \(\alpha_d\) independent of N, despite \(x_n\) being chosen without knowledge of the future sets \(K_{n+1},\dots,K_N\). The best possible \(\alpha_d\) is called the competitive ratio. Finiteness of the competitive ratio for convex body chasing was proved for \(d=2\) by Friedman and Linial (Discrete Comput. Geom. 9(3):293–321, 1993) and conjectured for all d. Bubeck et al. (Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing, pp. 861–868, 2019) recently resolved this conjecture, proving an exponential \(2^{O(d)}\) upper bound on the competitive ratio.
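As a concrete illustration of this cost model, the sketch below simulates the online game in Euclidean space with ball-shaped requests, using the naive greedy strategy of moving to the nearest point of each requested set. This is only a sketch under our own assumptions (the function names and the restriction to ball requests are ours); greedy is not the algorithm of this chapter and is not competitive in general.

```python
import numpy as np

def project_to_ball(x, center, radius):
    """Euclidean projection of x onto the ball B(center, radius)."""
    diff = x - center
    dist = np.linalg.norm(diff)
    if dist <= radius:
        return x.copy()
    return center + diff * (radius / dist)

def greedy_chase(requests, d):
    """Play the chasing game with the naive greedy strategy:
    move to the nearest point of each requested set.
    `requests` is a list of (center, radius) pairs describing Euclidean balls."""
    x = np.zeros(d)              # start at the origin, x_0 = 0
    total_cost = 0.0
    for center, radius in requests:
        x_next = project_to_ball(x, center, radius)   # a point x_n in K_n
        total_cost += np.linalg.norm(x_next - x)      # pay ||x_n - x_{n-1}||
        x = x_next
    return total_cost

# Toy run: five random ball requests in dimension d = 3.
rng = np.random.default_rng(0)
d = 3
requests = [(rng.normal(size=d), 1.0) for _ in range(5)]
print(greedy_chase(requests, d))
```

The offline optimum against which the competitive ratio is measured would instead be computed with knowledge of all N sets at once.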

We give an improved algorithm achieving competitive ratio d in any normed space, which is exactly tight for \(\ell^{\infty}\). In Euclidean space, our algorithm also achieves competitive ratio \(O(\sqrt{d\log N})\), nearly matching a \(\sqrt{d}\) lower bound when N is subexponential in d. Our approach extends that of Bubeck et al. (Proceedings of the Fourteenth Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 1496–1508, SIAM, 2020) for nested convex bodies, which is based on the classical Steiner point of a convex body. We define the functional Steiner point of a convex function and apply it to the associated work function.
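For intuition about the Steiner point, recall the classical spherical-average representation \(s(K)=d\,\mathbb E_u\big[h_K(u)\,u\big]\), where \(u\) is uniform on the unit sphere and \(h_K(u)=\max_{x\in K}\langle u,x\rangle\) is the support function. The sketch below estimates it by Monte Carlo for a polytope; it illustrates only the classical object [34], not the functional Steiner point of the work function defined in this chapter, and the code and names are our own.

```python
import numpy as np

def steiner_point_mc(vertices, num_samples=200_000, seed=0):
    """Monte Carlo estimate of the classical Steiner point of the polytope
    K = conv(vertices), using s(K) = d * E_u[ h_K(u) * u ] with u uniform
    on the unit sphere and h_K(u) = max_{x in K} <u, x>."""
    rng = np.random.default_rng(seed)
    d = vertices.shape[1]
    u = rng.normal(size=(num_samples, d))
    u /= np.linalg.norm(u, axis=1, keepdims=True)    # uniform directions on the sphere
    h = np.max(u @ vertices.T, axis=1)               # support function values h_K(u)
    return d * (h[:, None] * u).mean(axis=0)

# Sanity check: the Steiner point of a cube is its center.
cube = np.array([[x, y, z] for x in (0.0, 1.0) for y in (0.0, 1.0) for z in (0.0, 1.0)])
print(steiner_point_mc(cube))    # approximately [0.5, 0.5, 0.5]
```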

References

  1. N. Alon, B. Awerbuch, Y. Azar, The online set cover problem, in Proceedings of the Thirty-Fifth Annual ACM Symposium on Theory of Computing (2003), pp. 100–105

  2. A. Antoniadis, N. Barcelo, M. Nugent, K. Pruhs, K. Schewior, M. Scquizzato, Chasing convex bodies and functions, in LATIN 2016: Theoretical Informatics (Springer, Berlin, 2016), pp. 68–81

  3. C.J. Argue, S. Bubeck, M.B. Cohen, A. Gupta, Y.T. Lee, A nearly-linear bound for chasing nested convex bodies, in Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms (SIAM, 2019), pp. 117–122

  4. C.J. Argue, A. Gupta, G. Guruganesh, Z. Tang, Chasing convex bodies with linear competitive ratio. J. ACM 68(5), 1–10 (2021)

  5. K. Ball, An elementary introduction to modern convex geometry. Flavors Geom. 31, 1–58 (1997)

  6. N. Bansal, N. Buchbinder, A. Madry, J. Naor, A polylogarithmic-competitive algorithm for the k-server problem. J. ACM 62(5), 1–49 (2015)

  7. N. Bansal, A. Gupta, R. Krishnaswamy, K. Pruhs, K. Schewior, C. Stein, A 2-competitive algorithm for online convex optimization with switching costs, in Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2015). Schloss Dagstuhl-Leibniz-Zentrum fuer Informatik (2015)

  8. N. Bansal, M. Böhm, M. Eliáš, G. Koumoutsos, S.W. Umboh, Nested convex bodies are chaseable, in Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms (SIAM, 2018), pp. 1253–1260

  9. Y. Bartal, N. Linial, M. Mendel, A. Naor, On metric Ramsey-type phenomena. Ann. Math. 162, 643–710 (2005)

  10. Y. Bartal, B. Bollobás, M. Mendel, Ramsey-type theorems for metric spaces with applications to online problems. J. Comput. Syst. Sci. 72(5), 890–921 (2006)

  11. A. Blum, C. Burch, On-line learning and the metrical task system problem. Mach. Learn. 39(1), 35–58 (2000)

  12. A. Borodin, N. Linial, M.E. Saks, An optimal on-line algorithm for metrical task system. J. ACM 39(4), 745–763 (1992)

  13. S. Bubeck, M.B. Cohen, J.R. Lee, Y.T. Lee, Metrical task systems on trees via mirror descent and unfair gluing, in Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms (SIAM, 2019), pp. 89–97

  14. S. Bubeck, Y.T. Lee, Y. Li, M. Sellke, Competitively chasing convex bodies, in Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing (2019), pp. 861–868

  15. S. Bubeck, B. Klartag, Y.T. Lee, Y. Li, M. Sellke, Chasing nested convex bodies nearly optimally, in Proceedings of the Fourteenth Annual ACM-SIAM Symposium on Discrete Algorithms (SIAM, 2020), pp. 1496–1508

  16. N. Chen, G. Goel, A. Wierman, Smoothed online convex optimization in high dimensions via online balanced descent, in Conference On Learning Theory (PMLR, 2018), pp. 1574–1594

  17. C. Fefferman, P. Shvartsman, Sharp finiteness principles for Lipschitz selections. Geom. Funct. Anal. 28(6), 1641–1705 (2018)

  18. A. Fiat, M. Mendel, Better algorithms for unfair metrical task systems and applications. SIAM J. Comput. 32(6), 1403–1422 (2003)

  19. J. Friedman, N. Linial, On convex body chasing. Discrete Comput. Geom. 9(3), 293–321 (1993)

  20. G. Goel, Y. Lin, H. Sun, A. Wierman, Beyond online balanced descent: an optimal algorithm for smoothed online optimization. Adv. Neural Inf. Process. Syst. 32, 1875–1885 (2019)

  21. R.L. Graham, Bounds for certain multiprocessing anomalies. Bell Syst. Tech. J. 45(9), 1563–1581 (1966)

  22. E.F. Grove, The harmonic online k-server algorithm is competitive, in Proceedings of the Twenty-Third Annual ACM Symposium on Theory of Computing (1991), pp. 260–266

  23. E. Koutsoupias, C.H. Papadimitriou, On the k-server conjecture. J. ACM 42(5), 971–983 (1995)

  24. R. Kumar, M. Purohit, Z. Svitkina, Improving Online Algorithms via ML Predictions, in Proceedings of the 32nd International Conference on Neural Information Processing Systems (2018), pp. 9684–9693

  25. I. Kupka, Continuous selections for Lipschitz multifunctions. Acta Math. Univ. Comenianae 74(1), 133–141 (2005)

  26. M. Lin, A. Wierman, L.L.H. Andrew, E. Thereska, Dynamic right-sizing for power-proportional data centers. IEEE/ACM Trans. Netw. 21(5), 1378–1391 (2013)

  27. T. Lykouris, S. Vassilvitskii, Competitive caching with machine learned advice, in International Conference on Machine Learning (PMLR, 2018), pp. 3296–3305

  28. M.S. Manasse, L.A. McGeoch, D.D. Sleator, Competitive algorithms for server problems. J. Algorithms 11(2), 208–230 (1990)

  29. K. Przesławski, D. Yost, Continuity properties of selectors. Mich. Math. J. 36(1), 13 (1989)

  30. R.T. Rockafellar, Convex Analysis, vol. 36 (Princeton University Press, Princeton, 1970)

  31. P. Shvartsman, Lipschitz selections of multivalued mappings and traces of the Zygmund class of functions to an arbitrary compact. Dokl. Akad. Nauk SSSR 276, 559–562 (1984). English translation in Soviet Math. Dokl. 29, 565–568 (1984)

  32. P. Shvartsman, Lipschitz selections of set-valued mappings and Helly’s theorem. J. Geom. Anal. 12(2), 289–324 (2002)

  33. D.D. Sleator, R.E. Tarjan, Amortized efficiency of list update and paging rules. Commun. ACM 28(2), 202–208 (1985)

  34. J. Steiner, Von dem Krümmungsschwerpunkte ebener Curven. J. Reine Angew. Math. 21, 33–63 (1840)

  35. A. Wei, F. Zhang, Optimal Robustness-Consistency Trade-offs for Learning-Augmented Online Algorithms, in Advances in Neural Information Processing Systems, vol. 33 (2020), pp. 8042–8053


Acknowledgements

The author thanks Sébastien Bubeck, Bo’az Klartag, Yin Tat Lee, and Yuanzhi Li for the introduction to convex body chasing and the Steiner point, and many stimulating discussions. He thanks Ethan Jaffe, Felipe Hernandez, and Christian Coester for discussions about properties of the work function, and the anonymous referee for several suggestions. He additionally thanks Sébastien for feedback on previous drafts and gratefully acknowledges the support of an NSF graduate fellowship and a Stanford graduate fellowship.


Appendix: Proof of Lemma 3.7

Proof

We prove the result for all \(v\in B_1^*\) at which \(\nabla W^*_t(v)\) exists. This includes almost all v by Alexandrov’s theorem. Moreover, it ensures that the conjugate point \(v_t^*=\arg\min_{w\in X}\big(W_t(w)-\langle v,w\rangle\big)\) is well-defined and that \(W_t\) is strictly convex at \(v_t^*\) [30, Corollary 25.1.2]. We write:

$$\displaystyle \begin{aligned} W^*_{t+\delta}(v)&= \min_{x_s:[0,t+\delta]\to X} \left(\int_{0}^{t+\delta} \big(f_s(x_s)+||x^{\prime}_s||\big)\,\text{d} s -\langle v,x_{t+\delta}\rangle\right)\\ &= \min_{x_s:[t,t+\delta]\to X}\left(W_t(x_t)+\int_{t}^{t+\delta}\big(f_s(x_s)+||x^{\prime}_s||\big)\,\text{d} s - \langle v,x_{t+\delta}\rangle\right) \end{aligned} $$

For small \(\delta \in (0,\varepsilon )\), we show \(W^*_{t+\delta }(v)=W^*_t(v)+\delta f_t(v^*_t)+o(\delta )\). For the upper bound,

$$\displaystyle \begin{aligned} W_{t+\delta}(v_t^*)&\leq W_t(v_t^*)+\int_t^{t+\delta} f_s(v^*_t)\text{d} s\\ &= W_t(v_t^*)+\delta f_t(v^*_t)+o(\delta) \end{aligned} $$

holds by taking \(x_s=v_t^*\) constant for \(s\in [t,t+\delta)\) and recalling the assumption that \(f_s(x)\) is continuous in \(s\) on \([t,t+\delta)\). Since \(v_t^*=\arg\min_x \big(W_t(x)-\langle v,x\rangle\big)\), the upper bound follows from

$$\displaystyle \begin{aligned} W^*_{t+\delta}(v)&\leq W_{t+\delta}(v_t^*)-\langle v,v_t^*\rangle\\ &\leq W_t(v_t^*)+\delta f_t(v^*_t)+o(\delta)-\langle v,v_t^*\rangle\\ &=W^*_t(v)+\delta f_t(v^*_t)+o(\delta). \end{aligned} $$

For the lower bound, the strict convexity of \(W_t\) at \(v_t^*\) implies

$$\displaystyle \begin{aligned} W_t(x)\geq W_t(v^*_t)+\langle v,x-v^*_t\rangle +\gamma(||x-v^*_t||) \end{aligned}$$

where \(\gamma:\mathbb R^+\to \mathbb R^+\) is continuous and increasing with unique minimum \(\gamma(0)=0\). Therefore any path \(x_s:[0,t+\delta]\to X\) satisfies:

$$\displaystyle \begin{aligned} W_t(x_t)&+\int_{t}^{t+\delta}f_s(x_s)+||x^{\prime}_s||\text{d} s - \langle v,x_{t+\delta}\rangle \geq W_t(v^*_t)+\langle v,x_t-v^*_t\rangle\\ &+\gamma(||x_t-v^*_t||) + \int_t^{t+\delta} f_s(x_s)+||x^{\prime}_s||\text{d} s-\langle v,x_{t+\delta}\rangle. \end{aligned} $$

The observation \(\int _t^{t+\delta } ||x^{\prime }_s||\text{d} s \geq ||x_{t+\delta }-x_t|| \geq \langle v,x_{t+\delta }-x_t\rangle \) implies

$$\displaystyle \begin{aligned} W_t(x_t)+\int_{t}^{t+\delta}f_s(x_s)+||x^{\prime}_s||\text{d} s - \langle v,x_{t+\delta}\rangle\geq & W_t(x_t)-\langle v,x_t\rangle + \int_t^{t+\delta} f_s(x_s)\text{d} s\\ \geq & W_t(v^*_t)-\langle v,v^*_t\rangle +\gamma(||x_t-v^*_t||) \\ & \quad + \int_t^{t+\delta} f_s(x_s)\text{d} s\\ \geq & W^*_t(v)+\gamma(||x_t-v^*_t||)\\ &\quad + \int_t^{t+\delta} f_s(x_s)\text{d} s. \end{aligned} $$

Because \(W^*_{t+\delta}(v)=W^*_t(v)+O(\delta)\), we see that as \(\delta\to 0\) we must have \(||x_t-v_t^*||=o_{\delta\to 0}(1)\) for any optimal trajectory \(x_s\) achieving the value \(W^*_{t+\delta}(v)\). Additionally,

$$\displaystyle \begin{aligned} \int_t^{t+\delta} ||x^{\prime}_s||\text{d} s + \langle v, x_t-x_{t+\delta}\rangle \geq (1-||v||)\int_t^{t+\delta} ||x^{\prime}_s||\text{d} s \geq (1-||v||)\sup_{s\in [t,t+\delta]} ||x_t-x_s||, \end{aligned}$$

which similarly implies \(\sup_{s\in [t,t+\delta]}||x_t-x_s||=o(1)\) for any optimal trajectory since \(||v||<1\). By continuity of \(f\), it follows that all optimal trajectories satisfy \(\int_t^{t+\delta} f_s(x_s)\text{d} s = \delta f_t(v^*_t) +o(\delta)\). This concludes the proof. □
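For the reader's convenience, the two bounds just established can be restated as a one-sided time derivative of the conjugate work function; since the statement of Lemma 3.7 is not reproduced in this excerpt, we offer this only as an assumed paraphrase of it.

$$\displaystyle \begin{aligned} \partial_t^{+} W^*_t(v) = \lim_{\delta \downarrow 0}\frac{W^*_{t+\delta}(v)-W^*_t(v)}{\delta} = f_t(v^*_t), \qquad v^*_t=\arg\min_{w\in X}\big(W_t(w)-\langle v,w\rangle\big). \end{aligned}$$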

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

Cite this chapter

Sellke, M. (2023). Chasing Convex Bodies Optimally. In: Eldan, R., Klartag, B., Litvak, A., Milman, E. (eds) Geometric Aspects of Functional Analysis. Lecture Notes in Mathematics, vol 2327. Springer, Cham. https://doi.org/10.1007/978-3-031-26300-2_12
