Abstract
We show that the exact worst-case performance of fixed-step first-order methods for unconstrained optimization of smooth (possibly strongly) convex functions can be obtained by solving convex programs. Finding the worst-case performance of a black-box first-order method is formulated as an optimization problem over a set of smooth (strongly) convex functions and initial conditions. We develop closed-form necessary and sufficient conditions for smooth (strongly) convex interpolation, which provide a finite representation for those functions. This allows us to reformulate the worst-case performance estimation problem as an equivalent finite dimension-independent semidefinite optimization problem, whose exact solution can be recovered up to numerical precision. Optimal solutions to this performance estimation problem provide both worst-case performance bounds and explicit functions matching them, as our smooth (strongly) convex interpolation procedure is constructive. Our work builds on that of Drori and Teboulle (Math Program 145(1–2):451–482, 2014), who introduced and solved relaxations of the performance estimation problem for smooth convex functions. We apply our approach to different fixed-step first-order methods with several performance criteria, including objective function accuracy and gradient norm. We conjecture several numerically supported worst-case bounds on the performance of the fixed-step gradient, fast gradient and optimized gradient methods, both in the smooth convex and the smooth strongly convex cases, and deduce tight estimates of the optimal step size for the gradient method.
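The interpolation conditions mentioned in the abstract can be checked concretely. As a minimal sketch (smooth convex case, i.e. \(\mu = 0\), in one dimension), the snippet below verifies the pairwise condition \(f_i \ge f_j + g_j (x_i - x_j) + \tfrac{1}{2L}(g_i - g_j)^2\) on a finite set of (point, gradient, value) triples; the function name and the numerical tolerance are illustrative, not taken from the paper.

```python
def is_L_smooth_convex_interpolable(points, L, tol=1e-12):
    """Check the pairwise L-smooth convex interpolation condition (mu = 0).

    points: iterable of (x, g, f) triples in one dimension, where g is the
    gradient value and f the function value to be interpolated at x.
    Returns True iff, for every ordered pair (i, j),
        f_i >= f_j + g_j * (x_i - x_j) + (g_i - g_j)**2 / (2 * L)
    holds up to the given tolerance.
    """
    for (xi, gi, fi) in points:
        for (xj, gj, fj) in points:
            if fi < fj + gj * (xi - xj) + (gi - gj) ** 2 / (2 * L) - tol:
                return False
    return True
```

For samples drawn from the quadratic \(f(x) = \tfrac{L}{2} x^2\), whose curvature equals \(L\) everywhere, the condition holds with equality for every pair, while data with a flat function value but nonzero gradient spread is rejected.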
Notes
This equation possesses several solutions, but the optimum is the unique point where the two terms feature derivatives of opposite signs (a necessary and sufficient condition for the maximum of two convex functions of one variable). This point can easily be computed numerically with an appropriate bisection method.
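The bisection mentioned in this note can be sketched generically. Assuming one term is decreasing and the other increasing on the bracketing interval (so that their maximum is minimized exactly where they cross), the following Python snippet locates that crossing point; the concrete functions `p` and `q` in the usage example are illustrative placeholders, not the terms from the paper.

```python
def crossing_point(p, q, lo, hi, tol=1e-10):
    """Bisection for the point where p - q changes sign on [lo, hi].

    Assumes p is decreasing and q is increasing on the interval, so the
    maximum of the two convex terms is minimized at their crossing point
    (where the derivatives have opposite signs).
    """
    flo = p(lo) - q(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        fmid = p(mid) - q(mid)
        if flo * fmid > 0:   # same sign as at lo: crossing lies to the right
            lo, flo = mid, fmid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For instance, with the placeholder terms \(p(h) = (1-h)^2\) and \(q(h) = h^2\) on \([0, 1]\), the routine returns \(h = 1/2\), where the maximum of the two terms is smallest.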
Except for tests where validation encountered numerical difficulties, i.e., those for which VSDP returned no valid interval; this occurred increasingly often as the worst-case bound approached zero.
References
Bauschke, H., Combettes, P.: Convex analysis and monotone operator theory in Hilbert spaces. Springer, Berlin (2011)
Beck, A.: Quadratic matrix programming. SIAM J. Optim. 17(4), 1224–1238 (2007)
Beck, A., Teboulle, M.: Fast gradient-based algorithms for constrained total variation image denoising and deblurring problems. IEEE Trans. Image Process. 18(11), 2419–2434 (2009)
Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2(1), 183–202 (2009)
Ben-Tal, A., Nemirovski, A.: Lectures on modern convex optimization: analysis, algorithms, and engineering applications, vol. 2. SIAM, Philadelphia (2001)
Bertsekas, D.P.: Convex optimization theory. Athena Scientific, Belmont (2009)
Bertsekas, D.P.: Convex optimization algorithms. Athena Scientific, Belmont (2015)
Boyd, S., Vandenberghe, L.: Convex optimization. Cambridge University Press, Cambridge (2004)
Devolder, O., Glineur, F., Nesterov, Y.: Double smoothing technique for large-scale linearly constrained convex optimization. SIAM J. Optim. 22(2), 702–727 (2012)
Drori, Y., Teboulle, M.: Performance of first-order methods for smooth convex minimization: a novel approach. Math. Program. 145(1–2), 451–482 (2014)
Härter, V., Jansson, C., Lange, M.: VSDP: A MATLAB toolbox for verified semidefinite-quadratic-linear programming. Technical report, Institute for Reliable Computing, Hamburg University of Technology (2012)
Hiriart-Urruty, J.B., Lemaréchal, C.: Convex analysis and minimization algorithms. Springer, Heidelberg (1996). (Two volumes—2nd printing)
Kim, D., Fessler, J.: On the convergence analysis of the optimized gradient methods. arXiv preprint arXiv:1510.08573 (2015)
Kim, D., Fessler, J.: Optimized first-order methods for smooth convex minimization. Math. Program. (2015). doi:10.1007/s10107-015-0949-3
Lambert, D., Crouzeix, J.P., Nguyen, V., Strodiot, J.J.: Finite convex integration. J. Convex Anal. 11(1), 131–146 (2004)
Lessard, L., Recht, B., Packard, A.: Analysis and design of optimization algorithms via integral quadratic constraints. SIAM J. Optim. 26(1), 57–95 (2016)
Löfberg, J.: YALMIP: a toolbox for modeling and optimization in MATLAB. In: Proceedings of the CACSD Conference (2004)
MOSEK ApS: The MOSEK optimization software. http://www.mosek.com (2010)
Nemirovsky, A., Yudin, D.: Problem complexity and method efficiency in optimization. Wiley-Interscience, New York (1983)
Nesterov, Y.: A method of solving a convex programming problem with convergence rate \(O(1/k^2)\). Sov. Math. Dokl. 27, 372–376 (1983)
Nesterov, Y.: Introductory lectures on convex optimization: a basic course. Applied optimization. Kluwer Academic Publishers, Dordrecht (2004)
Nesterov, Y.: How to make the gradients small. Optima 88, 10–11 (2012)
Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton (1996)
Rockafellar, R.T., Wets, R.B.: Variational analysis. Springer, Berlin (1998)
Sturm, J.F.: Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones. Optim. Methods Softw. 11–12, 625–653 (1999)
Vandenberghe, L., Boyd, S.: Semidefinite programming. SIAM Rev. 38(1), 49–95 (1996)
Additional information
This paper presents research results of the Belgian Network DYSCO (Dynamical Systems, Control, and Optimization), funded by the Interuniversity Attraction Poles Programme, initiated by the Belgian State, Science Policy Office, and of the Concerted Research Action (ARC) programme supported by the Federation Wallonia-Brussels (Contract ARC 14/19-060). The scientific responsibility rests with its authors. A.B.T. is a F.R.I.A. fellow.
Cite this article
Taylor, A.B., Hendrickx, J.M. & Glineur, F. Smooth strongly convex interpolation and exact worst-case performance of first-order methods. Math. Program. 161, 307–345 (2017). https://doi.org/10.1007/s10107-016-1009-3
Keywords
- Smooth convex minimization
- Smooth convex interpolation
- First-order methods
- Worst-case analysis
- Rates of convergence
- Semidefinite programming