Mathematical Programming, Volume 145, Issue 1–2, pp 451–482

Performance of first-order methods for smooth convex minimization: a novel approach

  • Yoel Drori
  • Marc Teboulle

Full Length Paper, Series A


We introduce a novel approach for analyzing the worst-case performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space. Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is itself an optimization problem, which we call the performance estimation problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which, among others, includes the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class, new performance bounds can be derived by solving an adequately relaxed convex semidefinite PEP. Finally, we present an efficient procedure for finding optimal step sizes, which results in a first-order black-box method that achieves the best worst-case performance.
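The tight worst-case guarantee for the classical gradient method described in the abstract can be checked numerically. The sketch below runs gradient descent with step size 1/L on a simple smooth convex quadratic and compares the objective gap against a bound of the form L‖x₀−x*‖²/(4N+2); both the bound's constant and this test problem are illustrative assumptions here, not a restatement of the paper's derivation (the worst case in the paper is taken over the whole class of L-smooth convex functions, not just quadratics).

```python
import numpy as np

def gradient_method(A, x0, N):
    """Plain gradient method with the classical step size 1/L on
    f(x) = 0.5 * x^T A x, which is L-smooth with L = lambda_max(A)
    and attains its minimum value 0 at x* = 0."""
    L = np.linalg.eigvalsh(A).max()
    x = x0.copy()
    for _ in range(N):
        x = x - (1.0 / L) * (A @ x)   # gradient of f is A @ x
    return 0.5 * x @ A @ x            # objective gap f(x_N) - f*

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M.T @ M                            # random positive semidefinite matrix
x0 = rng.standard_normal(5)
L = np.linalg.eigvalsh(A).max()
R2 = x0 @ x0                           # ||x0 - x*||^2, since x* = 0

for N in (1, 5, 20):
    gap = gradient_method(A, x0, N)
    bound = L * R2 / (4 * N + 2)       # assumed form of the tight bound
    print(f"N={N:2d}  f(x_N)-f* = {gap:.3e}  worst-case bound = {bound:.3e}")
    assert gap <= bound
```

On any single instance the gap typically sits well below the bound; tightness means there exists a worst-case function in the smooth convex class for which the bound is attained.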


Keywords

Performance of first-order algorithms · Rate of convergence · Complexity · Smooth convex minimization · Duality · Semidefinite relaxations · Fast gradient schemes · Heavy-ball method

Mathematics Subject Classification (2000)

90C60 · 49M25 · 90C25 · 90C20 · 90C22 · 68Q25



This work was initiated during our participation in the “Modern Trends in Optimization and Its Application” program at IPAM (UCLA), September–December 2010. We would like to thank IPAM for their support and for the very pleasant and stimulating environment provided to us during our stay. We thank Simi Haber, Ido Ben-Eliezer and Rani Hod for their help in the proof of Lemma 3, and we would also like to thank two anonymous referees for their careful reading and useful suggestions.



Copyright information

© Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society 2013

Authors and Affiliations

  1. School of Mathematical Sciences, Tel-Aviv University, Tel-Aviv, Israel
