
Mathematical Programming, Volume 138, Issue 1–2, pp 141–166

Fine tuning Nesterov’s steepest descent algorithm for differentiable convex programming

  • Clóvis C. Gonzaga
  • Elizabeth W. Karas
Full Length Paper, Series A

Abstract

We modify the first-order algorithm for convex programming described by Nesterov in his book (Introductory Lectures on Convex Optimization. A Basic Course. Kluwer, Boston, 2004). In his algorithms, Nesterov makes explicit use of a Lipschitz constant L for the function gradient, which is either assumed known (Nesterov in Introductory Lectures on Convex Optimization. A Basic Course. Kluwer, Boston, 2004) or estimated by an adaptive procedure (Nesterov 2007). We eliminate the use of L at the cost of an extra imprecise line search, and obtain an algorithm that keeps the optimal complexity properties and also inherits the global convergence properties of the steepest descent method for general continuously differentiable optimization. In addition, we develop an adaptive procedure for estimating a strong convexity constant for the function. Numerical tests on a limited set of toy problems show an improvement in performance compared with Nesterov's original algorithms.
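For readers unfamiliar with the adaptive-L approach that the abstract contrasts itself against, the following is a minimal sketch of a Nesterov-type accelerated gradient method that estimates L by backtracking instead of assuming it known, in the spirit of the procedure in Nesterov (2007). It is an illustration only, not the authors' algorithm (their line search eliminates L altogether); all function names and constants below are illustrative assumptions.

```python
import numpy as np

def accelerated_gradient(f, grad, x0, L0=1.0, tol=1e-8, max_iter=1000):
    """Accelerated gradient sketch with a backtracking estimate of L.

    Illustrative only: shows the adaptive-L variant the paper improves on,
    not the line-search method proposed by Gonzaga and Karas.
    """
    x = y = np.asarray(x0, dtype=float)
    L, t = L0, 1.0
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
        # Backtracking: double L until the quadratic upper model holds,
        # i.e. f(y - g/L) <= f(y) - ||g||^2 / (2L).
        while True:
            x_new = y - g / L
            if f(x_new) <= f(y) - g.dot(g) / (2.0 * L):
                break
            L *= 2.0
        # Standard momentum update for the extrapolation point y.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        L *= 0.9  # allow the estimate of L to decrease again
    return x
```

As a quick check, with Q = np.diag([1.0, 10.0]) the call accelerated_gradient(lambda x: 0.5 * x @ Q @ x, lambda x: Q @ x, np.ones(2)) converges to the origin, the minimizer of this strongly convex quadratic.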

Keywords

Unconstrained convex problems · Optimal method

Mathematics Subject Classification

62K05 · 65K05 · 68Q25 · 90C25 · 90C60


References

  1. Bertsekas, D.P., Nedić, A., Ozdaglar, A.E.: Convex Analysis and Optimization. Athena Scientific, Belmont (2003)
  2. Lan, G., Lu, Z., Monteiro, R.D.C.: Primal-dual first-order methods with \({O}(1/\epsilon)\) iteration-complexity for cone programming. Technical report, School of ISyE, Georgia Tech, USA. Accepted in Mathematical Programming (2006)
  3. Moré, J.J., Thuente, D.J.: Line search algorithms with guaranteed sufficient decrease. Technical Report MCS-P330-1092, Mathematics and Computer Science Division, Argonne National Laboratory (1992)
  4. Nemirovsky, A.S., Yudin, D.B.: Problem Complexity and Method Efficiency in Optimization. Wiley, New York (1983)
  5. Nesterov, Y.: Introductory Lectures on Convex Optimization. A Basic Course. Kluwer, Boston (2004)
  6. Nesterov, Y.: Smooth minimization of non-smooth functions. Math. Program. 103(1), 127–152 (2005)
  7. Nesterov, Y.: Gradient methods for minimizing composite objective function. Discussion paper 76, CORE, UCL, Belgium (2007)
  8. Nesterov, Y.: Smoothing technique and its applications in semidefinite optimization. Math. Program. 110(2), 245–259 (2007)
  9. Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Math. Program. 108(1), 177–205 (2006)
  10. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer Series in Operations Research. Springer, Berlin (1999)
  11. Polyak, B.T.: Introduction to Optimization. Optimization Software Inc., New York (1987)
  12. Richtárik, P.: Improved algorithms for convex minimization in relative scale. SIAM J. Optim. 21(3), 1141–1167 (2011)
  13. Shor, N.: Minimization Methods for Non-differentiable Functions. Springer, Berlin (1985)

Copyright information

© Springer and Mathematical Optimization Society 2012

Authors and Affiliations

  1. Department of Mathematics, Federal University of Santa Catarina, Florianópolis, Brazil
  2. Department of Mathematics, Federal University of Paraná, Curitiba, Brazil
