
OSGA: a fast subgradient algorithm with optimal complexity

Full Length Paper · Series A · Mathematical Programming

Abstract

This paper presents an algorithm for approximately minimizing a convex function over a simple, not necessarily bounded, convex, finite-dimensional domain, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be set to zero if only convexity is known). The worst-case number of iterations needed to achieve a given accuracy is independent of the dimension and, apart from a constant factor, best possible under a variety of smoothness assumptions on the objective function.
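The oracle model described in the abstract (only function values and subgradients of the objective are queried) can be illustrated by a plain subgradient method. The sketch below is not the OSGA algorithm of the paper; it is a textbook subgradient scheme with the standard diminishing step size 1/sqrt(k), shown only to make the oracle assumptions concrete. The function and step-size rule are illustrative choices, not taken from the paper.

```python
import math

def subgradient_method(f, subgrad, x0, iters=2000):
    """Basic subgradient method: only f-values and subgradients are used,
    matching the black-box oracle model. The step size t_k = 1/sqrt(k)
    ensures the best iterate converges for convex f with bounded
    subgradients (though at a slower, non-optimal rate than OSGA)."""
    x = list(x0)
    best_x, best_f = x[:], f(x)
    for k in range(1, iters + 1):
        g = subgrad(x)                      # oracle call: a subgradient at x
        t = 1.0 / math.sqrt(k)              # diminishing step size
        x = [xi - t * gi for xi, gi in zip(x, g)]
        fx = f(x)                           # oracle call: function value at x
        if fx < best_f:                     # track the best point seen
            best_f, best_x = fx, x[:]
    return best_x, best_f

# Example: the nonsmooth convex function f(x) = |x1| + |x2|, minimized at 0.
f = lambda x: abs(x[0]) + abs(x[1])
subgrad = lambda x: [(1.0 if xi >= 0 else -1.0) for xi in x]
x_best, f_best = subgradient_method(f, subgrad, [3.0, -2.0])
```

Because only subgradients are available, the iterates oscillate near the minimizer; this is why the method tracks the best point seen so far, and why the paper's interest lies in schemes whose worst-case iteration count matches the known complexity lower bounds.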



Acknowledgments

I’d like to thank Masoud Ahookhosh for numerous useful remarks on earlier versions of the manuscript. Thanks also to the referees for a number of suggestions that improved the paper.

Author information


Correspondence to Arnold Neumaier.


Cite this article

Neumaier, A. OSGA: a fast subgradient algorithm with optimal complexity. Math. Program. 158, 1–21 (2016). https://doi.org/10.1007/s10107-015-0911-4

