
First-order methods of smooth convex optimization with inexact oracle

  • Full Length Paper
  • Series A
  • Published in: Mathematical Programming

Abstract

We introduce the notion of an inexact first-order oracle and analyze the behavior of several first-order methods of smooth convex optimization that use such an oracle. This notion of inexact oracle arises naturally in the context of smoothing techniques, Moreau–Yosida regularization, Augmented Lagrangians, and many other situations. We derive complexity estimates for primal, dual, and fast gradient methods, and study in particular their dependence on the accuracy of the oracle and on the desired accuracy of the objective function. We observe that the superiority of fast gradient methods over classical ones is no longer absolute when an inexact oracle is used. We prove that, contrary to simple gradient schemes, fast gradient methods necessarily suffer from error accumulation. Finally, we show that the notion of inexact oracle allows first-order methods of smooth convex optimization to be applied to non-smooth or weakly smooth convex problems.
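To make these statements concrete, here is a brief sketch of the paper's central definition as we understand it from the published version (notation reconstructed; details may differ from the full text). A pair \((f_{\delta,L}(y), g_{\delta,L}(y))\) is called a \((\delta,L)\)-oracle for a convex function \(f\) at a point \(y\) if, for all feasible \(x\),

\[ 0 \le f(x) - f_{\delta,L}(y) - \langle g_{\delta,L}(y), x - y \rangle \le \frac{L}{2} \|x - y\|^2 + \delta. \]

Setting \(\delta = 0\) recovers the exact oracle of an \(L\)-smooth convex function. Under a \((\delta,L)\)-oracle, the complexity estimates mentioned above take the form \(O(LR^2/k) + \delta\) for the classical gradient method, so the oracle error does not accumulate, versus \(O(LR^2/k^2) + O(k\delta)\) for fast gradient methods, where the faster decrease of the optimization term is paid for by a linear accumulation of \(\delta\).

The short simulation below is a minimal illustrative sketch, not the authors' code: the noise model (a gradient perturbation of norm \(\sqrt{2L\delta}\), which realizes a \((\delta,L)\)-type oracle up to a constant-factor increase of \(L\)) and all parameter values are our assumptions. It runs the classical gradient method on a simple quadratic through such a perturbed oracle; the objective stalls at a level proportional to \(\delta\) instead of converging to zero, consistent with the non-accumulating bound.

    import numpy as np

    # Illustrative only: classical gradient method queried through an
    # inexact oracle on f(x) = 0.5 * x^T A x (hypothetical setup).
    def inexact_gradient(A, x, delta, L, rng):
        g = A @ x                                  # exact gradient
        e = rng.standard_normal(x.shape)           # random error direction
        e *= np.sqrt(2.0 * L * delta) / np.linalg.norm(e)
        return g + e                               # perturbed gradient

    def gradient_method(A, x0, L, delta, iters, rng):
        x = x0.copy()
        for _ in range(iters):
            x = x - inexact_gradient(A, x, delta, L, rng) / L   # step 1/L
        return x

    rng = np.random.default_rng(0)
    A = np.diag(np.linspace(0.1, 1.0, 20))         # L = 1.0 for this quadratic
    x = gradient_method(A, np.ones(20), L=1.0, delta=1e-3, iters=2000, rng=rng)
    print(0.5 * x @ (A @ x))                       # stalls near delta, not at 0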



Author information


Corresponding author

Correspondence to Olivier Devolder.

Additional information

This text presents research results of the Belgian Program on Interuniversity Poles of Attraction initiated by the Belgian State, Prime Minister’s Office, Science Policy Programming. The first author is a F.R.S.-FNRS Research Fellow. The research of the third author was partly supported by the grant Action de recherche concertée ARC 04/09-315 from the Direction de la recherche scientifique—Communauté française de Belgique. The third author also acknowledges the support from Laboratory of Structural Methods of Data Analysis in Predictive Modelling, through the RF government grant 11.G34.31.0073. The scientific responsibility rests with its authors.


About this article

Cite this article

Devolder, O., Glineur, F. & Nesterov, Y. First-order methods of smooth convex optimization with inexact oracle. Math. Program. 146, 37–75 (2014). https://doi.org/10.1007/s10107-013-0677-5

