
The effect of deterministic noise in subgradient methods

Full Length Paper · Series A
Mathematical Programming

Abstract

In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may be due to various sources, and is manifested in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example, when the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within some tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
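The sharp-minima setting of the abstract can be illustrated with a minimal sketch (the test problem, function names, and parameter values below are ours for illustration, not taken from the paper): a subgradient iteration on the polyhedral function f(x) = ||x||_1, where each computed subgradient is corrupted by a fixed, bounded deterministic error. Because the error norm is kept below the sharpness constant of the l1 norm, the iterates settle near the exact minimizer despite the persistent noise, consistent with the abstract's second case.

```python
import numpy as np

def l1_norm(x):
    """Objective f(x) = ||x||_1: polyhedral, with a sharp minimum at the origin."""
    return np.sum(np.abs(x))

def l1_subgradient(x):
    """A subgradient of the l1 norm (the sign vector; any value in [-1, 1] at 0)."""
    return np.sign(x)

def inexact_subgradient_method(x0, stepsize, n_iters, error):
    """Iterate x_{k+1} = x_k - alpha * (g_k + e_k), where g_k is a true
    subgradient and e_k a bounded deterministic perturbation (here constant)."""
    x = np.asarray(x0, dtype=float)
    f_best = l1_norm(x)
    for _ in range(n_iters):
        g = l1_subgradient(x) + error   # inexact subgradient computation
        x = x - stepsize * g
        f_best = min(f_best, l1_norm(x))  # track the best value seen so far
    return x, f_best

# Constant stepsize; the error norm (~0.07) is well below the sharpness
# threshold of the l1 norm (1), so the noise does not prevent convergence
# to a neighborhood of the exact minimum whose size scales with the stepsize.
x, f_best = inexact_subgradient_method(
    x0=[1.0, -2.0], stepsize=0.01, n_iters=500,
    error=np.array([0.05, -0.05]))
print(f_best)  # a small value, on the order of the stepsize
```

If the error norm exceeded the sharpness threshold, the perturbed direction could cease to be a descent direction near the minimizer, and only convergence to within a nonzero tolerance could be guaranteed, as in the compact-constraint case.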



Author information

Correspondence to Angelia Nedić.

Additional information

Research supported by NSF under Grant ACI-9873339.

Cite this article

Nedić, A., Bertsekas, D.P. The effect of deterministic noise in subgradient methods. Math. Program. 125, 75–99 (2010). https://doi.org/10.1007/s10107-008-0262-5

