Mathematical Programming, Volume 125, Issue 1, pp 75–99

The effect of deterministic noise in subgradient methods

Full Length Paper, Series A

Abstract

In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may arise from various sources and manifests itself in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example, when the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within some tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
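To make the setting concrete, the following is a minimal sketch (not taken from the paper) of a projected subgradient method in which each subgradient is corrupted by a bounded deterministic error. The test problem, the alternating-sign error model, and all names here are illustrative assumptions: we minimize the polyhedral function f(x) = |x − 3| over the compact interval [0, 2], whose constrained optimal value is 1.

```python
def f(x):
    # polyhedral objective f(x) = |x - 3|, minimized over the
    # compact set [0, 2]; the constrained minimum is x = 2, f = 1
    return abs(x - 3.0)

def subgrad(x):
    # a subgradient of |x - 3|
    return -1.0 if x < 3.0 else 1.0

def project(x):
    # Euclidean projection onto [0, 2]
    return min(max(x, 0.0), 2.0)

def noisy_projected_subgradient(x0, step, eps, iters=2000):
    # projected subgradient iteration with a deterministic, bounded
    # perturbation of size eps added to each subgradient (here the
    # sign simply alternates -- an arbitrary illustrative choice)
    x, best = x0, x0
    for k in range(iters):
        g = subgrad(x) + (eps if k % 2 == 0 else -eps)  # inexact subgradient
        x = project(x - step * g)                       # projected step
        if f(x) < f(best):
            best = x
    return best

x_star = noisy_projected_subgradient(x0=0.0, step=0.01, eps=0.1)
print(f(x_star))  # prints 1.0, the optimal value
```

Because the objective is polyhedral (so its minima are sharp) and the subgradient error eps = 0.1 is small relative to the slope of f, the iterates reach the optimal value exactly, consistent with the paper's second case; with a larger error one would expect only convergence to within a tolerance.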

Mathematics Subject Classification (2000)

90C25 



Copyright information

© Springer-Verlag 2008

Authors and Affiliations

  1. Department of Industrial and Enterprise Systems Engineering, UIUC, Urbana, USA
  2. Department of Electrical Engineering and Computer Science, M.I.T., Cambridge, USA
