Abstract
In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may come from various sources and manifests itself as inexact computation of subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example, when the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within a tolerance that is given explicitly in terms of the errors. In the first case the tolerance is nonzero, but in the second case the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
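To make the setting concrete, here is a minimal sketch (not from the paper; the problem instance, box constraint, stepsize rule, and noise model are all our own assumptions) of a projected subgradient method in which each subgradient is computed with a bounded deterministic error. The objective f(x) = ||x||_1 is polyhedral, so it falls in the sharp-minima case discussed in the abstract.

```python
# A minimal sketch of a projected subgradient method with bounded
# deterministic noise in the subgradient computation. Everything here
# (objective, constraint set, stepsize, noise model) is illustrative,
# not the paper's setup.
import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n (a compact constraint set)."""
    return np.clip(x, lo, hi)

def noisy_subgradient(x, eps):
    """Subgradient of f(x) = ||x||_1 (polyhedral, sharp minimum at 0),
    perturbed by a deterministic error of Euclidean norm exactly eps."""
    g = np.sign(x)                                   # exact subgradient of the l1-norm
    e = np.full_like(x, eps / np.sqrt(x.size))       # bounded deterministic error
    return g + e

def projected_subgradient(x0, eps=0.05, steps=500):
    x = x0.copy()
    f_best = np.inf
    for k in range(steps):
        g = noisy_subgradient(x, eps)
        alpha = 1.0 / (k + 1)                        # diminishing stepsize rule
        x = project_box(x - alpha * g)
        f_best = min(f_best, np.abs(x).sum())        # track best objective value
    return x, f_best

if __name__ == "__main__":
    _, f_best = projected_subgradient(np.full(10, 0.8))
    print(f"best f value reached: {f_best:.4f}")     # near the optimum 0 for small eps
```

Consistent with the abstract's second case, one expects this sketch to reach the exact optimal value when eps is small relative to the sharpness of f, while a sufficiently large eps can drive the perturbed direction away from a descent direction and leave a nonzero residual tolerance.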
Additional information
Research supported by NSF under Grant ACI-9873339.
Cite this article
Nedić, A., Bertsekas, D.P. The effect of deterministic noise in subgradient methods. Math. Program. 125, 75–99 (2010). https://doi.org/10.1007/s10107-008-0262-5