# On convergence rates of subgradient optimization methods


DOI: 10.1007/BF01584346

Goffin, J.L. Mathematical Programming (1977) 13: 329. doi:10.1007/BF01584346


## Abstract

Rates of convergence of subgradient optimization are studied. If the step sizes are chosen as a geometric progression with ratio *ρ*, then convergence, if it occurs, is geometric with rate *ρ*. For convergence to occur, it is necessary that the initial step size be large enough, and that the ratio *ρ* be greater than a sustainable rate *z(μ)*, which depends upon a condition number *μ* defined for both differentiable and nondifferentiable functions. The sustainable rate *z(μ)* is closely related to the rate of convergence of the steepest ascent method for differentiable functions; in fact, the two are identical if the function is not too well conditioned.
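The scheme the abstract describes can be illustrated with a minimal sketch: a subgradient method whose step sizes form a geometric progression *s_k = s₀ρ^k*, with the step taken along the normalized subgradient. The function, the function/helper names, and the parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, s0=1.0, rho=0.9, iters=200):
    """Subgradient minimization with geometric step sizes s_k = s0 * rho**k.

    A sketch of the step-size rule analyzed in the abstract (hypothetical
    helper names; the paper itself gives no code). The step is taken along
    the normalized subgradient, so s_k is the distance moved at iteration k.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)
        norm = np.linalg.norm(g)
        if norm == 0.0:                      # a zero subgradient certifies a minimizer
            break
        x = x - (s0 * rho**k) * g / norm     # geometric step along -g/||g||
        if f(x) < best_f:                    # track the best iterate; subgradient
            best_x, best_f = x.copy(), f(x)  # methods are not monotone
    return best_x, best_f

# Example: f(x) = ||x||_1, nondifferentiable at its minimizer x* = 0.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)               # a valid subgradient of the 1-norm
x_star, f_star = subgradient_method(f, subgrad, x0=[3.0, -2.0])
```

Note that the total distance the iterates can travel is bounded by s₀/(1−ρ), which is why the abstract requires the initial step size to be large enough relative to the distance to the optimum.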