Mathematical Programming, Volume 13, Issue 1, pp 329-347

On convergence rates of subgradient optimization methods

Abstract

Rates of convergence of subgradient optimization are studied. If the step size is chosen to be a geometric progression with ratio ρ, the convergence, if it occurs, is geometric with rate ρ. For convergence to occur, it is necessary that the initial step size be large enough and that the ratio ρ be greater than a sustainable rate z(μ), which depends upon a condition number μ, defined for both differentiable and nondifferentiable functions. The sustainable rate z(μ) is closely related to the rate of convergence of the steepest ascent method for differentiable functions: in fact, it is identical if the function is not too well conditioned.
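To make the step-size rule in the abstract concrete, the following is a minimal Python sketch of a subgradient method whose steps form a geometric progression s_k = s_0 ρ^k. The paper analyzes the ascent form for concave functions; the sketch uses the equivalent minimization form, and the test function, the ratio ρ = 0.9, and the initial step s_0 = 5 are illustrative choices of mine, not values from the paper.

import numpy as np

def subgradient_method(f, subgrad, x0, s0=1.0, rho=0.9, iters=200):
    """Subgradient descent with a geometric step-size schedule.

    Steps follow s_k = s0 * rho**k; per the abstract, convergence
    (when it occurs) is then geometric with rate rho, provided s0 is
    large enough and rho exceeds the sustainable rate z(mu).
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)
        norm = np.linalg.norm(g)
        if norm == 0.0:          # zero subgradient: x is optimal
            break
        step = s0 * rho**k       # geometric step-size schedule
        x = x - step * g / norm  # move along the normalized subgradient
        if f(x) < best_f:        # f need not decrease monotonically, so track the best iterate
            best_x, best_f = x.copy(), f(x)
    return best_x, best_f

# Illustrative use: minimize the nondifferentiable function f(x) = ||x - c||_1.
c = np.array([1.0, -2.0, 3.0])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)   # a valid subgradient of the l1 distance
x_star, f_star = subgradient_method(f, subgrad, x0=np.zeros(3), s0=5.0, rho=0.9)
print(x_star, f_star)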
This research was supported in part by the D.G.E.S. (Quebec) and the N.R.C. of Canada under grants A8970 and A4152.