Abstract
In this chapter we study the continuous subgradient algorithm for the minimization of convex functions in the presence of computational errors. We show that the algorithm generates a good approximate solution if the computational errors are bounded from above by a small positive constant. Moreover, for a known computational error, we determine what approximate solution can be obtained and how much time is needed to obtain it.
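The chapter analyzes the continuous (ODE) subgradient flow; as an informal illustration only, a discretized subgradient iteration with a bounded perturbation shows the qualitative claim of the abstract: with errors bounded by a small constant delta, the iterates still reach a neighborhood of the minimizer. The function f(x) = |x|, the step size, and the error model below are assumptions chosen for the example, not taken from the chapter.

```python
import math

def perturbed_subgradient(subgrad, x0, step, delta, n_iters):
    """Iterate x_{k+1} = x_k - step * (g_k + e_k), where g_k is a
    subgradient of f at x_k and e_k is a computational error with
    |e_k| <= delta (a deterministic bounded perturbation here)."""
    x = x0
    for k in range(n_iters):
        g = subgrad(x)
        e = delta * math.sin(k)  # bounded error, |e| <= delta
        x = x - step * (g + e)
    return x

# Example: f(x) = |x|, minimizer x* = 0, subgradient sign(x).
subgrad = lambda x: (x > 0) - (x < 0)

x_exact = perturbed_subgradient(subgrad, x0=5.0, step=0.01, delta=0.0, n_iters=2000)
x_noisy = perturbed_subgradient(subgrad, x0=5.0, step=0.01, delta=0.1, n_iters=2000)
```

Both runs end within a small neighborhood of the minimizer 0: the exact iterates oscillate within one step length of it, and the perturbed iterates within a slightly larger band whose width is controlled by delta, which is the kind of error-dependent accuracy estimate the chapter makes precise.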
© 2016 Springer International Publishing Switzerland
Zaslavski, A.J. (2016). Continuous Subgradient Method. In: Numerical Optimization with Computational Errors. Springer Optimization and Its Applications, vol 108. Springer, Cham. https://doi.org/10.1007/978-3-319-30921-7_14
DOI: https://doi.org/10.1007/978-3-319-30921-7_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-30920-0
Online ISBN: 978-3-319-30921-7
eBook Packages: Mathematics and Statistics (R0)