Abstract
In this chapter we study the subgradient projection algorithm for minimizing convex nonsmooth functions and for computing the saddle points of convex–concave functions, in the presence of computational errors. The problem is described by an objective function and a set of feasible points. Each iteration of the algorithm consists of two steps: in the first we calculate a subgradient of the objective function, and in the second we calculate a projection onto the feasible set. Each of these two steps carries a computational error, and in general the two errors are different. We show that the algorithm generates a good approximate solution provided that both computational errors are bounded from above by a small positive constant. Moreover, if the computational errors for the two steps are known, we determine what approximate solution can be obtained and how many iterations are needed to obtain it.
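The two-step iteration described above can be sketched in code. The following is a minimal illustrative sketch, not the chapter's formal scheme: the function and parameter names (`noisy_subgradient_projection`, `delta_sub`, `delta_proj`) are invented for this example, and the bounded errors are simulated by random perturbations of norm at most the given constants.

```python
import numpy as np


def _bounded_noise(shape, delta, rng):
    """Return a random vector of Euclidean norm exactly delta (or zero)."""
    if delta <= 0:
        return np.zeros(shape)
    v = rng.standard_normal(shape)
    n = np.linalg.norm(v)
    return delta * v / n if n > 0 else np.zeros(shape)


def noisy_subgradient_projection(subgrad, project, x0, step_sizes, n_iters,
                                 delta_sub=0.0, delta_proj=0.0, rng=None):
    """Projected subgradient method with simulated computational errors.

    Each iteration has two steps: (1) compute a subgradient, perturbed by
    an error of norm <= delta_sub; (2) project onto the feasible set,
    perturbed by an error of norm <= delta_proj.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for t in range(n_iters):
        g = subgrad(x) + _bounded_noise(x.shape, delta_sub, rng)   # step 1 error
        y = x - step_sizes(t) * g
        x = project(y) + _bounded_noise(x.shape, delta_proj, rng)  # step 2 error
        iterates.append(x.copy())
    return iterates


# Example: minimize the nonsmooth convex f(x) = ||x||_1 over the box
# [-1, 2]^2, whose minimizer is the origin.
subgrad = np.sign                          # a subgradient of the l1 norm
project = lambda y: np.clip(y, -1.0, 2.0)  # exact projection onto the box
xs = noisy_subgradient_projection(
    subgrad, project, x0=[2.0, 2.0],
    step_sizes=lambda t: 1.0 / np.sqrt(t + 1),
    n_iters=300, delta_sub=1e-2, delta_proj=1e-2, rng=0)
best_value = min(np.abs(x).sum() for x in xs)
```

With diminishing step sizes and errors bounded by a small constant, the best objective value along the iterates settles near the minimum, up to a residual governed by the error bounds, consistent with the kind of guarantee the chapter establishes.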
Copyright information
© 2020 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Zaslavski, A.J. (2020). Subgradient Projection Algorithm. In: Convex Optimization with Computational Errors. Springer Optimization and Its Applications, vol 155. Springer, Cham. https://doi.org/10.1007/978-3-030-37822-6_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-37821-9
Online ISBN: 978-3-030-37822-6
eBook Packages: Mathematics and Statistics (R0)