Subgradient Projection Algorithm

Convex Optimization with Computational Errors

Part of the book series: Springer Optimization and Its Applications ((SOIA,volume 155))

Abstract

In this chapter we study the subgradient projection algorithm for the minimization of convex nonsmooth functions and for computing the saddle points of convex–concave functions in the presence of computational errors. The problem is described by an objective function and a set of feasible points. Each iteration of the algorithm consists of two steps: the first computes a subgradient of the objective function, while the second computes a projection onto the feasible set. Each of these two steps is subject to a computational error, and in general the two errors are different. We show that the algorithm generates a good approximate solution whenever both computational errors are bounded from above by a small positive constant. Moreover, if the computational errors of the two steps are known, we determine what approximate solution can be obtained and how many iterations are needed to obtain it.
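The two-step iteration described above can be sketched in code. The following is an illustrative implementation, not the chapter's exact scheme: the helper names (`noisy_subgradient_projection`, `_unit_error`), the constant step size, and the error model (an additive perturbation of norm at most `delta_s` or `delta_p` in each step) are assumptions made for the sake of the example.

```python
import numpy as np


def _unit_error(rng, shape):
    """Random direction of unit norm, used to model a worst-case-bounded error."""
    e = rng.standard_normal(shape)
    n = np.linalg.norm(e)
    return e / n if n > 0 else e


def noisy_subgradient_projection(subgrad, project, x0, step, n_iters,
                                 delta_s=1e-3, delta_p=1e-3, rng=None):
    """Subgradient projection iteration with bounded computational errors.

    subgrad(x) -- returns a subgradient of the objective at x
    project(x) -- exact projection onto the feasible set
    delta_s    -- upper bound on the error of the subgradient step
    delta_p    -- upper bound on the error of the projection step
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for _ in range(n_iters):
        # Step 1: compute a subgradient, perturbed by an error of norm <= delta_s.
        g = subgrad(x) + delta_s * _unit_error(rng, x.shape)
        # Step 2: project onto the feasible set, perturbed by an error of norm <= delta_p.
        x = project(x - step * g) + delta_p * _unit_error(rng, x.shape)
        iterates.append(x.copy())
    return iterates


# Toy problem: minimize ||x||_1 over the feasible set {x : x >= 1},
# whose minimizer is the vector of ones; sign(x) is a subgradient of ||x||_1.
xs = noisy_subgradient_projection(
    subgrad=np.sign,
    project=lambda x: np.maximum(x, 1.0),
    x0=[3.0, 3.0], step=0.1, n_iters=50)
```

With both error bounds set to `1e-3`, the final iterate lands within a small neighborhood of the exact minimizer, consistent with the chapter's theme: small bounded errors in each step still yield a good approximate solution.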




Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Zaslavski, A.J. (2020). Subgradient Projection Algorithm. In: Convex Optimization with Computational Errors. Springer Optimization and Its Applications, vol 155. Springer, Cham. https://doi.org/10.1007/978-3-030-37822-6_2
