Continuous Subgradient Method

Part of the book series: Springer Optimization and Its Applications (SOIA, volume 108)

Abstract

In this chapter we study the continuous subgradient algorithm for the minimization of convex functions in the presence of computational errors. We show that the algorithm generates a good approximate solution if the computational errors are bounded from above by a small positive constant. Moreover, for a known computational error, we determine what approximate solution can be obtained and how much time is needed to obtain it.
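
The abstract concerns a continuous-time subgradient flow followed under an inexact subgradient oracle. As an informal sketch only, and not the book's construction, the following Python snippet discretizes a flow of the form x'(t) = -g(x(t)) with a forward-Euler step, where g(x) is a subgradient evaluated with error of norm at most delta; the function names, the step size h, the horizon T, and the error model are assumptions introduced for illustration.

import numpy as np

def noisy_subgradient(subgrad, x, delta, rng):
    # Hypothetical oracle: an exact subgradient plus an error of norm at most delta.
    g = subgrad(x)
    e = rng.standard_normal(g.shape)
    e *= delta / max(np.linalg.norm(e), 1e-12)  # scale the error to norm delta
    return g + e

def continuous_subgradient(subgrad, x0, T=50.0, h=1e-2, delta=1e-3, seed=0):
    # Forward-Euler discretization of the trajectory x'(t) = -g(x(t)),
    # where g(x) is a subgradient computed with error bounded by delta.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(int(T / h)):
        x = x - h * noisy_subgradient(subgrad, x, delta, rng)
    return x

# Example: f(x) = ||x||_1, whose subgradient away from zero is sign(x).
x_T = continuous_subgradient(np.sign, x0=np.array([3.0, -2.0, 1.0]))
print(x_T)  # near the minimizer 0, up to an error governed by delta and h

With the default parameters the iterate is driven toward the minimizer of f(x) = ||x||_1, and the residual error is controlled by the oracle bound delta together with the step size, which mirrors the abstract's claim that computational errors bounded by a small constant still yield a good approximate solution.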

Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Zaslavski, A.J. (2016). Continuous Subgradient Method. In: Numerical Optimization with Computational Errors. Springer Optimization and Its Applications, vol 108. Springer, Cham. https://doi.org/10.1007/978-3-319-30921-7_14
