Convergent Stepsize Rules for Subgradient Optimization

  • Conference paper

Part of the book series: Operations Research Proceedings (ORP, volume 1997)

Abstract

We study subgradient methods for convex optimization that use projections onto successive approximations of level sets of the objective, corresponding to estimates of the optimal value. We establish global convergence in objective values for simple level controls, without requiring that the feasible set be compact. The framework accommodates accelerations based on “cheap” projections, surrogate constraints, and conjugate subgradient techniques.
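The basic level idea described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the test function, the fixed level gap `delta`, and the iteration count are assumptions chosen for the example. At each step, the current best value minus `delta` serves as a level target, and a Polyak-type stepsize moves the iterate toward (a linearization of) the corresponding level set.

```python
import numpy as np

# Hypothetical nonsmooth convex test problem (not from the paper):
# minimize f(x) = |x1| + 2|x2|, with optimal value 0 at the origin.

def f(x):
    return abs(x[0]) + 2 * abs(x[1])

def subgrad(x):
    # one valid subgradient of f at x (any sign in [-1, 1] works at a kink)
    s1 = 1.0 if x[0] >= 0 else -1.0
    s2 = 2.0 if x[1] >= 0 else -2.0
    return np.array([s1, s2])

def subgradient_level(x0, delta=0.1, iters=500):
    """Subgradient steps toward a level set estimated from the best value seen."""
    x = np.asarray(x0, dtype=float)
    f_best = f(x)
    for _ in range(iters):
        g = subgrad(x)
        f_lev = f_best - delta            # simple level control: aim below the record value
        t = (f(x) - f_lev) / (g @ g)      # Polyak-type stepsize toward the level; this is
        x = x - t * g                     # the projection of x onto the halfspace
                                          # {y : f(x) + g.(y - x) <= f_lev}
        f_best = min(f_best, f(x))
    return f_best

print(subgradient_level(np.array([3.0, -2.0])))
```

With a fixed gap `delta`, the best recorded value can only be driven to within roughly `delta` of the optimum; the level controls analyzed in the paper adjust such targets adaptively to obtain convergence to the optimal value itself.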





Copyright information

© 1998 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kiwiel, K.C., Larsson, T., Lindberg, P.O. (1998). Convergent Stepsize Rules for Subgradient Optimization. In: Operations Research Proceedings 1997. Operations Research Proceedings, vol 1997. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-58891-4_9

  • DOI: https://doi.org/10.1007/978-3-642-58891-4_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-64240-4

  • Online ISBN: 978-3-642-58891-4
