Abstract
We study subgradient methods for convex optimization that use projections onto successive approximations of level sets of the objective corresponding to estimates of the optimal value. We establish global convergence in objective values for simple level controls without requiring that the feasible set be compact. Our framework can accommodate accelerations based on "cheap" projections, surrogate constraints, and conjugate subgradient techniques.
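The level-control idea can be illustrated with a minimal sketch: maintain a target level below the best objective value seen so far, and take a Polyak-type step that projects the current iterate onto the halfspace where the linearization of the objective does not exceed that level. This is only an illustrative toy, not the authors' algorithm; the function name `subgradient_level` and the fixed level gap `delta` are assumptions for the example.

```python
import numpy as np

def subgradient_level(f, subgrad, x0, delta=0.5, iters=200):
    # Toy subgradient level method (illustrative, not the paper's scheme):
    # target level f_lev = f_best - delta, where delta > 0 estimates how far
    # f_best is from the optimal value.
    x = np.asarray(x0, dtype=float)
    f_best = f(x)
    for _ in range(iters):
        fx = f(x)
        f_best = min(f_best, fx)
        f_lev = f_best - delta            # current estimate of the optimal value
        g = subgrad(x)
        gn2 = float(np.dot(g, g))
        if gn2 == 0.0:                    # zero subgradient: x minimizes f
            break
        # Step length so that x lands on the hyperplane
        # f(x) + <g, y - x> = f_lev, i.e. a projection onto a halfspace
        # approximating the level set {y : f(y) <= f_lev}.
        t = (fx - f_lev) / gn2
        x = x - t * g
    return x, f_best

# Example: f(x) = ||x||_1, minimized at the origin with value 0.
f = lambda x: float(np.abs(x).sum())
sg = lambda x: np.sign(x)
x, f_best = subgradient_level(f, sg, [3.0, -2.0], delta=0.5)
```

With a constant gap `delta`, such a scheme can only be expected to drive the best value to within about `delta` of the optimum; the level controls analyzed in the paper adjust the target adaptively to obtain convergence to the optimal value.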
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Kiwiel, K.C., Larsson, T., Lindberg, P.O. (1998). Convergent Stepsize Rules for Subgradient Optimization. In: Operations Research Proceedings 1997. Operations Research Proceedings, vol 1997. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-58891-4_9
DOI: https://doi.org/10.1007/978-3-642-58891-4_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-64240-4
Online ISBN: 978-3-642-58891-4