Abstract
In this paper we develop a new affine-invariant primal–dual subgradient method for nonsmooth convex optimization problems. The scheme is based on a self-concordant barrier for the basic feasible set and is suitable for finding approximate solutions with a certain relative accuracy. We discuss several applications of this technique, including the fractional covering problem, the maximal concurrent flow problem, semidefinite relaxations, and nonlinear online optimization. For all these problems, the rate of convergence of our method does not depend on the problem's data.
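To fix ideas, the following is a minimal sketch of a subgradient scheme over a simple feasible set, applied to a piecewise-linear minimax objective of the kind arising in fractional covering. It uses entropic mirror descent over the probability simplex as a stand-in; it is not the barrier subgradient method developed in the paper, and the function name and parameters are illustrative only.

```python
import math

def entropic_mirror_descent(rows, n, T=500):
    # Minimize f(x) = max_i <a_i, x> over the probability simplex in R^n.
    # Generic entropic mirror descent (exponentiated gradient); a stand-in
    # illustration, NOT the paper's barrier subgradient method.
    x = [1.0 / n] * n                     # barycenter of the simplex
    best_x, best_f = list(x), float("inf")
    # Bound on subgradient entries, used in the step-size rule.
    L = max(abs(v) for row in rows for v in row)
    for t in range(1, T + 1):
        # Evaluate f(x); a maximizing row is a subgradient of the max.
        vals = [sum(a * xi for a, xi in zip(row, x)) for row in rows]
        f = max(vals)
        g = rows[vals.index(f)]
        if f < best_f:
            best_f, best_x = f, list(x)
        # Standard O(1/sqrt(t)) step size for mirror descent.
        step = math.sqrt(2.0 * math.log(n)) / (L * math.sqrt(t))
        # Exponentiated-gradient update, then renormalize onto the simplex.
        w = [xi * math.exp(-step * gi) for xi, gi in zip(x, g)]
        s = sum(w)
        x = [wi / s for wi in w]
    return best_x, best_f
```

For instance, with rows `[[1, 0], [0, 1]]` the minimax value over the simplex is 0.5, attained at (0.5, 0.5). The entropy function here plays the role that a self-concordant barrier plays in the paper's scheme: it adapts the subgradient step to the geometry of the feasible set.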
Additional information
The research results presented in this paper have been supported by a grant "Action de recherche concertée ARC 04/09-315" from the "Direction de la recherche scientifique - Communauté française de Belgique". The scientific responsibility rests with the author.
Cite this article
Nesterov, Y. Barrier subgradient method. Math. Program. 127, 31–56 (2011). https://doi.org/10.1007/s10107-010-0421-3
Keywords
- Convex optimization
- Subgradient methods
- Non-smooth optimization
- Minimax problems
- Saddle points
- Variational inequalities
- Stochastic optimization
- Black-box methods
- Lower complexity bounds