Abstract
This paper examines the computational complexity certification of the fast gradient method for solving the dual of a parametric convex program. To this end, a lower iteration bound is derived which guarantees that, for all parameters from a compact set, a solution with a specified level of suboptimality is obtained. Because of its practical importance, the derivation of the smallest such lower iteration bound is considered. In order to determine it, we investigate both the computation of the worst-case minimal Euclidean distance between an initial iterate and a Lagrange multiplier and the issue of finding the largest step size for the fast gradient method. In addition, we argue that optimal preconditioning of the dual problem cannot be proven to decrease the smallest lower iteration bound. The findings of this paper are of importance in embedded optimization, for instance, in model predictive control.
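To illustrate the certification idea in the simplest setting (this is an illustrative sketch based on the standard textbook convergence estimate for the fast gradient method, not the paper's own derivation): if the fast gradient method satisfies the estimate f(x_i) − f* ≤ 4 L d² / (i + 2)², where L is the Lipschitz constant of the gradient and d bounds the distance from the initial iterate to a minimizer (in the dual setting, to a Lagrange multiplier), then a sufficient iteration count for ε-suboptimality can be computed in closed form:

```python
import math

def iteration_bound(L, d, eps):
    """Smallest integer i with 4 * L * d**2 / (i + 2)**2 <= eps,
    i.e. a sufficient iteration count for eps-suboptimality under
    the standard fast gradient estimate. Solving the inequality for
    i gives i >= 2 * d * sqrt(L / eps) - 2."""
    return max(0, math.ceil(2.0 * d * math.sqrt(L / eps) - 2.0))

# Example: L = 100, worst-case initial distance d = 1, accuracy eps = 1.
bound = iteration_bound(100.0, 1.0, 1.0)
```

Note that such a bound grows with the worst-case distance d, which is why the paper studies the computation of the worst-case minimal distance between an initial iterate and a Lagrange multiplier over the whole parameter set.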
Notes
From a certification point of view, a constant step size is not a limitation, since for gradient methods in convex optimization advanced step size rules, e.g. exact line search, do not yield better convergence rate results (Polyak 1987, §3.1.2).
References
Bank B, Guddat J, Klatte D, Kummer B, Tammer K (1982) Non-linear parametric optimization. Akademie-Verlag, Berlin
Bertsekas DP (1999) Nonlinear programming, 2nd edn. Athena Scientific, Massachusetts
Bertsekas DP (2009) Convex optimization theory, 1st edn. Athena Scientific, Massachusetts
Blackmore L, Açikmese B, Scharf D (2010) Minimum-landing-error powered-descent guidance for Mars landing using convex optimization. J Guid Control Dyn 33(4):1161–1171
Bleris L, Kothare M (2005) Real-time implementation of model predictive control. In: American control conference, vol 6, pp 4166–4171
Boyd S, Vandenberghe L (2004) Convex optimization. Cambridge University Press, Cambridge
Boyd S, Ghaoui LE, Feron E, Balakrishnan V (1994) Linear matrix inequalities in system and control theory. SIAM studies in applied mathematics, vol 15. SIAM, Philadelphia
Defraene B, van Waterschoot T, Ferreau HJ, Diehl M, Moonen M (2012) Real-time perception-based clipping of audio signals using convex optimization. IEEE Trans Audio Speech Lang Process 20(10):2657–2671
Devolder O, Glineur F, Nesterov Y (2011) First-order methods of smooth convex optimization with inexact oracle. Math Program (submitted). Available at http://www.optimizationonline.org/DBFILE/2010/12/2865.pdf
Devolder O, Glineur F, Nesterov Y (2012) Double smoothing technique for large-scale linearly constrained convex optimization. SIAM J Optim 22(2):702–727
Doan M, Keviczky T, De Schutter B (2011) A dual decomposition-based optimization method with guaranteed primal feasibility for hierarchical MPC problems. In: 18th IFAC world congress
Fuchs A, Mariéthoz S, Larsson M, Morari M (2011) Grid stabilization through VSC-HVDC using wide area measurements. In: IEEE PowerTech, Norway
Gauvin J (1977) A necessary and sufficient regularity condition to have bounded multipliers in nonconvex programming. Math Program 12:136–138
Gol’šteǐn EG (1972) Theory of convex programming, vol 36. American Mathematical Society, Providence
Lan G, Monteiro RD (2009) Iteration-complexity of first-order augmented Lagrangian methods for convex programming. Math Program (submitted). Available at http://www.optimizationonline.org/DB_HTML/2009/05/2294.html
Lemaréchal C (2001) Lagrangian relaxation. In: Junger M, Naddef D (eds) Computational combinatorial optimization. Lecture notes in computer science, vol 2241. Springer, Berlin, Heidelberg, pp 112–156
McGovern LK (2000) Computational analysis of real-time convex optimization for control systems. Thesis, Massachusetts Institute of Technology
Nedić A, Ozdaglar A (2009) Approximate primal solutions and rate analysis for dual subgradient methods. SIAM J Optim 19(4):1757–1780
Nesterov Y (1983) A method for solving a convex programming problem with convergence rate \(1/k^2\). Soviet Math Dokl 27(2):372–376
Nesterov Y (2004a) Introductory lectures on convex optimization. Springer, Berlin
Nesterov Y (2004b) Smooth minimization of non-smooth functions. Math Program 103(1):127–152
Nesterov Y, Nemirovskii A (1994) Interior-point polynomial algorithms in convex programming. SIAM studies in applied mathematics. SIAM, Philadelphia
Polyak BT (1987) Introduction to optimization. Optimization Software
Rawlings JB, Mayne DQ (2009) Model predictive control: theory and design. Nob Hill Publishing, Madison
Richter S, Morari M, Jones CN (2011) Towards computational complexity certification for constrained MPC based on Lagrange relaxation and the fast gradient method. In: Conference on decision and control (CDC), Orlando
Richter S, Jones CN, Morari M (2012) Computational complexity certification for real-time MPC with input constraints based on the fast gradient method. IEEE Trans Autom Control 57(6):1391–1403
Rockafellar RT (1997) Convex analysis. Princeton University Press, Princeton
Schmidt M, Roux NL, Bach F (2011) Convergence rates of inexact proximal-gradient methods for convex optimization. arXiv:1109.2415
Tseng P (2008) On accelerated proximal gradient methods for convex-concave optimization. SIAM J Optim (submitted)
Acknowledgments
The authors would like to thank D. Klatte for helpful comments and the anonymous reviewer for the valuable suggestions that helped to improve this paper.
Cite this article
Richter, S., Jones, C.N. & Morari, M. Certification aspects of the fast gradient method for solving the dual of parametric convex programs. Math Meth Oper Res 77, 305–321 (2013). https://doi.org/10.1007/s00186-012-0420-7