Abstract
We introduce a perturbed augmented Lagrangian method framework, which is a convenient tool for local analyses of convergence and rates of convergence of some modifications of the classical augmented Lagrangian algorithm. One example to which our development applies is the proximal augmented Lagrangian method. Previous results for this version required twice differentiability of the problem data, the linear independence constraint qualification, strict complementarity, and second-order sufficiency; or the linear independence constraint qualification and strong second-order sufficiency. We obtain a set of convergence properties under significantly weaker assumptions: once (not twice) differentiability of the problem data, uniqueness of the Lagrange multiplier, and second-order sufficiency (no linear independence constraint qualification and no strict complementarity); or even second-order sufficiency only. Another version to which the general framework applies is the smoothed augmented Lagrangian method, where the plus-function associated with penalization of inequality constraints is approximated by a family of smooth functions (so that the subproblems are twice differentiable if the problem data are). Furthermore, for all the modifications, inexact solution of subproblems is handled naturally. The presented framework also subsumes the basic augmented Lagrangian method, both exact and inexact.
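The abstract above can be made concrete with a small sketch. For a single inequality constraint \(g(x) \le 0\), the classical augmented Lagrangian (method of multipliers) iterates a subproblem minimization followed by the multiplier update \(\mu \leftarrow \max(0, \mu + \sigma g(x))\); the smoothed variant replaces the plus-function \(\max(0,t)\) by a smooth approximation such as \(\varphi_\varepsilon(t) = \tfrac12\bigl(t + \sqrt{t^2 + 4\varepsilon^2}\bigr)\), of the Chen–Mangasarian type cited in the abstract. The toy problem below (projecting \((1,1)\) onto the half-plane \(x_1 + x_2 \le 1\)), the penalty parameter, the step sizes, and the use of gradient descent for the inexact inner solves are all illustrative choices, not taken from the paper.

```python
import math

def f(x):            # objective: squared distance to (1, 1)
    return (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2

def grad_f(x):
    return [2.0 * (x[0] - 1.0), 2.0 * (x[1] - 1.0)]

def g(x):            # inequality constraint g(x) <= 0
    return x[0] + x[1] - 1.0

GRAD_G = (1.0, 1.0)  # constant gradient of the affine constraint

def multiplier_term(t, eps=None):
    """Gradient factor of the penalty term.

    Classical method: d/dt [max(0, t)^2 / 2] = max(0, t).
    Smoothed method:  d/dt [phi_eps(t)^2 / 2] = phi_eps(t) * phi_eps'(t),
    with phi_eps(t) = (t + sqrt(t^2 + 4*eps^2)) / 2 (Chen--Mangasarian type).
    """
    if eps is None:
        return max(0.0, t)
    r = math.sqrt(t * t + 4.0 * eps * eps)
    return 0.5 * (t + r) * 0.5 * (1.0 + t / r)

def solve_subproblem(mu, sigma, x0, eps=None, steps=1500, lr=0.1):
    # Inexact inner solve by plain gradient descent on the augmented Lagrangian
    # f(x) + (1/(2*sigma)) * (plus(mu + sigma*g(x))^2 - mu^2).
    x = list(x0)
    for _ in range(steps):
        m = multiplier_term(mu + sigma * g(x), eps)
        gf = grad_f(x)
        x = [x[i] - lr * (gf[i] + m * GRAD_G[i]) for i in range(2)]
    return x

def multiplier_method(sigma=4.0, iters=20, eps=None):
    x, mu = [0.0, 0.0], 0.0
    for _ in range(iters):
        x = solve_subproblem(mu, sigma, x, eps)   # primal step (warm-started)
        mu = max(0.0, mu + sigma * g(x))          # multiplier update
    return x, mu
```

On this problem both variants converge to the primal solution \((0.5, 0.5)\) with the unique multiplier \(\mu = 1\); the multiplier error contracts by roughly \(1/(1+\sigma)\) per outer iteration, illustrating the linear rate discussed in the paper. A proximal variant would additionally add a term \(\tfrac{1}{2\sigma}\|x - x^k\|^2\) to the subproblem objective.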
Data availability
Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.
Acknowledgements
This research was supported by the Russian Foundation for Basic Research Grants 19-51-12003 NNIO_a and 20-01-00106, by CNPq Grant 303913/2019-3, by FAPERJ Grant E-26/202.540/2019, by PRONEX–Optimization, and by the Volkswagen Foundation.
Additional information
Communicated by Boris S. Mordukhovich.
Dedicated to Professor Franco Giannessi on the occasion of his 85th birthday, and in appreciation of his longstanding selfless service for the success of JOTA.
Cite this article
Izmailov, A.F., Solodov, M.V. Perturbed Augmented Lagrangian Method Framework with Applications to Proximal and Smoothed Variants. J Optim Theory Appl 193, 491–522 (2022). https://doi.org/10.1007/s10957-021-01914-y
Keywords
- Augmented Lagrangian
- Proximal method of multipliers
- Smoothing
- Linear convergence
- Superlinear convergence
- Strong metric regularity
- Semistability
- Upper Lipschitz stability
- Second-order sufficient optimality conditions