Perturbed Augmented Lagrangian Method Framework with Applications to Proximal and Smoothed Variants

Journal of Optimization Theory and Applications

Abstract

We introduce a perturbed augmented Lagrangian method framework, which is a convenient tool for local analyses of convergence and rates of convergence of some modifications of the classical augmented Lagrangian algorithm. One example to which our development applies is the proximal augmented Lagrangian method. Previous results for this version required twice differentiability of the problem data, the linear independence constraint qualification, strict complementarity, and second-order sufficiency; or the linear independence constraint qualification and strong second-order sufficiency. We obtain a set of convergence properties under significantly weaker assumptions: once (not twice) differentiability of the problem data, uniqueness of the Lagrange multiplier, and second-order sufficiency (no linear independence constraint qualification and no strict complementarity); or even second-order sufficiency only. Another version to which the general framework applies is the smoothed augmented Lagrangian method, where the plus-function associated with penalization of inequality constraints is approximated by a family of smooth functions (so that the subproblems are twice differentiable if the problem data are). Furthermore, for all the modifications, inexact solution of subproblems is handled naturally. The presented framework also subsumes the basic augmented Lagrangian method, both exact and inexact.
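
For orientation only, the following sketch, written in standard notation rather than taken from the paper itself, records the classical augmented Lagrangian iteration and the two modifications named above; the penalty parameter \(c_k\), the form of the proximal term, and the particular smoothing \(\varphi_\varepsilon\) (the Chen–Mangasarian "neural network" function) are illustrative assumptions, not necessarily the authors' exact formulation.

For the inequality-constrained problem \(\min_x f(x)\) subject to \(g(x) \le 0\), the classical augmented Lagrangian and its iteration are
\[
L_c(x,\mu) = f(x) + \frac{1}{2c}\sum_i \Bigl(\bigl[\max\{0,\ \mu_i + c\,g_i(x)\}\bigr]^2 - \mu_i^2\Bigr),
\]
\[
x^{k+1} \approx \mathop{\rm arg\,min}_x L_{c_k}(x,\mu^k), \qquad \mu_i^{k+1} = \max\{0,\ \mu_i^k + c_k\,g_i(x^{k+1})\}.
\]
The proximal variant adds a primal regularization to the subproblem,
\[
x^{k+1} \approx \mathop{\rm arg\,min}_x \Bigl(L_{c_k}(x,\mu^k) + \tfrac{1}{2c_k}\,\|x - x^k\|^2\Bigr),
\]
while the smoothed variant replaces the plus-function \(t_+ = \max\{0,t\}\) inside \(L_c\) by a smooth approximation such as \(\varphi_\varepsilon(t) = \varepsilon\ln\bigl(1 + e^{t/\varepsilon}\bigr)\), which tends to \(t_+\) as \(\varepsilon \downarrow 0\), so that the subproblem objective is twice differentiable whenever \(f\) and \(g\) are. The "\(\approx\)" signs reflect that the subproblems may be solved only inexactly.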


Data availability

Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.


Acknowledgements

This research was supported by the Russian Foundation for Basic Research Grants 19-51-12003 NNIO_a and 20-01-00106, by CNPq Grant 303913/2019-3, by FAPERJ Grant E-26/202.540/2019, by PRONEX–Optimization, and by the Volkswagen Foundation.

Author information

Correspondence to A. F. Izmailov.

Additional information

Communicated by Boris S. Mordukhovich.

Dedicated to Professor Franco Giannessi on the occasion of his 85th birthday, and in appreciation of his longstanding selfless service for the success of JOTA.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Izmailov, A.F., Solodov, M.V. Perturbed Augmented Lagrangian Method Framework with Applications to Proximal and Smoothed Variants. J Optim Theory Appl 193, 491–522 (2022). https://doi.org/10.1007/s10957-021-01914-y
