Mathematical Programming, Volume 141, Issue 1–2, pp 319–348

A practical relative error criterion for augmented Lagrangians

Full Length Paper (Series A)

Abstract

This paper develops a new error criterion for the approximate minimization of augmented Lagrangian subproblems. This criterion is practical since it is readily testable given only a gradient (or subgradient) of the augmented Lagrangian. It is also “relative” in the sense of relative error criteria for proximal point algorithms: in particular, it uses a single relative tolerance parameter, rather than a summable parameter sequence. Our analysis first describes an abstract version of the criterion within Rockafellar’s general parametric convex duality framework, and proves a global convergence result for the resulting algorithm. Specializing this algorithm to a standard formulation of convex programming produces a version of the classical augmented Lagrangian method with a novel inexact solution condition for the subproblems. Finally, we present computational results drawn from the CUTE test set—including many nonconvex problems—indicating that the approach works well in practice.
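
To make the flavor of such a test concrete, the sketch below implements a classical augmented Lagrangian loop for equality-constrained problems in Python with SciPy (cited as references 19 and 31 below), stopping each inner minimization once the gradient of the augmented Lagrangian is small relative to the current constraint violation, governed by a single tolerance parameter sigma. The function `aug_lag_sketch` and the specific test ||grad L_c(x, lam)|| <= sigma * c * ||h(x)|| are assumptions introduced only for illustration; they convey the "relative" spirit of the criterion but are not the acceptance condition developed in the paper.

```python
# Illustrative sketch only: an augmented Lagrangian loop with a *relative*
# inner stopping test (a stand-in, NOT the paper's exact criterion).
# Problem:  min f(x)  subject to  h(x) = 0,  with  h: R^n -> R^m.
import numpy as np
from scipy.optimize import minimize

def aug_lag_sketch(f, grad_f, h, jac_h, x0, sigma=0.5, c=10.0,
                   tol=1e-6, max_outer=50):
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(np.atleast_1d(h(x)).shape)      # multiplier estimate

    for _ in range(max_outer):
        def L(z, lam=lam):          # augmented Lagrangian L_c(., lam)
            hz = np.atleast_1d(h(z))
            return f(z) + lam @ hz + 0.5 * c * (hz @ hz)

        def gradL(z, lam=lam):      # its gradient in x
            hz = np.atleast_1d(h(z))
            return grad_f(z) + np.atleast_2d(jac_h(z)).T @ (lam + c * hz)

        # Relative inner tolerance: proportional to the constraint
        # violation at the current outer iterate, rather than drawn
        # from a preset summable sequence of absolute tolerances.
        gtol = max(sigma * c * np.linalg.norm(np.atleast_1d(h(x))), 1e-12)
        res = minimize(L, x, jac=gradL, method='BFGS',
                       options={'gtol': gtol})
        x = res.x

        hx = np.atleast_1d(h(x))
        if np.linalg.norm(hx) <= tol and np.linalg.norm(gradL(x)) <= tol:
            break                    # approximate KKT point reached
        lam = lam + c * hx           # classical multiplier update
    return x, lam

# Example: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0.
x_star, lam_star = aug_lag_sketch(
    f=lambda x: x @ x,
    grad_f=lambda x: 2.0 * x,
    h=lambda x: np.array([x[0] + x[1] - 1.0]),
    jac_h=lambda x: np.array([[1.0, 1.0]]),
    x0=np.zeros(2))
print(x_star)   # close to [0.5, 0.5]; lam_star close to [-1.0]
```

With sigma near 0 the inner solves become nearly exact; with sigma near 1 they are very loose. Either way there is no summable forcing sequence to tune, which is the practical appeal described in the abstract.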

Mathematics Subject Classification

90C25, 90C30


References

  1. Andreani, R., Birgin, E.G., Martínez, J.M., Schuverdt, M.L.: On augmented Lagrangian methods with general lower-level constraints. SIAM J. Optim. 18(4), 1286–1309 (2007)
  2. Andreani, R., Birgin, E.G., Martínez, J.M., Schuverdt, M.L.: Augmented Lagrangian methods under the constant positive linear dependence constraint qualification. Math. Program. 111(1–2), 5–32 (2008)
  3. Andreani, R., Haeser, G., Schuverdt, M., Silva, P.J.S.: A relaxed constant positive linear dependence constraint qualification and applications. Math. Program. (2011). Published electronically, doi:10.1007/s10107-011-0456-0
  4. Andreani, R., Haeser, G., Schuverdt, M., Silva, P.J.S.: Two new weak constraint qualifications and applications. Optimization Online, http://www.optimization-online.org/DB_HTML/2011/07/3105.html (2011)
  5. Bertsekas, D.P.: Constrained Optimization and Lagrange Multiplier Methods. Academic Press, New York, NY (1982)
  6. Birgin, E.G., Fernández, D., Martínez, J.M.: On the boundedness of penalty parameters in an augmented Lagrangian method with constrained subproblems. Optim. Meth. Softw. (2012, in press)
  7. Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21(1), 123–160 (1995)
  8. Conn, A.R., Gould, N., Sartenaer, A., Toint, P.L.: Convergence properties of an augmented Lagrangian algorithm for optimization with a combination of general equality and linear constraints. SIAM J. Optim. 6(3), 674–703 (1996)
  9. Conn, A.R., Gould, N.I.M., Toint, P.L.: A globally convergent augmented Lagrangian algorithm for optimization with general constraints and simple bounds. SIAM J. Numer. Anal. 28(2), 545–572 (1991)
  10. Conn, A.R., Gould, N.I.M., Toint, P.L.: LANCELOT: A Fortran Package for Large-Scale Nonlinear Optimization (Release A). Springer, Berlin (1992)
  11. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
  12. Eckstein, J.: A practical general approximation criterion for methods of multipliers based on Bregman distances. Math. Program. 96(1), 61–86 (2003)
  13. Eckstein, J., Silva, P.J.S.: Proximal methods for nonlinear programming: double regularization and inexact subproblems. Comput. Optim. Appl. 46(2), 279–304 (2010)
  14. Fernández, D., Solodov, M.V.: Local convergence of exact and inexact augmented Lagrangian methods under the second-order sufficient optimality condition. Technical Report A677, Instituto Nacional de Matemática Pura e Aplicada (IMPA), Rio de Janeiro (2011)
  15. Friedlander, M.P., Saunders, M.A.: A globally convergent linearly constrained Lagrangian method for nonlinear optimization. SIAM J. Optim. 15(3), 863–897 (2005)
  16. Hager, W.W., Zhang, H.: ASA-CG source code. http://www.math.ufl.edu/~hager/papers/CG/
  17. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
  18. Hager, W.W., Zhang, H.: A new active set algorithm for box constrained optimization. SIAM J. Optim. 17(2), 526–557 (2006)
  19. Jones, E., Oliphant, T., Peterson, P., et al.: SciPy: open source scientific tools for Python. http://www.scipy.org/ (2001)
  20. Korpelevich, G.M.: Extrapolation gradient methods and their relation to modified Lagrange functions. 19(4), 694–703 (1983)
  21. Rockafellar, R.T.: Local boundedness of nonlinear, monotone operators. Michigan Math. J. 16, 397–407 (1969)
  22. Rockafellar, R.T.: Convex Analysis. Princeton University Press, Princeton, NJ (1970)
  23. Rockafellar, R.T.: On the maximality of sums of nonlinear monotone operators. Trans. Am. Math. Soc. 149, 75–88 (1970)
  24. Rockafellar, R.T.: Conjugate Duality and Optimization. SIAM, Philadelphia (1974)
  25. Rockafellar, R.T.: Augmented Lagrangians and applications of the proximal point algorithm in convex programming. Math. Oper. Res. 1(2), 97–116 (1976)
  26. Rockafellar, R.T.: Monotone operators and the proximal point algorithm. SIAM J. Control Optim. 14(5), 877–898 (1976)
  27. Solodov, M.V., Svaiter, B.F.: A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator. Set-Valued Anal. 7(4), 323–345 (1999)
  28. Solodov, M.V., Svaiter, B.F.: A hybrid projection-proximal point algorithm. J. Convex Anal. 6(1), 59–70 (1999)
  29. Solodov, M.V., Svaiter, B.F.: An inexact hybrid generalized proximal point algorithm and some new results on the theory of Bregman functions. Math. Oper. Res. 25(2), 214–230 (2000)
  30. Spingarn, J.E.: Partial inverse of a monotone operator. Appl. Math. Optim. 10(3), 247–265 (1983)
  31. van Rossum, G., et al.: Python language website. http://www.python.org/

Copyright information

© Springer and Mathematical Optimization Society 2012

Authors and Affiliations

  1. Jonathan Eckstein, Department of Management Science and Information Systems and RUTCOR, Rutgers University, Piscataway, USA
  2. Paulo J. S. Silva, Department of Computer Science, University of São Paulo, São Paulo, Brazil
