
Augmented Lagrangian Method with Alternating Constraints for Nonlinear Optimization Problems

  • Siti Nor Habibah Binti Hassan (corresponding author)
  • Tomohiro Niimi
  • Nobuo Yamashita

Abstract

The augmented Lagrangian method is a classical solution method for nonlinear optimization problems. At each iteration, it minimizes an augmented Lagrangian function that consists of the constraint functions and the corresponding Lagrange multipliers. If the Lagrange multipliers in the augmented Lagrangian function are close to the exact Lagrange multipliers at an optimal solution, the method converges steadily. Since the conventional augmented Lagrangian method uses inexact estimates of the Lagrange multipliers, it sometimes converges slowly. In this paper, we propose a novel augmented Lagrangian method in which the augmented Lagrangian function and its minimization subproblem may have variable constraints at each iteration. This flexibility enables the new method to obtain more accurate estimates of the Lagrange multipliers by exploiting Karush–Kuhn–Tucker points of the subproblems, and consequently to converge more efficiently and steadily.
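
For orientation, here is a minimal sketch of the classical method the abstract refers to, in the Hestenes–Powell form for the equality-constrained problem $\min_x f(x)$ subject to $c(x)=0$; the symbols $f$, $c$, $\lambda$ and $\rho$ are illustrative and are not taken from the paper's own notation. The augmented Lagrangian is

$$
L_\rho(x,\lambda) \;=\; f(x) + \lambda^{\top} c(x) + \frac{\rho}{2}\,\lVert c(x)\rVert^{2},
$$

and one outer iteration minimizes it in $x$ with the multipliers fixed, then updates the multiplier estimate from the remaining constraint violation:

$$
x^{k+1} \in \operatorname*{arg\,min}_{x}\; L_\rho\bigl(x,\lambda^{k}\bigr),
\qquad
\lambda^{k+1} = \lambda^{k} + \rho\, c\bigl(x^{k+1}\bigr).
$$

The accuracy of the estimate $\lambda^{k}$ governs how quickly these outer iterations converge; this is precisely the quantity that the proposed method aims to estimate more accurately from Karush–Kuhn–Tucker points of its subproblems.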

Keywords

Augmented Lagrangian functions · Gradient descent method · Large-scale problem · Nonlinear optimization

Mathematics Subject Classification

26A16 · 41A25 · 47B36


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  • Siti Nor Habibah Binti Hassan (1, corresponding author)
  • Tomohiro Niimi (2)
  • Nobuo Yamashita (3)
  1. Faculty of Mechanical Engineering, Centre for Advanced Research on Energy, Universiti Teknikal Malaysia Melaka, Hang Tuah Jaya, Durian Tunggal, Malaysia
  2. Financial Market Department, Bank of Japan, Tokyo, Japan
  3. Department of Applied Mathematics and Physics, Graduate School of Informatics, Kyoto University, Kyoto, Japan
