
Second-order negative-curvature methods for box-constrained and general constrained optimization


Abstract

This paper introduces a Nonlinear Programming algorithm that converges to second-order stationary points. The main tool is a second-order negative-curvature method for box-constrained minimization of a certain class of functions that do not possess continuous second derivatives. This method is used to define an Augmented Lagrangian algorithm of PHR (Powell-Hestenes-Rockafellar) type. Convergence proofs under weak constraint qualifications are given. Numerical examples are exhibited showing that the new method converges to second-order stationary points in situations in which first-order methods fail.
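The abstract describes the structure of the method rather than any code, so the following is only a minimal sketch, in Python with NumPy, of the two ingredients it names: a PHR-type Augmented Lagrangian outer loop for inequality constraints, and a box-constrained inner solver that takes a negative-curvature step (an eigenvector of the Hessian restricted to the free variables) once the projected gradient vanishes. The function names (`phr_augmented_lagrangian`, `box_negative_curvature`, `backtrack`) and the toy saddle-point problem are illustrative assumptions, not the authors' algorithm or implementation, and the sketch omits the safeguards, stopping tests, and non-smooth second-derivative analysis developed in the paper.

```python
# Minimal sketch (assumed names, not the paper's code): PHR Augmented Lagrangian
# outer loop with a box-constrained inner solver that exploits negative curvature.
import numpy as np


def phr_augmented_lagrangian(f, grad_f, hess_f, g, jac_g, l, u, x0,
                             rho=10.0, tol=1e-6, max_outer=30):
    """Minimize f(x) subject to g(x) <= 0 and l <= x <= u (PHR-type outer loop)."""
    x, mu = np.asarray(x0, dtype=float), np.zeros(len(g(x0)))
    for _ in range(max_outer):
        # PHR Augmented Lagrangian for inequality constraints:
        #   L_rho(x, mu) = f(x) + (rho/2) * || max(0, g(x) + mu/rho) ||^2
        def L(xx):
            s = np.maximum(0.0, g(xx) + mu / rho)
            return f(xx) + 0.5 * rho * s @ s

        def gradL(xx):
            s = np.maximum(0.0, g(xx) + mu / rho)
            return grad_f(xx) + rho * jac_g(xx).T @ s

        def hessL(xx):
            # Gauss-Newton-style Hessian of the penalty term; second derivatives
            # of the constraints are omitted in this sketch.
            s = np.maximum(0.0, g(xx) + mu / rho)
            J = jac_g(xx)[s > 0.0]
            return hess_f(xx) + rho * J.T @ J

        x = box_negative_curvature(L, gradL, hessL, l, u, x)
        mu = np.maximum(0.0, mu + rho * g(x))       # PHR multiplier update
        if np.all(g(x) <= tol):
            break
        rho *= 10.0                                 # still infeasible: increase penalty
    return x, mu


def box_negative_curvature(L, gradL, hessL, l, u, x, tol=1e-8, max_iter=200):
    """Projected-gradient steps plus a negative-curvature escape on the free face."""
    for _ in range(max_iter):
        grd = gradL(x)
        pg = np.clip(x - grd, l, u) - x             # projected-gradient direction
        if np.linalg.norm(pg) > tol:
            x = backtrack(L, x, pg, l, u)
            continue
        # First-order stationary on the box: look for negative curvature among
        # the variables strictly inside their bounds (the free variables).
        free = (x > l + tol) & (x < u - tol)
        if not free.any():
            return x
        lam, V = np.linalg.eigh(hessL(x)[np.ix_(free, free)])
        if lam[0] >= -tol:
            return x                                # second-order stationary point
        d = np.zeros_like(x)
        d[free] = V[:, 0]                           # direction of most negative curvature
        if grd @ d > 0.0:
            d = -d                                  # make it a non-ascent direction
        x = backtrack(L, x, d, l, u)
    return x


def backtrack(L, x, d, l, u, alpha=1.0, beta=0.5, max_bt=40):
    """Backtrack along d, projecting onto the box, until L decreases."""
    base = L(x)
    for _ in range(max_bt):
        xn = np.clip(x + alpha * d, l, u)
        if L(xn) < base:
            return xn
        alpha *= beta
    return x


if __name__ == "__main__":
    # Toy problem: min x0*x1 s.t. x0 + x1 <= 1, -1 <= x <= 1.  The origin is a
    # first-order stationary saddle; the negative-curvature step escapes it.
    f = lambda x: x[0] * x[1]
    grad_f = lambda x: np.array([x[1], x[0]])
    hess_f = lambda x: np.array([[0.0, 1.0], [1.0, 0.0]])
    g = lambda x: np.array([x[0] + x[1] - 1.0])
    jac_g = lambda x: np.array([[1.0, 1.0]])
    x_star, mu_star = phr_augmented_lagrangian(f, grad_f, hess_f, g, jac_g,
                                               l=-np.ones(2), u=np.ones(2),
                                               x0=np.zeros(2))
    print("solution:", x_star, "multiplier:", mu_star)
```

In the toy run the inner solver starts at the origin, where the gradient is zero, so a purely first-order projected-gradient method would stop there; the negative-curvature direction moves the iterate off the saddle and the outer loop terminates at a feasible vertex minimizer.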



Author information

Correspondence to E. G. Birgin.

Additional information

This work was supported by PRONEX-Optimization (PRONEX—CNPq/FAPERJ E-26/171.510/2006—APQ1), FAPESP (Grants 2006/53768-0 and 2005/57684-2) and CNPq.


About this article

Cite this article

Andreani, R., Birgin, E.G., Martínez, J.M. et al. Second-order negative-curvature methods for box-constrained and general constrained optimization. Comput Optim Appl 45, 209–236 (2010). https://doi.org/10.1007/s10589-009-9240-y
