
A second-order optimality condition with first- and second-order complementarity associated with global convergence of algorithms

Published in: Computational Optimization and Applications

Abstract

We develop a new notion of second-order complementarity with respect to the tangent subspace related to second-order necessary optimality conditions, via the introduction of so-called tangent multipliers. We prove that, around a local minimizer, a second-order stationarity residual can be driven to zero while controlling the growth of the Lagrange multipliers and tangent multipliers; this yields a new second-order optimality condition, free of constraint qualifications, that is stronger than previous conditions associated with the global convergence of algorithms. We prove that second-order variants of augmented Lagrangian methods (under an additional smoothness assumption based on the Łojasiewicz inequality) and of interior point methods generate sequences satisfying our optimality condition. We also present a companion minimal constraint qualification, weaker than the ones known for second-order methods, that ensures the usual global convergence results to a classical second-order stationary point. Finally, our optimality condition naturally suggests a definition of second-order stationarity suitable for the computation of iteration complexity bounds and for the definition of stopping criteria.
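For context, the abstract's "second-order necessary optimality conditions with respect to the tangent subspace" refer to the classical weak second-order condition from nonlinear programming. The following is a standard textbook formulation, not the paper's new condition (which additionally introduces tangent multipliers):

```latex
% Nonlinear program: minimize f(x) subject to h(x) = 0, g(x) <= 0,
% with Lagrangian L(x,\lambda,\mu) = f(x) + \lambda^\top h(x) + \mu^\top g(x).
%
% At a KKT point x^* with multipliers (\lambda, \mu), define the tangent
% subspace associated with the active constraints A(x^*) = {j : g_j(x^*) = 0}:
\[
  S = \bigl\{\, d :
      \nabla h_i(x^*)^{\top} d = 0 \ \ \forall i, \quad
      \nabla g_j(x^*)^{\top} d = 0 \ \ \forall j \in A(x^*) \,\bigr\}.
\]
% The weak second-order necessary condition then requires positive
% semidefiniteness of the Hessian of the Lagrangian over S:
\[
  d^{\top}\, \nabla^2_{xx} L(x^*, \lambda, \mu)\, d \;\ge\; 0
  \quad \text{for all } d \in S.
\]
```

The paper's contribution, as the abstract describes, is a sequential counterpart of this condition that holds at any local minimizer without constraint qualifications, with the growth of both the Lagrange multipliers and the new tangent multipliers kept under control.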



Acknowledgements

This work was supported by FAPESP (Grants 2013/05475-7 and 2016/02092-8) and CNPq.

Author information


Correspondence to Gabriel Haeser.


Cite this article

Haeser, G. A second-order optimality condition with first- and second-order complementarity associated with global convergence of algorithms. Comput Optim Appl 70, 615–639 (2018). https://doi.org/10.1007/s10589-018-0005-3

