
A reduced Hessian SQP method for inequality constrained optimization

Computational Optimization and Applications

Abstract

This paper develops a reduced Hessian SQP method for solving inequality constrained optimization problems. At each iteration, the proposed method generates a search direction by solving a quadratic subproblem that is kept feasible through the introduction of a slack variable, and then computes the steplength by a standard line search along this direction using the ℓ penalty function as the merit function. A new update criterion is proposed for generating the quasi-Newton matrices, whose dimensions may vary, that approximate the reduced Hessian of the Lagrangian. Global convergence is established under mild conditions, and local R-linear and superlinear convergence are shown under additional assumptions.
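The abstract summarizes the iteration only at a high level. The sketch below, written in Python with NumPy and SciPy, illustrates the generic shape of such an iteration: a slack-relaxed quadratic subproblem to produce the search direction, a backtracking line search on a penalty merit function, and a guarded quasi-Newton update. The function names, the ℓ1 merit function, the Armijo-style acceptance test, and the curvature-based update guard are all illustrative assumptions; they stand in for, and do not reproduce, the paper's specific subproblem, steplength rule, and update criterion.

```python
# Illustrative sketch of one SQP iteration for  min f(x)  s.t.  c(x) <= 0.
# Assumptions (not taken from the paper): an l1 merit function, Armijo
# backtracking, and a curvature-guarded BFGS update of a full-space
# approximation B.  The paper instead updates a reduced (null-space)
# Hessian approximation whose dimension may change between iterations.

import numpy as np
from scipy.optimize import minimize


def qp_subproblem(g, A, cv, B, rho=1e3):
    """Slack-relaxed QP:  min  g'd + 0.5 d'Bd + rho*t
                          s.t. cv + A d <= t*e,  t >= 0.
    The scalar slack t keeps the subproblem feasible even when the
    linearized constraints are inconsistent."""
    n = len(g)

    def obj(z):
        d, t = z[:n], z[n]
        return g @ d + 0.5 * d @ B @ d + rho * t

    cons = [
        {"type": "ineq", "fun": lambda z: z[n] - (cv + A @ z[:n])},  # t*e - (cv + A d) >= 0
        {"type": "ineq", "fun": lambda z: z[n]},                     # t >= 0
    ]
    res = minimize(obj, np.zeros(n + 1), constraints=cons, method="SLSQP")
    return res.x[:n]


def merit(x, mu, f, c):
    """l1 exact penalty merit function (an assumed choice)."""
    return f(x) + mu * np.sum(np.maximum(c(x), 0.0))


def sqp_step(x, B, f, grad_f, c, jac_c, mu=10.0):
    """One iteration: direction from the QP, backtracking line search on
    the merit function, then a guarded BFGS update of B."""
    g = grad_f(x)
    d = qp_subproblem(g, jac_c(x), c(x), B)

    # Backtracking (Armijo-style) line search on the merit function.
    alpha, phi0 = 1.0, merit(x, mu, f, c)
    while merit(x + alpha * d, mu, f, c) > phi0 - 1e-4 * alpha * (d @ B @ d):
        alpha *= 0.5
        if alpha < 1e-12:
            break
    x_new = x + alpha * d

    # BFGS update, skipped when the curvature condition fails
    # (a crude stand-in for the paper's update criterion).
    s, y = x_new - x, grad_f(x_new) - g
    if s @ y > 1e-8 * (s @ s):
        Bs = B @ s
        B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
    return x_new, B
```

With f, grad_f, c, and jac_c supplied as NumPy-based callables and B initialized to the identity, repeatedly calling sqp_step gives a rough picture of how the pieces named in the abstract fit together; the paper's analysis concerns the reduced-dimension update and the specific criterion under which it is applied.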



Author information

Corresponding author

Correspondence to Tao-Wen Liu.

Additional information

This work was partially supported by the National Natural Science Foundation of China under grant 10671060.



Cite this article

Liu, TW. A reduced Hessian SQP method for inequality constrained optimization. Comput Optim Appl 49, 31–59 (2011). https://doi.org/10.1007/s10589-009-9285-y
