A Piecewise Line-Search Technique for Maintaining the Positive Definiteness of the Matrices in the SQP Method

Published in: Computational Optimization and Applications

Abstract

A technique for maintaining the positive definiteness of the matrices in the quasi-Newton version of the SQP algorithm is proposed. In our algorithm, the matrices that are updated approximate the Hessian of the augmented Lagrangian. Their positive definiteness in the space tangent to the constraint manifold is ensured by a so-called piecewise line-search technique, while their positive definiteness in a complementary subspace is obtained by setting the augmentation parameter. In our experiments, the combination of these two ideas leads to a new algorithm that turns out to be more robust and often improves the results obtained with other approaches.
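For background, recall why positive definiteness of a quasi-Newton approximation can be lost and how it is commonly preserved: the BFGS update keeps a positive definite matrix positive definite exactly when the curvature condition s'y > 0 holds along the step. The sketch below illustrates this with Powell's damping, a standard safeguard that perturbs y toward Bs when curvature is insufficient. It is a generic illustration of the positive definiteness issue, not an implementation of the piecewise line-search technique of this paper; the function name and the damping threshold 0.2 are conventional choices, not taken from the article.

```python
import numpy as np

def damped_bfgs_update(B, s, y, damping=0.2):
    """BFGS update of a positive definite approximation B.

    Plain BFGS preserves positive definiteness only if s'y > 0.
    Powell's damping replaces y by a convex combination with B s
    whenever s'y is too small, so the update stays positive definite.
    """
    Bs = B @ s
    sBs = s @ Bs          # curvature of the current model along s
    sy = s @ y            # actual curvature gathered along the step
    if sy < damping * sBs:
        # Damp: interpolate between y and B s so that s'y = damping * s'B s.
        theta = (1.0 - damping) * sBs / (sBs - sy)
        y = theta * y + (1.0 - theta) * Bs
        sy = s @ y
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / sy

# A step with negative curvature: plain BFGS would destroy definiteness here.
B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([-0.5, 0.0])
B_new = damped_bfgs_update(B, s, y)
assert np.all(np.linalg.eigvalsh(B_new) > 0)  # update stays positive definite
```

The paper's contribution can be read against this baseline: instead of perturbing y, the piecewise line-search realizes the curvature condition in the tangent space by the search itself, and the augmentation parameter handles the complementary subspace.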




Cite this article

Armand, P., Gilbert, J.C. A Piecewise Line-Search Technique for Maintaining the Positive Definiteness of the Matrices in the SQP Method. Computational Optimization and Applications 16, 121–158 (2000). https://doi.org/10.1023/A:1008701324575
