New Theoretical Results on Recursive Quadratic Programming Algorithms

Abstract

Recursive quadratic programming is a family of techniques developed by Bartholomew-Biggs and other authors for solving nonlinear programming problems. The first-order optimality conditions for a local minimizer of the augmented Lagrangian are transformed into a nonlinear system in which both primal and dual variables appear explicitly. The inner iteration of the algorithm is a Newton-like procedure that simultaneously updates the primal variables and the Lagrange multipliers. As observed by Gould, this makes the implementation of the Newton method stable even when the penalization parameter is large. In this paper, the inner iteration is analyzed from a different point of view: the size of the convergence region and the speed of convergence of the inner process are considered, and it is shown that, in some sense, both are independent of the penalization parameter when an adequate version of the Newton method is used. In other words, classical Newton-like iterations are improved not only with respect to the stability of the linear algebra involved, but also with regard to the overall convergence of the nonlinear process. Some numerical experiments suggest that the practical efficiency of the methods is indeed related to these theoretical results.
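For readers unfamiliar with the primal-dual inner iteration referred to above, the sketch below illustrates the general idea for the equality-constrained model problem: minimize f(x) subject to h(x) = 0. It is not the authors' algorithm, only a minimal illustration assuming one standard augmented-Lagrangian formulation in which the shifted multiplier mu = lam + rho*h(x) is treated as an explicit dual variable; all function names, parameters, and the test problem are invented for the example, and the precise system analyzed in the paper may differ.

```python
# Minimal sketch of a Newton-like inner iteration on the primal-dual system
#     grad_f(x) + J_h(x)^T mu = 0
#     h(x) - (mu - lam)/rho   = 0,
# obtained from the augmented Lagrangian with multiplier estimate lam and
# penalty parameter rho.  Illustrative only; not the paper's algorithm.

import numpy as np

def inner_newton(x, mu, lam, rho, grad_f, hess_L, h, jac_h,
                 tol=1e-10, max_iter=50):
    """Newton iteration on the primal-dual system; returns (x, mu)."""
    n = x.size
    for _ in range(max_iter):
        g = grad_f(x) + jac_h(x).T @ mu      # dual feasibility residual
        c = h(x) - (mu - lam) / rho          # perturbed primal residual
        if max(np.linalg.norm(g), np.linalg.norm(c)) < tol:
            break
        A = jac_h(x)
        m = A.shape[0]
        # KKT-like matrix: the -I/rho block (rather than rho itself) keeps
        # the linear algebra well conditioned even for large rho.
        K = np.block([[hess_L(x, mu), A.T],
                      [A, -np.eye(m) / rho]])
        step = np.linalg.solve(K, -np.concatenate([g, c]))
        x = x + step[:n]
        mu = mu + step[n:]
    return x, mu

# Toy example: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0,
# whose solution is x = (0.5, 0.5) with multiplier -1.
if __name__ == "__main__":
    grad_f = lambda x: 2.0 * x
    hess_L = lambda x, mu: 2.0 * np.eye(2)   # constraint is linear
    h = lambda x: np.array([x[0] + x[1] - 1.0])
    jac_h = lambda x: np.array([[1.0, 1.0]])
    x, mu = inner_newton(np.zeros(2), np.zeros(1), np.zeros(1), rho=1e6,
                         grad_f=grad_f, hess_L=hess_L, h=h, jac_h=jac_h)
    print(x, mu)   # approximately [0.5 0.5] and [-1.0]
```

Note how the penalization parameter enters only through the small -I/rho block of the matrix, which is the stability property attributed to Gould in the abstract; the paper's contribution concerns the further claim that the convergence region and speed of such an inner process are, in a suitable sense, also independent of rho.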


References

  1. Bartholomew-Biggs, M. C., Recursive Quadratic Programming Methods Based on the Augmented Lagrangian, Mathematical Programming Study, Vol. 31, pp. 21–41, 1987.

  2. Bartholomew-Biggs, M. C., and Hernández, M. F. G., Some Improvements of the Subroutine OPALQP for Dealing with Large Problems, Journal of Economic Dynamics and Control, Vol. 18, pp. 185–203, 1994.

  3. Bartholomew-Biggs, M. C., and Hernández, M. F. G., Using the KKT Matrix in an Augmented Lagrangian SQP Method for Sparse Constrained Optimization, Journal of Optimization Theory and Applications, Vol. 85, pp. 201–220, 1995.

  4. Gill, P. E., Murray, W., and Wright, M. H., Practical Optimization, Academic Press, London, England, 1981.

  5. Gould, N. I. M., On the Accurate Determination of Search Directions for Simple Differentiable Penalty Functions, IMA Journal of Numerical Analysis, Vol. 6, pp. 357–372, 1986.

  6. McCormick, G. P., Nonlinear Programming, John Wiley and Sons, New York, New York, 1983.

  7. Fletcher, R., Practical Methods of Optimization, John Wiley and Sons, New York, New York, 1987.

  8. Dennis, J. E., and Schnabel, R. B., Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Prentice-Hall, Englewood Cliffs, New Jersey, 1983.

  9. Ortega, J. M., and Rheinboldt, W. C., Iterative Solution of Nonlinear Equations in Several Variables, Academic Press, New York, New York, 1970.

  10. Conn, A. R., Gould, N. I. M., and Toint, Ph. L., A Note on Using Alternative Second-Order Models for the Subproblems Arising in Barrier Function Methods for Minimization, Numerische Mathematik, Vol. 68, pp. 17–33, 1994.

  11. Dussault, J. P., Numerical Stability and Efficiency of Penalty Algorithms, SIAM Journal on Numerical Analysis, Vol. 32, pp. 296–317, 1995.

  12. Gould, N. I. M., On the Convergence of the Sequential Penalty Function Method for Constrained Minimization, SIAM Journal on Numerical Analysis, Vol. 26, pp. 107–128, 1989.

  13. McCormick, G. P., The Superlinear Convergence of a Nonlinear Primal-Dual Algorithm, Technical Report OR-T-550/91, Department of Operations Research, George Washington University, 1991.

  14. Gonzaga, C. C., Path Following Methods for Linear Programming, SIAM Review, Vol. 34, pp. 167–224, 1992.

  15. Forster, W., Homotopy Methods, Handbook of Global Optimization, Edited by R. Horst and P. M. Pardalos, Kluwer Academic Publishers, Dordrecht, Holland, 1995.

  16. Rheinboldt, W. C., Numerical Analysis of Parametrized Nonlinear Equations, John Wiley and Sons, New York, New York, 1986.

  17. Watson, L. T., Billups, S. C., and Morgan, A. P., Algorithm 652—HOMPACK: A Suite of Codes for Globally Convergent Homotopy Algorithms, ACM Transactions on Mathematical Software, Vol. 13, pp. 281–310, 1987.

  18. Lootsma, F. A., A Survey of Methods for Solving Constrained Minimization Problems via Unconstrained Minimization, Numerical Methods for Nonlinear Optimization, Edited by F. A. Lootsma, Academic Press, New York, New York, pp. 313–347, 1972.

  19. Griewank, A., Direct Calculation of Newton Steps without Accumulating Jacobians, Large-Scale Numerical Optimization, Edited by T. F. Coleman and Y. Li, SIAM, Philadelphia, Pennsylvania, pp. 115–137, 1990.

  20. Dennis, J. E., and Walker, H. F., Convergence Theorems for Least-Change Secant Update Methods, SIAM Journal on Numerical Analysis, Vol. 18, pp. 949–987, 1981.

  21. Martínez, J. M., Local Convergence Theory of Inexact Newton Methods Based on Structured Least Change Updates, Mathematics of Computation, Vol. 55, pp. 143–167, 1990.

  22. Martínez, J. M., Fixed-Point Quasi-Newton Methods, SIAM Journal on Numerical Analysis, Vol. 29, pp. 1413–1434, 1992.

  23. Tapia, R. A., Diagonalized Multiplier Methods and Quasi-Newton Methods for Constrained Optimization, Journal of Optimization Theory and Applications, Vol. 22, pp. 135–194, 1977.

  24. Wright, M. H., Why a Pure Primal Newton Barrier Step May Be Infeasible, SIAM Journal on Optimization, Vol. 5, pp. 1–13, 1995.


Cite this article

Martínez, J.M., Santos, L.T. New Theoretical Results on Recursive Quadratic Programming Algorithms. Journal of Optimization Theory and Applications 97, 435–454 (1998). https://doi.org/10.1023/A:1022686919295
