
# The Newton modified barrier method for QP problems


## Abstract

The Modified Barrier Functions (MBF) have elements of both Classical Lagrangians (CL) and Classical Barrier Functions (CBF). MBF methods find an unconstrained minimizer of a smooth barrier function in the primal space and then update the Lagrange multipliers, while the barrier parameter either remains fixed or is updated at each step. The numerical realization of the MBF method leads to the Newton MBF method, where the primal minimizer is found by Newton's method. This minimizer is then used to update the Lagrange multipliers. In this paper, we examine the Newton MBF method for the Quadratic Programming (QP) problem. It will be shown that under standard second-order optimality conditions, there is a ball around the primal solution and a cut cone in the dual space such that for any set of Lagrange multipliers in this cut cone, the method converges quadratically to the primal minimizer from any point in this ball, and continues to do so after each Lagrange multiplier update. The Lagrange multipliers remain within the cut cone and converge linearly to their optimal values. Any point in this ball will be called a "hot start". Starting at such a "hot start", at most $\mathcal{O}(\ln \ln \varepsilon^{-1})$ Newton steps are sufficient to perform the primal minimization necessary for the Lagrange multiplier update. Here, $\varepsilon > 0$ is the desired accuracy. Because of the linear convergence of the Lagrange multipliers, this means that only $\mathcal{O}(\ln \varepsilon^{-1})\,\mathcal{O}(\ln \ln \varepsilon^{-1})$ Newton steps are required to reach an $\varepsilon$-approximation to the solution from any "hot start". In order to reach the "hot start", one has to perform $\mathcal{O}(\sqrt{m}\,\ln C)$ Newton steps, where $m$ characterizes the size of the problem and $C > 0$ is the condition number of the QP problem. This condition number will be characterized explicitly in terms of key parameters of the QP problem, which in turn depend on the input data and the size of the problem.
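As a rough illustration of the two-level scheme the abstract describes (an inner Newton minimization of the modified barrier function in primal space, followed by a Lagrange multiplier update at a fixed barrier parameter), here is a minimal NumPy sketch for a QP with inequality constraints $Ax - b \ge 0$. The function name, the fixed barrier parameter `k`, the feasibility backtracking, and the iteration counts are illustrative assumptions, not the paper's algorithm or its complexity-optimal parameter choices.

```python
import numpy as np

def newton_mbf_qp(Q, c, A, b, x0, lam0, k=5.0, outer=15, inner=20, tol=1e-10):
    """Modified-barrier sketch for  min 0.5 x'Qx + c'x  s.t.  Ax - b >= 0.

    The modified barrier is  F(x, lam) = f(x) - (1/k) sum_i lam_i ln(1 + k r_i(x)),
    with residuals r(x) = Ax - b; the barrier parameter k stays fixed
    (illustrative choice) while the multipliers are updated after each
    inner Newton minimization.
    """
    x, lam = x0.astype(float), lam0.astype(float)
    for _ in range(outer):
        for _ in range(inner):
            r = A @ x - b                          # constraint residuals
            d = 1.0 + k * r                        # shifted terms, must stay > 0
            g = Q @ x + c - A.T @ (lam / d)        # gradient of the MBF
            H = Q + k * (A.T * (lam / d**2)) @ A   # Hessian of the MBF
            step = np.linalg.solve(H, -g)
            t = 1.0                                # backtrack to keep 1 + k r > 0
            while np.any(1.0 + k * (A @ (x + t * step) - b) <= 0):
                t *= 0.5
            x = x + t * step
            if np.linalg.norm(g) < tol:            # inner minimization done
                break
        lam = lam / (1.0 + k * (A @ x - b))        # multiplier update
    return x, lam
```

For example, for $\min \tfrac12\|x\|^2$ subject to $x \ge (1,1)$, whose solution is $x^* = (1,1)$ with multipliers $\lambda^* = (1,1)$, the iterates converge to both even from non-optimal starting multipliers, matching the abstract's picture of linear multiplier convergence around fast inner Newton solves.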



## Additional information

Partially supported by NASA Grant NAG3-1397 and National Science Foundation Grant DMS-9403218.


## About this article

### Cite this article

Melman, A., Polyak, R. The Newton modified barrier method for QP problems. Ann Oper Res 62, 465–519 (1996). https://doi.org/10.1007/BF02206827


### Key words

• Modified Barrier Function
• Lagrange multipliers
• relaxation operator
• condition number
• “hot start”
• complexity