
Mathematical Programming, Volume 20, Issue 1, pp 1–13

Variable metric methods for minimizing a class of nondifferentiable functions

  • S. P. Han
Article

Abstract

We develop a class of methods for minimizing a nondifferentiable function that is the maximum of a finite number of smooth functions. The methods proceed by iteratively solving quadratic programming subproblems to generate search directions. For efficiency, we suggest updating the matrices in these subproblems in a variable metric way; the methods then possess many attractive features of variable metric methods and can be viewed as their natural extension to the nondifferentiable case. To avoid the difficulties of an exact line search, a practical stepsize procedure is also introduced. Under mild assumptions the resulting methods converge globally.
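
The following is a minimal sketch, in Python/NumPy, of the kind of method the abstract describes; it is not a reproduction of the paper's algorithm. The search direction is obtained from a quadratic programming subproblem, minimize over (d, v) the quantity v + (1/2) dᵀBd subject to f_i(x) + ∇f_i(x)ᵀd ≤ v for all i, the matrix B is kept positive definite by a standard BFGS-style update, and an Armijo backtracking rule stands in for the stepsize procedure introduced in the paper. The names (minimax_vm, fs, grads) and the use of SciPy's general-purpose SLSQP routine to solve the subproblem are illustrative assumptions.

# A minimal sketch (not the paper's exact algorithm) of a variable metric
# minimax method for F(x) = max_i f_i(x), assuming smooth f_i with known
# gradients. The direction solves the QP
#     min_{d,v}  v + 0.5 d'Bd   s.t.   f_i(x) + grad f_i(x)'d <= v,
# and B is updated by a standard BFGS formula.
import numpy as np
from scipy.optimize import minimize


def minimax_vm(fs, grads, x0, iters=50, tol=1e-8):
    """fs, grads: lists of callables f_i(x) and their gradients (illustrative names)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)                                  # variable metric approximation

    F = lambda z: max(f(z) for f in fs)            # the nonsmooth objective

    for _ in range(iters):
        fx = np.array([f(x) for f in fs])
        gx = np.array([g(x) for g in grads])       # rows: grad f_i(x)

        # QP subproblem in the variables z = (d, v), solved here with SLSQP
        obj = lambda z: z[n] + 0.5 * z[:n] @ B @ z[:n]
        cons = [{'type': 'ineq',
                 'fun': (lambda z, i=i: z[n] - fx[i] - gx[i] @ z[:n])}
                for i in range(len(fs))]
        z0 = np.zeros(n + 1)
        z0[n] = fx.max()
        sol = minimize(obj, z0, constraints=cons, method='SLSQP')
        d, v = sol.x[:n], sol.x[n]

        pred = v - fx.max()                        # predicted decrease (<= 0)
        if abs(pred) < tol:
            break

        # Armijo-type backtracking on F along d (a stand-in for the paper's rule)
        t = 1.0
        while F(x + t * d) > F(x) + 0.1 * t * pred and t > 1e-12:
            t *= 0.5

        # BFGS-style update of B, using gradients of a currently active function
        i_act = int(np.argmax([f(x + t * d) for f in fs]))
        s = t * d
        y = grads[i_act](x + t * d) - gx[i_act]
        if s @ y > 1e-12:                          # keep B positive definite
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)

        x = x + t * d

    return x

Replacing the SLSQP call with a dedicated quadratic programming solver and the Armijo rule with the paper's own stepsize procedure would recover the structure outlined in the abstract more faithfully.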

Key words

Minimax · Nondifferentiable Optimization · Quasi-Newton Methods · Variable Metric Methods



Copyright information

© North-Holland Publishing Company 1981

Authors and Affiliations

  • S. P. Han
    1. Department of Mathematics, University of Illinois, Urbana, USA
