# On Methods for Solving Optimization Problems without Using Derivatives

## Abstract

‘Smooth’ methods have been developed and used because, under the assumption of smoothness, the tools of differential calculus become available. For example, there are a great number of methods for solving convex optimization problems in which both the objective to be minimized and the set of feasible points can be expressed with the aid of differentiable convex functions. In some cases, however, difficulties connected with the calculation of gradients have led to the development of algorithms which do not use derivatives. (Nevertheless, differentiability is often still needed to prove optimality conditions, convergence assertions, etc.) The most successful optimization method, the well-known simplex method of linear programming, does not use derivatives. On the other hand, there are methods which make partial use of gradients, linearization, etc., but whose convergence proofs do not depend on differentiability assumptions.
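To make the idea of a derivative-free method concrete, here is a minimal sketch of compass (coordinate) search, one of the simplest such algorithms. It is not the method discussed in the paper, only an illustration: the objective is probed along each coordinate axis with a step `h`, improving points are accepted, and `h` is halved when no axis direction improves; no gradient is ever evaluated. The function names and tolerances are illustrative assumptions.

```python
# Compass (coordinate) search: a derivative-free optimization sketch.
# Probes f along +/- each coordinate axis; accepts any improvement,
# halves the step size h when no probe improves. No gradients used.

def compass_search(f, x, h=1.0, tol=1e-8, max_iter=10000):
    x = list(x)
    fx = f(x)
    for _ in range(max_iter):
        if h < tol:          # step size small enough: stop
            break
        improved = False
        for i in range(len(x)):
            for step in (h, -h):
                y = x[:]
                y[i] += step     # probe along coordinate axis i
                fy = f(y)
                if fy < fx:      # accept the improving point
                    x, fx = y, fy
                    improved = True
                    break
        if not improved:
            h *= 0.5             # shrink the pattern and retry
    return x, fx

# Minimize a smooth convex quadratic without ever using its gradient.
f = lambda v: (v[0] - 1.0) ** 2 + 2.0 * (v[1] + 0.5) ** 2
xmin, fmin = compass_search(f, [0.0, 0.0])  # near (1.0, -0.5)
```

For smooth functions such a method converges, but, as the abstract notes, differentiability typically reappears in the convergence analysis even though the algorithm itself evaluates only function values.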

## Keywords

Feasible Point · Differential Calculus · Solve Optimization Problem · Polyhedral Cone · Independent Point
