# On Methods for Solving Optimization Problems without Using Derivatives

• K. Lommatzsch
• Nguyen Van Thoai
Conference paper
Part of the Lecture Notes in Economics and Mathematical Systems book series (LNE, volume 255)

## Abstract

‘Smooth’ methods have been developed and used because, under the assumption of smoothness, it is possible to apply the tools of differential calculus. For example, there are a great number of methods for solving convex optimization problems in which both the minimized objective and the set of feasible points can be expressed with the aid of differentiable convex functions. In some cases, however, the difficulties connected with the calculation of gradients have led to the development of algorithms which do not use derivatives. (Nevertheless, differentiability is still needed to prove optimality, convergence assertions, etc.) The most successful optimization method, the well-known simplex method of linear programming, does not use derivatives. On the other hand, there are methods which make partial use of gradients, linearization, etc., but which do not depend on differentiability assumptions to prove their convergence.
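To illustrate the class of methods the abstract describes, here is a minimal sketch of one simple derivative-free scheme, a compass (coordinate) search: at each step the objective is probed along the positive and negative coordinate directions, and the step size is halved when no probe improves. Only function values are used, never gradients. This is an illustrative textbook technique, not the method developed in the paper; the function names and parameters are our own.

```python
def compass_search(f, x0, step=1.0, tol=1e-8, max_iter=10000):
    """Minimize f using only function evaluations (no derivatives).

    Probes f along +/- each coordinate direction; halves the step
    when no probe improves, stopping once the step falls below tol.
    """
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = x[:]
                y[i] += s
                fy = f(y)
                if fy < fx:          # accept strictly improving probes
                    x, fx = y, fy
                    improved = True
        if not improved:
            step *= 0.5              # refine the search scale
            if step < tol:
                break
    return x, fx

# Example: minimize a smooth convex quadratic without derivatives.
if __name__ == "__main__":
    f = lambda v: (v[0] - 1.0) ** 2 + 2.0 * (v[1] + 0.5) ** 2
    x, fx = compass_search(f, [5.0, 5.0])
    print(x, fx)  # x approaches the minimizer (1, -0.5)
```

Note that convergence guarantees for such pattern searches typically do rely on smoothness of the objective, echoing the abstract's remark that differentiability may still be needed in the analysis even when the algorithm itself uses no derivatives.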

## Keywords

Feasible Point · Differential Calculus · Solve Optimization Problem · Polyhedral Cone · Independent Point
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.
