On Methods for Solving Optimization Problems without Using Derivatives

  • Conference paper

Part of the Lecture Notes in Economics and Mathematical Systems book series (LNE, volume 255)

Abstract

‘Smooth’ methods have been developed and used because, under the assumption of smoothness, the tools of differential calculus become available. For example, there are a great number of methods for solving convex optimization problems in which both the minimized objective and the set of feasible points can be expressed with the aid of differentiable convex functions. In some cases, however, the difficulties connected with the calculation of gradients have led to the development of algorithms which do not use derivatives. (Nevertheless, differentiability is still needed to prove optimality conditions, convergence assertions, etc.) The most successful optimization method, the well-known simplex method of linear programming, does not use derivatives. On the other hand, there are methods which make partial use of gradients, linearization, etc., but which do not rely on differentiability assumptions to prove their convergence.
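
As a purely illustrative sketch (not the algorithm developed in this paper), the following Python snippet shows one simple derivative-free scheme, a compass (coordinate-poll) search: it decreases a function using only function evaluations, polling the coordinate directions and shrinking the step when no direction improves. The name compass_search and all parameter choices are hypothetical and chosen only for this example.

# Minimal sketch of a derivative-free "compass" (coordinate poll) search.
# Assumed illustration only; not the method discussed in the paper.

def compass_search(f, x0, step=1.0, shrink=0.5, tol=1e-8, max_iter=10000):
    """Minimize f: R^n -> R using function values only (no gradients)."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                trial = x[:]                 # copy the current point
                trial[i] += sign * step      # poll one coordinate direction
                ft = f(trial)
                if ft < fx:                  # accept any improving trial point
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:                     # no poll direction improved:
            step *= shrink                   # shrink the step size
            if step < tol:
                break
    return x, fx

# Usage on a smooth convex quadratic; no gradient is evaluated anywhere.
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
x_best, f_best = compass_search(f, [0.0, 0.0])
print(x_best, f_best)   # converges near [1.0, -2.0] with value close to 0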

Keywords

  • Feasible Point
  • Differential Calculus
  • Solve Optimization Problem
  • Polyhedral Cone
  • Independent Point






Copyright information

© 1985 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lommatzsch, K., Van Thoai, N. (1985). On Methods for Solving Optimization Problems without Using Derivatives. In: Demyanov, V.F., Pallaschke, D. (eds) Nondifferentiable Optimization: Motivations and Applications. Lecture Notes in Economics and Mathematical Systems, vol 255. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-12603-5_21

  • DOI: https://doi.org/10.1007/978-3-662-12603-5_21

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-15979-7

  • Online ISBN: 978-3-662-12603-5