Nonlinear Optimization, pp 91–116

# Unconstrained Optimization


## Abstract

This chapter studies a collection of optimization problems without functional constraints, that is, problems of the form

$$\begin{aligned} \begin{array}{lll} P: & \text {Min} & f\left( x\right) \\ & \text {s.t.} & x\in C, \end{array} \end{aligned}$$

where \(\emptyset \ne C\subset \mathbb {R}^{n}\) represents a given constraint set (typically \(\mathbb {R}^{n}\) or \(\mathbb {R}_{++}^{n}\)) and \( f:C\rightarrow \mathbb {R}\) is the objective function. Unconstrained optimization problems have long been used in astronomy and engineering. On the one hand, data fitting by least squares is a method developed by astronomers such as Laplace and Gauss, in the second half of the eighteenth century, using unconstrained quadratic optimization to obtain an accurate description of the behavior of celestial bodies and thereby facilitate navigation of the Earth’s oceans. On the other hand, at the beginning of the twentieth century, Johan Jensen, an engineer and amateur mathematician who headed the technical department of the Danish branch of the International Bell Telephone Company, proved an inequality covering several classical ones as special cases, which makes it possible to rigorously solve isoperimetric and design problems even when the constraint set is open. Jensen’s paper “Sur les fonctions convexes et les inégalités entre les valeurs moyennes” [Acta Mathematica 30 (1906) 175–193] is generally considered the inception of convex analysis.
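As a concrete instance of the problem form above, least-squares data fitting minimizes \(f(c)=\Vert Ac-y\Vert^{2}\) over all of \(\mathbb{R}^{n}\), i.e., with \(C=\mathbb{R}^{n}\). The following sketch, using NumPy with made-up data points (the values and the affine model are illustrative assumptions, not from the chapter), fits a line by solving this unconstrained quadratic problem:

```python
import numpy as np

# Hypothetical noisy observations of a line close to y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix for the affine model: f(a, b) = sum_i (a*x_i + b - y_i)^2.
A = np.column_stack([x, np.ones_like(x)])

# The unconstrained minimizer solves the normal equations A^T A c = A^T y;
# np.linalg.lstsq does this (robustly, via an SVD) for us.
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
a, b = coef
print(a, b)
```

Because \(f\) is a convex quadratic, the stationarity condition \(\nabla f(c)=0\) (the normal equations) characterizes the global minimizer, with no constraint qualification needed.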

## Copyright information

© Springer Nature Switzerland AG 2019