
Nonlinear Optimization

Lectures on Convex Optimization

Part of the book series: Springer Optimization and Its Applications (SOIA, volume 137)

Abstract

In this chapter, we introduce the main notation and concepts used in Continuous Optimization. The first theoretical results are related to Complexity Analysis of the problems of Global Optimization. For these problems, we start with a very pessimistic lower performance guarantee. It implies that for any method there exists an optimization problem in \(\mathbb {R}^n\) which needs at least \(O\left ({1 \over \epsilon ^n}\right )\) computations of the function values in order to approximate its global solution up to accuracy \(\epsilon\). Therefore, in the next section we pass to local optimization and consider two main methods, the Gradient Method and the Newton Method. For both of them, we establish some local rates of convergence. In the last section, we present some standard methods in General Nonlinear Optimization: the conjugate gradient methods, quasi-Newton methods, the theory of Lagrangian relaxation, barrier methods and penalty function methods. For some of them, we prove global convergence results.
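As a concrete companion to the Gradient Method mentioned above, here is a minimal sketch (not taken from the chapter itself): a constant-step gradient iteration applied to an illustrative strongly convex quadratic. The function name, step size, tolerance, and test problem are assumptions chosen only for this example.

```python
import numpy as np

def gradient_method(grad, x0, step=0.1, max_iter=1000, tol=1e-8):
    """Plain gradient iteration x_{k+1} = x_k - h * grad f(x_k) with a constant step h."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop once the gradient is nearly zero
            break
        x = x - step * g
    return x, k

# Illustrative test problem: f(x) = 1/2 x^T A x - b^T x, so grad f(x) = A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star, iters = gradient_method(lambda x: A @ x - b, x0=np.zeros(2))
print(x_star, iters)   # approaches the solution of A x = b, here (0.2, 0.4)
```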


Notes

  1. Sometimes, problems with a "simple" basic feasible set Q and no functional constraints are also treated as "unconstrained" problems. In this case, we need to know how to solve some auxiliary optimization problems over the set Q in closed form.

  2. We keep this calculation unchanged from the first version of this book [39]. In this example, the processor performance corresponds to a Sun Station, which was the most powerful personal computer at the beginning of the 1990s. Now, after twenty-five years of intensive progress in hardware capabilities, modern personal computers have reached a speed level of \(10^8\) a.o. per second. Thus, indeed, our time estimate remains valid for n = 11 (see the sketch after these notes).

  3. In fact, in our example they are global solutions.

  4. In fact, this is not entirely true. We will see that, in order to apply the unconstrained minimization methods to solve constrained problems, we need to be able to find a global minimum of some auxiliary problem, and we have already seen (Example 1.2.2) that this can be difficult.

  5. We are not going to discuss the correctness of this statement for general nonlinear problems. We simply warn the reader against extending it to other problem classes. In the following chapters, we will see that this statement is valid only up to a certain point.

  6. If we assume that it is a strict local minimum, then the results are much weaker.
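To make the time estimate in Note 2 concrete, the sketch below reproduces the kind of back-of-envelope calculation involved. The grid-type lower bound from the abstract requires on the order of \((1/\epsilon)^n\) computations of function values; the accuracy \(\epsilon = 0.01\) and the per-evaluation cost of roughly n arithmetic operations are assumptions made only for illustration, while the processor speed of \(10^8\) a.o. per second is the figure quoted in Note 2.

```python
def grid_search_time_years(n, eps=0.01, aops_per_eval=None, speed=1e8):
    """Rough running-time estimate for exhaustive grid search in R^n.

    The lower bound needs about (1/eps)**n function evaluations; each
    evaluation is charged roughly n arithmetic operations (an assumption),
    and the processor performs `speed` arithmetic operations per second.
    """
    if aops_per_eval is None:
        aops_per_eval = n
    evaluations = (1.0 / eps) ** n
    seconds = evaluations * aops_per_eval / speed
    return seconds / (3600 * 24 * 365)

# With eps = 1% and a 10^8 a.o./s machine, dimension n = 11 already takes
# on the order of tens of millions of years.
print(f"n = 11: about {grid_search_time_years(11):.2e} years")
```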

References

  1. Yu. Nesterov, Introductory Lectures on Convex Optimization. A Basic Course (Kluwer, Boston, 2004)



Copyright information

© 2018 Springer Nature Switzerland AG


Cite this chapter

Nesterov, Y. (2018). Nonlinear Optimization. In: Lectures on Convex Optimization. Springer Optimization and Its Applications, vol 137. Springer, Cham. https://doi.org/10.1007/978-3-319-91578-4_1
