
Nonsmooth Convex Optimization

  • Yurii Nesterov
Chapter
Part of the Springer Optimization and Its Applications book series (SOIA, volume 137)

Abstract

In this chapter, we consider the most general convex optimization problems, which are formed by non-differentiable convex functions. We start by studying the main properties of these functions and the definition of subgradients, which are the main directions used in the corresponding optimization schemes. We also prove the necessary facts from Convex Analysis, including several variants of the Minimax Theorem. After that, we establish the lower complexity bounds and prove the convergence rate of the Subgradient Method for constrained and unconstrained optimization problems. This method turns out to be optimal uniformly in the dimension of the space of variables. Next, we consider other optimization methods, which can work in spaces of moderate dimension (the Method of Centers of Gravity, the Ellipsoid Algorithm). The chapter concludes with a presentation of methods based on a complete piecewise linear model of the objective function (Kelley's method, the Level Method).
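The Subgradient Method highlighted in the abstract admits a very compact implementation. The following Python sketch is not from the chapter; it is a minimal illustration of the projected subgradient scheme with the classical step sizes h_k = R/(L*sqrt(k+1)), applied to a hypothetical l1-regression objective. All concrete names and data (A, b, R, L, the ball constraint) are assumptions made for this example.

    import numpy as np

    def project_ball(x, R):
        # Euclidean projection onto the feasible set {x : ||x||_2 <= R}
        n = np.linalg.norm(x)
        return x if n <= R else (R / n) * x

    def subgradient_method(f, subgrad, x0, R, L, iters=2000):
        # Projected subgradient method with divergent-series steps
        # h_k = R / (L * sqrt(k+1)); returns the best point seen so far,
        # since the iterates of a subgradient scheme need not be monotone.
        x = x0.copy()
        best_x, best_f = x.copy(), f(x)
        for k in range(iters):
            g = subgrad(x)                     # any subgradient of f at x
            h = R / (L * np.sqrt(k + 1))       # step size
            x = project_ball(x - h * g, R)
            fx = f(x)
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        return best_x, best_f

    # Illustrative problem: minimize f(x) = ||A x - b||_1 over ||x||_2 <= R.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    f = lambda x: np.abs(A @ x - b).sum()
    subgrad = lambda x: A.T @ np.sign(A @ x - b)    # a subgradient of the l1 residual
    L = np.linalg.norm(A, axis=1).sum()             # valid bound on subgradient norms
    x_best, f_best = subgradient_method(f, subgrad, np.zeros(5), R=10.0, L=L)
    print(f_best)

For a convex f whose subgradients are bounded by L on a feasible ball of radius R, the best recorded value of such a scheme converges at the rate O(LR/sqrt(N)) after N iterations, independently of the dimension; this dimension-independence is the optimality property mentioned in the abstract.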

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Yurii Nesterov
  1. CORE/INMA, Catholic University of Louvain, Louvain-la-Neuve, Belgium