Alternance Form of Optimality Conditions in the Finite-Dimensional Space

Chapter
Part of the Springer Optimization and Its Applications book series (SOIA, volume 87)

Abstract

In solving optimization problems, necessary and sufficient optimality conditions play an outstanding role. They allow one, first of all, to check whether a point under study satisfies the conditions and, secondly, if it does not, to find a “better” point. This is why such conditions should be “constructive”, enabling one to solve the above-mentioned problems. For the class of directionally differentiable functions in \(\mathbb{R}^{n}\), a necessary condition for an unconstrained minimum requires the directional derivative to be nonnegative in all directions. This condition becomes efficient for special classes of directionally differentiable functions. For example, in the case of convex and max-type functions, the necessary condition for a minimum takes the form \(0_{n} \in C\), where \(C \subset \mathbb{R}^{n}\) is a convex compact set. The problem of verifying this condition reduces to that of finding the point of C nearest to the origin. If the origin does not belong to C, one easily finds the steepest descent direction and is able to construct a numerical method. For the classical Chebyshev approximation problem (the problem of approximating a function \(f(t): G \rightarrow \mathbb{R}\) by a polynomial P(t)), the condition for a minimum takes the so-called alternance form: for a polynomial P(t) to be a solution to the Chebyshev approximation problem, a collection of points \(\{t_{i}\ \vert \ t_{i} \in G\}\) should exist at which the difference P(t) − f(t) attains its maximal absolute value with alternating signs. This condition can easily be verified, and if it does not hold, one can find a “better” polynomial. In the present paper, it is demonstrated that the alternance form of the necessary conditions is valid not only for Chebyshev approximation problems but also in the general case of directionally differentiable functions. Only unconstrained optimization problems are discussed here.
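As a concrete illustration (not part of the original text), the alternance condition can be checked numerically on a grid. The sketch below verifies that the known best linear Chebyshev approximation of \(f(t) = t^{2}\) on [0, 1], namely \(P(t) = t - 1/8\), yields at least three grid points where the error attains its maximal absolute value with alternating signs; the function names and tolerances are illustrative choices, not part of the chapter.

```python
import numpy as np

def alternance_points(err, tol=1e-3):
    """Indices where |err| is within a relative tolerance of its maximum,
    thinned to a sign-alternating subsequence."""
    m = np.max(np.abs(err))
    near_max = [i for i in range(len(err)) if abs(abs(err[i]) - m) <= tol * m]
    pts = []
    for i in near_max:
        if not pts or np.sign(err[i]) != np.sign(err[pts[-1]]):
            pts.append(i)
    return pts

t = np.linspace(0.0, 1.0, 1001)
f = t**2
P = t - 0.125                      # best linear Chebyshev approximation of t^2 on [0, 1]
pts = alternance_points(P - f)
# a polynomial of degree 1 is optimal iff there are at least 3 alternance points
print(len(pts) >= 3)               # prints True
```

Here the error \(P(t) - f(t) = 1/8 - (t - 1/2)^{2}\) attains ±1/8 at t = 0, 1/2, 1 with alternating signs, so the check succeeds.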
In many cases a constrained optimization problem can be reduced (via exact penalization techniques) to an unconstrained one. In the paper, optimality conditions are first formulated in terms of directional derivatives. Next, the notions of upper and lower exhausters are introduced, and optimality conditions are stated by means of upper and lower exhausters. In all these cases the optimality conditions take the form \(0_{n} \in C\), where \(C \subset \mathbb{R}^{n}\) is a convex closed bounded set (or a family of such sets). It is proved that the condition \(0_{n} \in C\) can be formulated in the alternance form. The result obtained is applied to deduce the well-known Chebyshev alternation rule in the problem of Chebyshev approximation of a function by a polynomial. The problem of Chebyshev approximation of several functions by a polynomial is also discussed, and optimality conditions are stated in the alternance form.
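The condition \(0_{n} \in C\) and the associated steepest descent direction can be illustrated in the simplest nontrivial case, where C is the convex hull of two (sub)gradients, as for a max of two smooth functions with both branches active. The sketch below (an illustration added here, not the authors' algorithm) finds the min-norm point of the segment [g1, g2] in closed form; if it is nonzero, its negative normalization is the steepest descent direction. All names and the sample gradients are hypothetical.

```python
import numpy as np

def min_norm_on_segment(g1, g2):
    """Nearest point to the origin on the segment conv{g1, g2}.
    Minimizes ||(1 - lam) * g1 + lam * g2||^2 over lam in [0, 1]."""
    d = g2 - g1
    denom = d @ d
    lam = 0.0 if denom == 0.0 else np.clip(-(g1 @ d) / denom, 0.0, 1.0)
    return (1.0 - lam) * g1 + lam * g2

# e.g. f(x) = max(x1, -x1 + x2) at a point where both branches are active:
g1 = np.array([1.0, 0.0])
g2 = np.array([-1.0, 1.0])
z = min_norm_on_segment(g1, g2)    # min-norm point of C = conv{g1, g2}
if np.linalg.norm(z) > 1e-12:
    descent = -z / np.linalg.norm(z)   # steepest descent direction: 0 not in C
else:
    descent = None                     # 0 in C: the point is stationary
```

For these data the min-norm point is z = (0.2, 0.4) ≠ 0, so the necessary condition fails and a descent direction is available.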

Keywords

Necessary optimality conditions · Alternance form · Directionally differentiable functions

Acknowledgements

The authors are grateful to two anonymous referees whose remarks and suggestions were very helpful. The work was supported by the Russian Foundation for Basic Research (RFBR) under Grant No. 12-01-00752.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Applied Mathematics Department, Saint Petersburg State University, Saint Petersburg, Russia
  2. Mathematics and Mechanics Department, Saint Petersburg State University, Saint Petersburg, Russia