Unconstrained Steepest Descent Method for Multicriteria Optimization on Riemannian Manifolds

Abstract

In this paper, we present a steepest descent method with Armijo’s rule for multicriteria optimization in the Riemannian context. The sequence generated by the method is guaranteed to be well defined. Under mild assumptions on the multicriteria function, we prove that each accumulation point (if any) satisfies first-order necessary conditions for Pareto optimality. Moreover, assuming quasiconvexity of the multicriteria function and nonnegative curvature of the Riemannian manifold, we prove full convergence of the sequence to a critical Pareto point.
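A minimal numerical sketch of the kind of scheme described above, assuming a bi-criteria problem on the unit sphere S^2 with its standard exponential map, Riemannian gradients obtained by tangent-space projection, and a closed-form solution of the two-gradient direction subproblem. The objectives, tolerances, and parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative sketch: Riemannian steepest descent with Armijo's rule for a
# bi-criteria problem F = (F1, F2) on the unit sphere S^2 in R^3.
# All objectives and parameters below are hypothetical.

def exp_map(x, v):
    """Exponential map on the sphere: geodesic from x in direction v."""
    nv = np.linalg.norm(v)
    if nv < 1e-16:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def riem_grad(x, euclid_grad):
    """Project the Euclidean gradient onto the tangent space T_x S^2."""
    return euclid_grad - np.dot(euclid_grad, x) * x

# Two smooth criteria on R^3, restricted to the sphere (illustrative).
a, b = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
F = [lambda x: 0.5 * np.linalg.norm(x - a) ** 2,
     lambda x: 0.5 * np.linalg.norm(x - b) ** 2]
grads = [lambda x: x - a, lambda x: x - b]   # Euclidean gradients

def steepest_direction(g1, g2):
    """Multicriteria steepest descent direction for m = 2 criteria:
    -(lam*g1 + (1-lam)*g2), with lam minimizing the norm of the convex
    combination of the two Riemannian gradients."""
    d = g1 - g2
    denom = np.dot(d, d)
    lam = 0.5 if denom < 1e-16 else np.clip(np.dot(g2, d) / denom, 0.0, 1.0)
    return -(lam * g1 + (1.0 - lam) * g2)

def armijo_step(x, v, gs, sigma=1e-4, beta=0.5):
    """Largest t in {1, beta, beta^2, ...} giving Armijo decrease in all criteria."""
    t = 1.0
    while not all(F[i](exp_map(x, t * v)) <= F[i](x) + sigma * t * np.dot(gs[i], v)
                  for i in range(len(F))):
        t *= beta
        if t < 1e-12:
            break
    return t

x = np.array([0.0, 0.0, 1.0])          # initial point on the sphere
for k in range(200):
    gs = [riem_grad(x, g(x)) for g in grads]
    v = steepest_direction(*gs)
    if np.linalg.norm(v) < 1e-8:       # approximately critical Pareto point
        break
    x = exp_map(x, armijo_step(x, v, gs) * v)

print("approximate critical Pareto point:", x)
```

In this two-criteria sketch the direction subproblem reduces to a one-dimensional quadratic in the combination weight, which is why a closed-form solution suffices; for more criteria the subproblem is a small quadratic program over the simplex.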

Keywords

Steepest descent · Pareto optimality · Vector optimization · Quasi-Fejér convergence · Quasiconvexity · Riemannian manifolds

Acknowledgements

G.C. Bento was supported in part by CNPq Grant 473756/2009-9 and PROCAD/NF. O.P. Ferreira was supported in part by CNPq Grant 302618/2005-8, PRONEX–Optimization (FAPERJ/CNPq) and FUNAPE/UFG. P.R. Oliveira was supported in part by CNPq.

Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  • G. C. Bento (1)
  • O. P. Ferreira (1)
  • P. R. Oliveira (2)
  1. IME, Universidade Federal de Goiás, Goiânia, Brazil
  2. COPPE/Sistemas, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil
