An Incremental Subgradient Method on Riemannian Manifolds

Journal of Optimization Theory and Applications

Abstract

In this paper, we propose and analyze an incremental subgradient method with a diminishing stepsize rule for a convex optimization problem on a Riemannian manifold, where the objective function is the sum of a large number of component functions. Objectives of this form arise in many applications, such as distributed estimation in sensor networks and large-scale machine learning. We establish a key inequality for the sequence generated by the method, provided the sectional curvature of the manifold is nonnegative. Using this inequality, we prove Proposition 3.1 and then obtain several convergence results for the incremental subgradient method.
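
The method described above cycles through the component functions, taking one geodesic subgradient step per component with a stepsize that diminishes across cycles. As a minimal sketch (not the authors' implementation), the Python code below runs such an iteration on the unit sphere, a manifold of constant positive sectional curvature consistent with the paper's nonnegative-curvature assumption. The helper functions exp_sphere and project_tangent, the problem instance, and the stepsize constant are all illustrative assumptions introduced here.

import numpy as np

def exp_sphere(x, v):
    # Exponential map on the unit sphere: move from x along the geodesic
    # with initial velocity v for unit time.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def project_tangent(x, g):
    # Project a Euclidean (sub)gradient g onto the tangent space at x,
    # yielding a Riemannian subgradient of the restriction to the sphere.
    return g - np.dot(g, x) * x

def incremental_subgradient(x0, subgrads, num_cycles=500, c=1.0):
    # One geodesic step per component function per cycle, with the
    # diminishing stepsize alpha_k = c / (k + 1) (an illustrative choice).
    x = x0 / np.linalg.norm(x0)
    for k in range(num_cycles):
        alpha = c / (k + 1)
        for g_i in subgrads:
            v = project_tangent(x, g_i(x))
            x = exp_sphere(x, -alpha * v)
        x /= np.linalg.norm(x)  # guard against floating-point drift
    return x

# Hypothetical instance: minimize f(x) = sum_i |<a_i, x> - b_i| over S^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
subgrads = [(lambda a, bi: lambda x: np.sign(a @ x - bi) * a)(A[i], b[i])
            for i in range(len(b))]
x_star = incremental_subgradient(rng.standard_normal(3), subgrads)

Processing one component per step, rather than the full sum, is what makes the method attractive when the number of components is large; the diminishing stepsize compensates for the fact that each individual step may not be a descent step for the full objective.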



Acknowledgements

This research was partially supported by the National Natural Science Foundation of China under Grant 11771107.

Author information

Correspondence to Peng Zhang.

Additional information

Communicated by Alexandru Kristály.


Cite this article

Zhang, P., Bao, G. An Incremental Subgradient Method on Riemannian Manifolds. J Optim Theory Appl 176, 711–727 (2018). https://doi.org/10.1007/s10957-018-1224-6

