Abstract
In this paper, we propose and analyze an incremental subgradient method with a diminishing stepsize rule for a convex optimization problem on a Riemannian manifold, where the objective function is the sum of a large number of component functions. Objectives of this form arise naturally in many applications. Under the assumption that the sectional curvature of the manifold is nonnegative, we establish a key inequality satisfied by the sequence generated by the method. Using this inequality, we prove Proposition 3.1 and then derive several convergence results for the incremental subgradient method.
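To make the scheme concrete, the following is a minimal sketch (not the authors' implementation) of an incremental subgradient iteration on the unit sphere, a manifold of nonnegative sectional curvature. It assumes component functions f_i(x) = d(x, a_i), the geodesic distances to given anchor points a_i; the helper names exp_map and log_map, the anchor points, and the stepsize constant alpha0 are illustrative choices, not part of the paper.

```python
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: move from x along the tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def log_map(x, y):
    """Inverse exponential map on the unit sphere: tangent vector at x pointing to y.
    Assumes x and y are not antipodal."""
    cos_t = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - cos_t * x                     # component of y orthogonal to x
    return theta * u / np.linalg.norm(u)

def distance_subgradient(x, a):
    """Riemannian subgradient of f_i(x) = d(x, a_i) on the sphere (zero at x = a_i)."""
    v = log_map(x, a)
    d = np.linalg.norm(v)
    return np.zeros_like(x) if d < 1e-12 else -v / d

def incremental_subgradient(anchors, x0, n_cycles=200, alpha0=0.5):
    """Incremental subgradient method with diminishing stepsize alpha_k = alpha0 / (k + 1).
    Each cycle processes the component functions f_1, ..., f_m sequentially."""
    x = x0 / np.linalg.norm(x0)
    for k in range(n_cycles):
        alpha = alpha0 / (k + 1)          # diminishing stepsize rule
        for a in anchors:                 # incremental pass over the components
            g = distance_subgradient(x, a)
            x = exp_map(x, -alpha * g)    # geodesic step opposite the subgradient
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Anchor points clustered near the north pole, so the iterates stay in a
    # region where the distance functions behave convexly.
    anchors = [np.array([0.0, 0.0, 1.0]) + 0.3 * rng.standard_normal(3) for _ in range(5)]
    anchors = [a / np.linalg.norm(a) for a in anchors]
    x_star = incremental_subgradient(anchors, x0=np.array([1.0, 0.0, 0.0]))
    print("approximate geodesic median:", x_star)
```

The inner loop updates the iterate after each component function rather than after a full subgradient of the sum, which is the distinguishing feature of the incremental scheme discussed in the paper.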
Acknowledgements
This research was partially supported by the National Natural Science Foundation of China under Grant 11771107.
Additional information
Communicated by Alexandru Kristály.
Cite this article
Zhang, P., Bao, G. An Incremental Subgradient Method on Riemannian Manifolds. J Optim Theory Appl 176, 711–727 (2018). https://doi.org/10.1007/s10957-018-1224-6