An Adaptive Riemannian Gradient Method Without Function Evaluations

Journal of Optimization Theory and Applications

Abstract

In this paper, we present an adaptive gradient method for the minimization of differentiable functions on Riemannian manifolds. The method is designed for functions with a Lipschitz continuous gradient field, but it does not require knowledge of the Lipschitz constant. In contrast with line search schemes, the stepsizes are adjusted dynamically without any function evaluations. We prove worst-case complexity bounds on the number of gradient evaluations that the proposed method needs to find an approximate stationary point. Preliminary numerical results are also presented, illustrating the potential advantages of different versions of our method in comparison with a Riemannian gradient method with Armijo line search.
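To illustrate the general idea, the following is a minimal sketch, not the algorithm analyzed in the paper, of a Riemannian gradient iteration on the unit sphere whose stepsize is adapted from gradient norms only, in the spirit of WNGrad [21]. The cost function, data, and parameter choices below are assumptions made for the example.

    % Minimal sketch (assumed example): gradient-only adaptive stepsize on the
    % unit sphere, in the spirit of WNGrad [21]; no objective values are used.
    n = 50;
    A = randn(n); A = (A + A')/2;                 % assumed symmetric data matrix
    rgrad  = @(x) 2*(A*x) - 2*(x'*(A*x))*x;       % Riemannian gradient of x'*A*x on the sphere
    expmap = @(x,v) cos(norm(v))*x + sin(norm(v))*v/max(norm(v),eps);  % exponential map
    x = randn(n,1); x = x/norm(x);                % initial point on the sphere
    b = 1;                                        % assumed initial stepsize parameter
    for k = 1:1000
        g = rgrad(x);
        if norm(g) <= 1e-6, break; end            % approximate stationarity test
        b = b + norm(g)^2/b;                      % update uses only the gradient norm
        x = expmap(x, -(1/b)*g);                  % geodesic step with stepsize 1/b
    end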


Data Availability

Data sharing is not applicable to this article, as no datasets were generated or analyzed during the current study.

Notes

  1. We thank an anonymous reviewer for pointing this out.

  2. See, e.g., Lemma 3.2 in [4].

  3. This toolbox is freely available at https://www.manopt.org/. Specifically, we used the codes steepestdescent.m and linesearch.m. In the initialization, we replaced the manifold retraction (M.retr) with the exponential map (M.exp); a sketch of this setup is given after these notes.

  4. See Example 4 in [11].

  5. The performance profiles were generated using the code perf.m, freely available at http://www.mcs.anl.gov/~more/cops/.

  6. See Section 5.2.1 in [11].
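
The following is a minimal sketch, under an assumed Rayleigh-quotient cost and assumed random data, of the Manopt setup described in Note 3; it is not the authors' exact script.

    % Minimal sketch (assumed cost and data) of the Manopt baseline from Note 3:
    % steepest descent with the Armijo backtracking of linesearch.m, using the
    % exponential map in place of the default retraction.
    n = 100;
    A = randn(n); A = (A + A')/2;           % assumed symmetric data matrix
    M = spherefactory(n);                   % unit sphere in R^n
    M.retr = M.exp;                         % substitute the retraction by the exponential map
    problem.M = M;
    problem.cost  = @(x) x'*(A*x);          % assumed smooth cost (Rayleigh quotient)
    problem.egrad = @(x) 2*(A*x);           % Euclidean gradient; Manopt converts it internally
    options.linesearch = @linesearch;       % Armijo-type backtracking (linesearch.m)
    [x, xcost, info] = steepestdescent(problem, [], options);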

References

  1. Absil, P.-A., Baker, C.G., Gallivan, K.A.: Trust-region methods on Riemannian manifolds. Found. Comput. Math. 7, 303–330 (2007)

  2. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

  3. Armijo, L.: Minimization of functions having Lipschitz continuous first partial derivatives. Pac. J. Math. 16, 1–3 (1966)

  4. Bento, G.C., Ferreira, O.P., Melo, J.G.: Iteration-complexity of gradient, subgradient and proximal point methods on Riemannian manifolds. J. Optim. Theory Appl. 173, 548–562 (2017)

  5. Boumal, N.: An Introduction to Optimization on Smooth Manifolds. Cambridge University Press, Cambridge (2023)

  6. Boumal, N., Absil, P.-A., Cartis, C.: Global rates of convergence for nonconvex optimization on manifolds. IMA J. Numer. Anal. 39, 1–33 (2019)

  7. Boumal, N., Mishra, B., Absil, P.-A., Sepulchre, R.: Manopt, a MATLAB toolbox for optimization on manifolds. J. Mach. Learn. Res. 15, 1455–1459 (2014)

  8. Cauchy, A.: Méthode générale pour la résolution des systèmes d'équations simultanées. C. R. Acad. Sci. Paris 25, 536–538 (1847)

  9. Cruz Neto, J.X., Lima, L.L., Oliveira, P.R.: Geodesic algorithms in Riemannian geometry. Balkan J. Geom. Its Appl. 3, 89–100 (1998)

  10. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  11. Ferreira, O.P., Louzeiro, M.S., Prudente, L.F.: Gradient method for optimization on Riemannian manifolds with lower bounded curvature. SIAM J. Optim. 29, 2517–2541 (2019)

  12. Grapiglia, G.N., Stella, G.F.D.: An adaptive trust-region method without function evaluations. Comput. Optim. Appl. 82, 31–60 (2022)

  13. Gratton, S., Jerad, S., Toint, Ph.L.: First-order objective-free optimization algorithms and their complexity. arXiv:2203.01757v1 (2022)

  14. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23, 707–716 (1986)

  15. Lojasiewicz, S.: Une propriété topologique des sous-ensembles analytiques réels. Les équations aux dérivées partielles 117, 87–89 (1963)

  16. Polyak, B.T.: Gradient methods for minimizing functionals. Zhurnal Vychislitel'noi Matematiki i Matematicheskoi Fiziki 3, 643–653 (1963)

  17. Sachs, E.W., Sachs, S.M.: Nonmonotone line searches for optimization algorithms. Control Cybern. 40, 1059–1075 (2011)

  18. Sato, H.: Riemannian Optimization and Its Applications. Springer, Berlin (2021)

  19. Udriste, C.: Convex Functions and Optimization Methods on Riemannian Manifolds, vol. 297. Springer, Berlin (1994)

  20. Ward, R., Wu, X., Bottou, L.: AdaGrad stepsizes: sharp convergence over nonconvex landscapes. J. Mach. Learn. Res. 21, 1–30 (2020)

  21. Wu, X., Ward, R., Bottou, L.: WNGrad: learn the learning rate in gradient descent. arXiv:1803.02865 (2020)

  22. Zhang, H., Sra, S.: First-order methods for geodesically convex optimization. In: Proceedings of the 29th Annual Conference on Learning Theory (2016)

  23. Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)

Acknowledgements

The authors are very grateful to the two anonymous referees, whose comments helped to improve the manuscript.

Author information

Corresponding author

Correspondence to Geovani N. Grapiglia.

Additional information

Communicated by Alexandru Kristály.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

G. N. Grapiglia was partially supported by the National Council for Scientific and Technological Development (CNPq) - Brazil (Grant 312777/2020-5). G.F.D. Stella was supported by the Coordination for the Improvement of Higher Education Personnel (CAPES) - Brazil.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Grapiglia, G.N., Stella, G.F.D. An Adaptive Riemannian Gradient Method Without Function Evaluations. J Optim Theory Appl 197, 1140–1160 (2023). https://doi.org/10.1007/s10957-023-02227-y
