Practical gradient and conjugate gradient methods on flag manifolds

Abstract

Flag manifolds, sets of nested sequences of linear subspaces with fixed dimensions, are attracting increasing attention in numerical analysis and statistics. Existing optimization algorithms on flag manifolds are based on the exponential map and parallel transport, which are expensive to compute. In this paper we propose practical optimization methods on flag manifolds that require neither the exponential map nor parallel transport. Observing that flag manifolds have a homogeneous structure similar to that of Grassmann and Stiefel manifolds, we generalize several typical retractions and vector transports to flag manifolds, including the Cayley-type retraction and vector transport, the QR-based and polar-based retractions, the projection-type vector transport, and the projection of the differentiated polar-based retraction as a vector transport. Theoretical properties and efficient implementations of the proposed retractions and vector transports are discussed. We then establish Riemannian gradient and Riemannian conjugate gradient algorithms based on these retractions and vector transports. Numerical results on the problem of nonlinear eigenflags demonstrate that our algorithms are substantially more efficient than existing ones.
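
The following minimal sketch, in Python with NumPy, illustrates the kind of retraction-based update the abstract refers to. It implements two standard candidates, a QR-based and a polar-based retraction, on an orthonormal representative Y of a point on the manifold. The function names, the test at the end, and the assumption that the tangent direction xi satisfies Y^T xi skew-symmetric are illustrative choices and do not reproduce the paper's flag-specific constructions.

import numpy as np

def qr_retraction(Y, xi):
    # QR-based retraction: the Q factor of Y + xi, with the signs of the
    # diagonal of R fixed so the factorization (and hence the map) is unique.
    Q, R = np.linalg.qr(Y + xi)
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0
    return Q * d

def polar_retraction(Y, xi):
    # Polar-based retraction: (Y + xi)(I + xi^T xi)^{-1/2}, which is
    # orthonormal whenever Y^T xi is skew-symmetric.
    k = Y.shape[1]
    w, V = np.linalg.eigh(np.eye(k) + xi.T @ xi)   # eigendecomposition of the SPD factor
    return (Y + xi) @ (V * (w ** -0.5)) @ V.T      # multiply by its inverse square root

# Quick check: both maps return orthonormal matrices for a direction with Y^T xi skew.
rng = np.random.default_rng(0)
Y, _ = np.linalg.qr(rng.standard_normal((8, 3)))
A = rng.standard_normal((3, 3))
xi = 0.1 * Y @ (A - A.T)
for retract in (qr_retraction, polar_retraction):
    Z = retract(Y, xi)
    assert np.allclose(Z.T @ Z, np.eye(3))

A Riemannian gradient step would then move from Y to, for example, qr_retraction(Y, -t * grad) for a step size t. On the flag manifold itself, tangent directions are further restricted to a horizontal space that respects the nested-subspace block structure, which is where the paper's constructions differ from this plain Stiefel sketch.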

Data availability

The data supporting the results and analysis of this paper are available from the corresponding author upon request.

References

  1. Absil, P.-A., Amodei, L., Meyer, G.: Two Newton methods on the manifold of fixed-rank matrices endowed with Riemannian quotient geometries. Comput. Stat. 29, 569–590 (2014)

  2. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton, NJ (2008)

  3. Absil, P.-A., Oseledets, I.V.: Low-rank retractions: a survey and new results. Comput. Optim. Appl. 62, 5–29 (2015)

  4. Agarwal, N., Boumal, N., Bullins, B., Cartis, C.: Adaptive regularization with cubics on manifolds. Math. Program. 188, 85–134 (2021)

  5. Ammar, G., Martin, C.: The geometry of matrix eigenvalue methods. Acta Appl. Math. 5, 239–278 (1986)

  6. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  7. Boumal, N.: An Introduction to Optimization on Smooth Manifolds. Cambridge University Press, Cambridge (2023)

  8. Chen, S., Ma, S., So, A.M.-C., Zhang, T.: Proximal gradient method for nonsmooth optimization over the Stiefel manifold. SIAM J. Optim. 30, 210–239 (2020)

  9. Criscitiello, C., Boumal, N.: An accelerated first-order method for non-convex optimization on manifolds. Found. Comput. Math. 23, 1433–1509 (2023)

  10. Edelman, A., Arias, T.A., Smith, S.T.: The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl. 20, 303–353 (1998)

  11. Fiori, S., Kaneko, T., Tanaka, T.: Tangent-bundle maps on the Grassmann manifold: application to empirical arithmetic averaging. IEEE Trans. Signal Process. 63, 155–168 (2015)

  12. Gao, B., Liu, X., Chen, X., Yuan, Y.: A new first-order algorithmic framework for optimization problems with orthogonality constraints. SIAM J. Optim. 28, 302–332 (2018)

  13. Gao, B., Thanh Son, N., Absil, P.-A., Stykel, T.: Riemannian optimization on the symplectic Stiefel manifold. SIAM J. Optim. 31, 1546–1575 (2021)

  14. Higham, N.J.: Functions of Matrices: Theory and Computation. SIAM, Philadelphia (2008)

  15. Hosseini, R., Sra, S.: An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization. Math. Program. 181, 187–223 (2020)

  16. Hosseini, S., Uschmajew, A.: A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds. SIAM J. Optim. 27, 173–189 (2017)

  17. Hu, J., Liu, X., Wen, Z., Yuan, Y.: A brief introduction to manifold optimization. J. Oper. Res. Soc. China 8, 199–248 (2020)

  18. Huang, W., Gallivan, K.A., Absil, P.-A.: A Broyden class of quasi-Newton methods for Riemannian optimization. SIAM J. Optim. 25, 1660–1685 (2015)

  19. Huang, W., Wei, K.: Riemannian proximal gradient methods. Math. Program. 194, 371–413 (2022)

  20. Jiang, B., Dai, Y.: A framework of constraint preserving update schemes for optimization on Stiefel manifold. Math. Program. 153, 535–575 (2015)

  21. Kressner, D., Steinlechner, M., Vandereycken, B.: Low-rank tensor completion by Riemannian optimization. BIT Numer. Math. 54, 447–468 (2014)

  22. Liu, C., Boumal, N.: Simple algorithms for optimization on Riemannian manifolds with constraints. Appl. Math. Optim. 82, 949–981 (2020)

  23. Ma, X., Kirby, M., Peterson, C.: Self-organizing mappings on the flag manifold with applications to hyper-spectral image data analysis. Neural Comput. Appl. 34, 39–49 (2022)

  24. Mishra, B., Sepulchre, R.: Riemannian preconditioning. SIAM J. Optim. 26, 635–660 (2016)

  25. Nguyen, D.: Closed-form geodesics and optimization for Riemannian logarithms of Stiefel and flag manifolds. J. Optim. Theory Appl. 194, 142–166 (2022)

  26. Nishimori, Y., Akaho, S.: Learning algorithms utilizing quasi-geodesic flows on the Stiefel manifold. Neurocomputing 67, 106–135 (2005)

  27. Nishimori, Y., Akaho, S., Plumbley, M. D.: Riemannian optimization method on the flag manifold for independent subspace analysis. In: International Conference on Independent Component Analysis and Signal Separation, pp. 295–302. Springer (2006)

  28. Nishimori, Y., Akaho, S., Plumbley, M. D.: Natural conjugate gradient on complex flag manifolds for complex independent subspace analysis. In: International Conference on Artificial Neural Networks, pp. 165–174. Springer (2008)

  29. Obara, M., Okuno, T., Takeda, A.: Sequential quadratic optimization for nonlinear optimization problems on Riemannian manifolds. SIAM J. Optim. 32, 822–853 (2022)

  30. O’Neill, B.: Semi-Riemannian Geometry with Applications to General Relativity. Academic Press, New York (1983)

  31. Pennec, X.: Barycentric subspace analysis on manifolds. Ann. Stat. 46, 2711–2746 (2018)

  32. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22, 596–627 (2012)

  33. Sato, H.: A Dai-Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput. Optim. Appl. 64, 101–118 (2016)

  34. Sato, H.: Riemannian Optimization and Its Applications. Springer Nature, New York (2021)

  35. Sato, H.: Riemannian conjugate gradient methods: general framework and specific algorithms with convergence analyses. SIAM J. Optim. 32, 2690–2717 (2022)

  36. Sato, H., Iwai, T.: Optimization algorithms on the Grassmann manifold with application to matrix eigenvalue problems. Japan J. Indust. Appl. Math. 31, 355–400 (2014)

  37. Sato, H., Kasai, H., Mishra, B.: Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport. SIAM J. Optim. 29, 1444–1472 (2019)

  38. Steinlechner, M.: Riemannian optimization for high-dimensional tensor completion. SIAM J. Sci. Comput. 38, S461–S484 (2016)

  39. Vandereycken, B.: Low-rank matrix completion by Riemannian optimization. SIAM J. Optim. 23, 1214–1236 (2013)

  40. Weber, M., Sra, S.: Riemannian optimization via Frank–Wolfe methods. Math. Program. 199, 525–556 (2023)

  41. Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Math. Program. 142, 397–434 (2013)

  42. Ye, K., Wong, K.S.-W., Lim, L.-H.: Optimization on flag manifolds. Math. Program. 194, 621–660 (2022)

  43. Zhang, J., Ma, S., Zhang, S.: Primal-dual optimization algorithms over Riemannian manifolds: an iteration complexity analysis. Math. Program. 184, 445–490 (2020)

  44. Zhou, Y., Bao, C., Ding, C., Zhu, J.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Math. Program. 201, 1–61 (2023)

  45. Zhu, X.: A Riemannian conjugate gradient method for optimization on the Stiefel manifold. Comput. Optim. Appl. 67, 73–110 (2017)

  46. Zhu, X., Sato, H.: Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds. Adv. Comput. Math. 47, 56 (2021)

Acknowledgements

The authors would like to thank the associate editor and two anonymous referees for their valuable comments and suggestions, as well as Dr. Ke Ye for sharing some of the MATLAB code in [42].

Funding

This work was financially supported by the National Natural Science Foundation of China (Grant Nos. 12271342 and 11601317).

Author information

Corresponding author

Correspondence to Xiaojing Zhu.

Ethics declarations

Conflict of interest

  The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Zhu, X., Shen, C. Practical gradient and conjugate gradient methods on flag manifolds. Comput Optim Appl (2024). https://doi.org/10.1007/s10589-024-00568-6
