
Hybrid Riemannian conjugate gradient methods with global convergence properties

Abstract

This paper presents hybrid Riemannian conjugate gradient methods together with global convergence analyses under the strong Wolfe conditions. The main idea of the proposed methods is to combine the good global convergence properties of the Dai–Yuan method with the efficient numerical performance of the Hestenes–Stiefel method. One of the proposed algorithms is a generalization to Riemannian manifolds of the hybrid conjugate gradient method of Dai and Yuan in Euclidean space. Numerical experiments on several Riemannian optimization problems show that the proposed methods compare well with existing methods. Python implementations of the methods used in the numerical experiments are available at https://github.com/iiduka-researches/202008-hybrid-rcg.
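As background for the hybrid idea, the following is a minimal Euclidean sketch of the hybrid conjugate gradient parameter of Dai and Yuan [5], which clips the Hestenes–Stiefel parameter into the interval $[0, \beta_k^{DY}]$; the Riemannian methods proposed in the paper replace the Euclidean inner products with a Riemannian metric and move the previous search direction between tangent spaces by a vector transport. The function names here are illustrative, not the paper's implementation.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old):
    """Euclidean hybrid CG parameter of Dai and Yuan (2001):
    beta_k = max(0, min(beta_HS, beta_DY)).

    beta_HS and beta_DY share the denominator <y_k, d_k>,
    where y_k = g_{k+1} - g_k, so only the numerators differ.
    """
    y = g_new - g_old
    denom = float(y @ d_old)
    beta_hs = float(g_new @ y) / denom       # Hestenes-Stiefel numerator
    beta_dy = float(g_new @ g_new) / denom   # Dai-Yuan numerator
    return max(0.0, min(beta_hs, beta_dy))

def next_direction(g_new, g_old, d_old):
    # d_{k+1} = -g_{k+1} + beta_k * d_k; on a manifold, d_old would first
    # be transported to the tangent space at the new iterate.
    return -g_new + hybrid_beta(g_new, g_old, d_old) * d_old
```

Under the Wolfe conditions the denominator $\langle y_k, d_k \rangle$ is positive whenever $d_k$ is a descent direction, so the hybrid parameter is well defined.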



Availability of data and materials

Not applicable.

Notes

  1. The formulas defined by (26) and (27) satisfy $-\sigma \le r_k \le 1$.

References

  1. Absil, P.-A., Gallivan, K.A.: Joint diagonalization on the oblique manifold for independent component analysis. In: 2006 IEEE International Conference on Acoustics, Speech and Signal Processing Proceedings, vol. 5, p. V (2006)
  2. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)
  3. Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)
  4. Dai, Y.-H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
  5. Dai, Y.-H., Yuan, Y.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103(1–4), 33–47 (2001)
  6. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
  7. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)
  8. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
  9. Hawe, S., Kleinsteuber, M., Diepold, K.: Analysis operator learning and its application to image reconstruction. IEEE Trans. Image Process. 22(6), 2138–2150 (2013)
  10. Hestenes, M.R., Stiefel, E.: Methods of Conjugate Gradients for Solving Linear Systems. NBS, Washington (1952)
  11. Hu, Y., Storey, C.: Global convergence result for conjugate gradient methods. J. Optim. Theory Appl. 71(2), 399–405 (1991)
  12. Motzkin, T.S., Straus, E.G.: Maxima for graphs and a new proof of a theorem of Turán. Can. J. Math. 17, 533–540 (1965)
  13. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. ESAIM Math. Model. Numer. Anal. 3(R1), 35–43 (1969)
  14. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22(2), 596–627 (2012)
  15. Sato, H.: A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput. Optim. Appl. 64(1), 101–118 (2016)
  16. Sato, H., Iwai, T.: A new, globally convergent Riemannian conjugate gradient method. Optimization 64(4), 1011–1031 (2015)
  17. Selvan, S.E., Amato, U., Gallivan, K.A., Qi, C., Carfora, M.F., Larobina, M., Alfano, B.: Descent algorithms on oblique manifold for source-adaptive ICA contrast. IEEE Trans. Neural Netw. Learn. Syst. 23(12), 1930–1947 (2012)
  18. Smith, S.T.: Optimization techniques on Riemannian manifolds. Fields Inst. Commun. 3(3), 113–135 (1994)
  19. Touati-Ahmed, D., Storey, C.: Efficient hybrid conjugate gradient techniques. J. Optim. Theory Appl. 64(2), 379–397 (1990)
  20. Townsend, J., Koep, N., Weichwald, S.: Pymanopt: a Python toolbox for optimization on manifolds using automatic differentiation. J. Mach. Learn. Res. 17(1), 4755–4759 (2016)
  21. Vandereycken, B.: Low-rank matrix completion by Riemannian optimization. SIAM J. Optim. 23(2), 1214–1236 (2013)
  22. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)
  23. Wolfe, P.: Convergence conditions for ascent methods. II: Some corrections. SIAM Rev. 13(2), 185–188 (1971)
  24. Yuan, H., Gu, X., Lai, R., Wen, Z.: Global optimization with orthogonality constraints via stochastic diffusion on manifold. J. Sci. Comput. 80(2), 1139–1170 (2019)


Acknowledgements

We are sincerely grateful to the editor and the anonymous reviewer for helping us improve the original manuscript. This work was supported by JSPS KAKENHI Grant Number JP18K11184.

Author information


Corresponding author

Correspondence to Hiroyuki Sakai.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Code availability

Python implementations of the methods used in the numerical experiments are available at https://github.com/iiduka-researches/202008-hybrid-rcg.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sakai, H., Iiduka, H. Hybrid Riemannian conjugate gradient methods with global convergence properties. Comput Optim Appl 77, 811–830 (2020). https://doi.org/10.1007/s10589-020-00224-9


Keywords

  • Conjugate gradient method
  • Riemannian optimization
  • Hybrid conjugate gradient method
  • Global convergence
  • Strong Wolfe conditions