
A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions

Computational Optimization and Applications

Abstract

This article proposes a new Riemannian conjugate gradient method and presents a global convergence analysis. The existing Fletcher–Reeves-type Riemannian conjugate gradient method is guaranteed to converge globally only when implemented with the strong Wolfe conditions. In contrast, the Dai–Yuan-type Euclidean conjugate gradient method generates globally convergent sequences under the weak Wolfe conditions. This article generalizes Dai–Yuan's Euclidean algorithm to a Riemannian algorithm that requires only the weak Wolfe conditions. The global convergence of the proposed method is proved by means of the scaled vector transport associated with the differentiated retraction. Numerical experiments demonstrate the effectiveness of the proposed algorithm.
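
For concreteness, the weak Wolfe conditions the abstract refers to can be written in their Riemannian form using a retraction R and its differentiated retraction DR, with constants 0 < c_1 < c_2 < 1; this is a standard statement (the strong Wolfe conditions replace the second inequality with an absolute-value bound), not a verbatim quotation from the article.

```latex
% Weak Wolfe conditions for a step \alpha_k along a search direction
% \eta_k in the tangent space at x_k, with 0 < c_1 < c_2 < 1:
f\bigl(R_{x_k}(\alpha_k \eta_k)\bigr)
  \le f(x_k) + c_1 \alpha_k \,\bigl\langle \operatorname{grad} f(x_k), \eta_k \bigr\rangle_{x_k},
\qquad
\bigl\langle \operatorname{grad} f\bigl(R_{x_k}(\alpha_k \eta_k)\bigr),\,
  \mathrm{D}R_{x_k}(\alpha_k \eta_k)[\eta_k] \bigr\rangle
  \ge c_2 \,\bigl\langle \operatorname{grad} f(x_k), \eta_k \bigr\rangle_{x_k}.
```

The sketch below illustrates a Dai–Yuan-type Riemannian conjugate gradient iteration of this flavor on the unit sphere, minimizing a Rayleigh quotient. It is a minimal illustration under stated assumptions, not the article's reference implementation: the sphere retraction, the norm-capped ("scaled") transport, the bisection weak-Wolfe search, and all function names are choices made here for the example.

```python
import numpy as np

def proj(x, v):
    """Orthogonal projection of v onto the tangent space {w : x.w = 0} at x."""
    return v - (x @ v) * x

def retract(x, eta):
    """Retraction on the unit sphere: R_x(eta) = (x + eta) / ||x + eta||."""
    y = x + eta
    return y / np.linalg.norm(y)

def dretract(x, eta, xi):
    """Differentiated retraction DR_x(eta)[xi] of the sphere retraction."""
    y = x + eta
    n = np.linalg.norm(y)
    return xi / n - (y @ xi) / n**3 * y

def transport(x, eta, xi):
    """Scaled vector transport: move xi along eta via the differentiated
    retraction, then rescale so the transported vector is never longer
    than xi (this norm cap is what the convergence analysis relies on)."""
    t = dretract(x, eta, xi)
    nt, nxi = np.linalg.norm(t), np.linalg.norm(xi)
    return t if nt <= nxi else (nxi / nt) * t

def weak_wolfe_step(f, grad, x, eta, c1=1e-4, c2=0.9):
    """Bisection search for a step satisfying the weak Wolfe conditions."""
    g0 = grad(x) @ eta                      # directional derivative at 0 (< 0)
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(60):
        x_new = retract(x, alpha * eta)
        if f(x_new) > f(x) + c1 * alpha * g0:                # Armijo fails
            hi = alpha
        elif grad(x_new) @ dretract(x, alpha * eta, eta) < c2 * g0:
            lo = alpha                                       # curvature fails
        else:
            return alpha
        alpha = (lo + hi) / 2 if hi < np.inf else 2 * lo
    return alpha

def dy_rcg(f, grad, x, tol=1e-8, max_iter=500):
    """Dai–Yuan-type Riemannian conjugate gradient iteration (sketch)."""
    g = grad(x)
    eta = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = weak_wolfe_step(f, grad, x, eta)
        x_new = retract(x, alpha * eta)
        g_new = grad(x_new)
        t_eta = transport(x, alpha * eta, eta)
        # Dai–Yuan-type coefficient:
        #   beta = ||g_{k+1}||^2 / ( <g_{k+1}, T(eta_k)> - <g_k, eta_k> )
        # the weak Wolfe curvature condition keeps the denominator positive
        beta = (g_new @ g_new) / (g_new @ t_eta - g @ eta)
        eta = -g_new + beta * t_eta
        x, g = x_new, g_new
    return x

# Usage: smallest eigenvalue of a symmetric matrix via the Rayleigh quotient.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 30)); A = (A + A.T) / 2
f = lambda x: x @ A @ x
grad = lambda x: proj(x, 2 * A @ x)       # Riemannian gradient on the sphere
x0 = rng.standard_normal(30); x0 /= np.linalg.norm(x0)
x_star = dy_rcg(f, grad, x0)
print(f(x_star), np.linalg.eigvalsh(A)[0])   # the two values should agree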


References

1. Absil, P.A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

2. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

3. Edelman, A., Arias, T.A., Smith, S.T.: The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl. 20(2), 303–353 (1998)

4. Fletcher, R.: Practical Methods of Optimization. Wiley, New York (2013)

5. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)

6. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)

7. Lemaréchal, C.: A view of line-searches. In: Optimization and Optimal Control, pp. 59–78. Springer, Berlin (1981)

8. Nocedal, J., Wright, S.: Numerical Optimization. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006)

9. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22(2), 596–627 (2012)

10. Sato, H., Iwai, T.: A new, globally convergent Riemannian conjugate gradient method. Optimization 64(4), 1011–1031 (2015)


Acknowledgments

The author would like to thank the anonymous referees for their valuable comments that helped improve the paper significantly. This work was supported by JSPS (Japan Society for the Promotion of Science) KAKENHI Grant Number 26887037.

Author information


Correspondence to Hiroyuki Sato.


About this article


Cite this article

Sato, H. A Dai–Yuan-type Riemannian conjugate gradient method with the weak Wolfe conditions. Comput Optim Appl 64, 101–118 (2016). https://doi.org/10.1007/s10589-015-9801-1

