Abstract
In this chapter, we discuss conjugate gradient (CG) methods on Riemannian manifolds, which we also call Riemannian CG methods. They can be regarded as modifications of the Riemannian steepest descent method. In particular, we analyze the Fletcher–Reeves-type and Dai–Yuan-type Riemannian CG methods and prove their global convergence under suitable conditions.
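To make the idea concrete, the following is a minimal sketch of a Fletcher–Reeves-type Riemannian CG iteration on the unit sphere, applied to minimizing the Rayleigh quotient \(f(x) = x^T A x\) (whose minimizer is a unit eigenvector for the smallest eigenvalue of \(A\)). The function name `sphere_fr_cg`, the normalization retraction, the projection-based vector transport, and the simple backtracking line search are illustrative assumptions, not the specific choices analyzed in the chapter.

```python
import numpy as np

def sphere_fr_cg(A, x0, max_iter=200, tol=1e-8):
    """Fletcher-Reeves-type Riemannian CG on the unit sphere for f(x) = x^T A x.

    Illustrative sketch: uses normalization as the retraction, orthogonal
    projection as the vector transport, and a crude Armijo backtracking line
    search in place of the (strong) Wolfe conditions used in the analysis.
    """
    proj = lambda x, v: v - (x @ v) * x        # projection onto tangent space at x
    x = x0 / np.linalg.norm(x0)
    g = proj(x, 2.0 * A @ x)                   # Riemannian gradient of f at x
    d = -g                                     # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, fx = 1.0, x @ A @ x
        while True:                            # Armijo-type backtracking
            x_new = x + t * d
            x_new /= np.linalg.norm(x_new)     # retraction onto the sphere
            if x_new @ A @ x_new <= fx + 1e-4 * t * (g @ d) or t < 1e-12:
                break
            t *= 0.5
        g_new = proj(x_new, 2.0 * A @ x_new)
        beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves coefficient
        d = -g_new + beta * proj(x_new, d)     # transport old direction, combine
        if g_new @ d >= 0.0:                   # safeguard: reset if not a descent dir.
            d = -g_new
        x, g = x_new, g_new
    return x
```

With a stronger line search (strong Wolfe conditions), the safeguard reset is unnecessary for the Fletcher–Reeves method; it is included here only because the backtracking above does not enforce those conditions.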
Notes
- 1.
The statement is equivalent to \(d_k \ne 0\) for \(k = 0, 1, \dots , n-1\) and \(\nabla f(x_k)^T d_l = \nabla f(x_k)^T \nabla f(x_l) = 0\) for \(k, l = 0, 1, \dots , n\) with \(l < k\). The expression (4.12) is stated in its given form for ease of the proof by induction.
- 2.
In practical computation, Algorithm 4.1 may not exactly solve \(Ax = b\) within n iterations due to rounding errors.
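The finite-termination property mentioned above is easy to observe numerically: in exact arithmetic, linear CG for a symmetric positive-definite system \(Ax = b\) terminates within \(n\) iterations, and in double precision the residual after \(n\) iterations is typically tiny rather than exactly zero. The sketch below (the function name `conjugate_gradient` is our own; it is a standard linear CG loop, not a verbatim transcription of Algorithm 4.1) illustrates this.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-12):
    """Linear CG for a symmetric positive-definite system Ax = b.

    In exact arithmetic this terminates within n iterations; in floating-point
    arithmetic rounding errors mean the residual is only approximately zero.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x                         # residual = -(gradient of the quadratic)
    d = r.copy()                          # initial search direction
    for _ in range(n):                    # at most n iterations in exact arithmetic
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)  # makes d_{k+1} A-conjugate to d_k
        d = r_new + beta * d
        r = r_new
    return x
```

Running this on a well-conditioned positive-definite system and comparing against a direct solve shows the residual after at most \(n\) iterations is at the level of rounding error.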
Copyright information
© 2021 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Sato, H. (2021). Conjugate Gradient Methods on Riemannian Manifolds. In: Riemannian Optimization and Its Applications. SpringerBriefs in Electrical and Computer Engineering. Springer, Cham. https://doi.org/10.1007/978-3-030-62391-3_4
Print ISBN: 978-3-030-62389-0
Online ISBN: 978-3-030-62391-3
eBook Packages: Intelligent Technologies and Robotics (R0)