Riemannian conjugate gradient methods with inverse retraction

We propose a new class of Riemannian conjugate gradient (CG) methods in which inverse retraction, rather than vector transport, is used to construct the search direction. Existing methods typically use differentiated retraction as the vector transport that moves the previous search direction into the current tangent space. We adopt a different perspective, motivated by the fact that the inverse retraction directly measures the displacement from the current point to the previous one as a tangent vector at the current point. The proposed algorithm is implemented with both the Fletcher–Reeves and the Dai–Yuan formulae, and global convergence is established under modified Riemannian Wolfe conditions. Computational details of practical inverse retractions on the Stiefel and fixed-rank manifolds are discussed. Numerical results for the Brockett cost function minimization problem, the joint diagonalization problem, and the low-rank matrix completion problem demonstrate the potential effectiveness of Riemannian CG with inverse retraction.
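To make the core idea concrete, the sketch below runs a Riemannian CG iteration on the unit sphere, a simple manifold where both the retraction and its inverse have closed forms. The replacement of vector transport by the scaled, negated inverse retraction of the previous iterate follows the idea described in the abstract, but the sphere example, the exact scaling by the previous step size, and the Armijo backtracking line search (in place of the paper's modified Riemannian Wolfe conditions) are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def retract(x, v):
    # Retraction on the sphere S^{n-1}: R_x(v) = (x + v) / ||x + v||.
    y = x + v
    return y / np.linalg.norm(y)

def inverse_retract(x, y):
    # Inverse retraction: the tangent vector v at x with R_x(v) = y,
    # well defined whenever x @ y > 0.
    return y / (x @ y) - x

def riem_grad(A, x):
    # Riemannian gradient of f(x) = x^T A x: project the Euclidean
    # gradient 2 A x onto the tangent space at x.
    g = 2 * A @ x
    return g - (x @ g) * x

def cg_inverse_retraction(A, x, maxiter=500, tol=1e-8):
    g = riem_grad(A, x)
    d = -g
    for _ in range(maxiter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:      # safeguard: fall back to steepest descent
            d = -g
        # Armijo backtracking line search (illustrative stand-in for the
        # paper's modified Riemannian Wolfe conditions).
        f0, slope, alpha = x @ A @ x, g @ d, 1.0
        x_new = retract(x, alpha * d)
        while x_new @ A @ x_new > f0 + 1e-4 * alpha * slope:
            alpha *= 0.5
            x_new = retract(x, alpha * d)
        g_new = riem_grad(A, x_new)
        beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves
        # Key step: -R_{x_new}^{-1}(x) / alpha stands in for the vector
        # transport of the previous direction d to the new tangent space.
        d = -g_new - beta * inverse_retract(x_new, x) / alpha
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2
    x0 = rng.standard_normal(n); x0 /= np.linalg.norm(x0)
    x = cg_inverse_retraction(A, x0)
    # Minimizing x^T A x over the sphere recovers the smallest eigenvalue,
    # which gives a quick correctness check.
    print("f(x*)   =", x @ A @ x)
    print("lam_min =", np.linalg.eigvalsh(A)[0])
```

On the sphere, minimizing the Rayleigh quotient x^T A x is a one-dimensional instance of the Brockett cost function mentioned above, so the same direction-update pattern carries over to the Stiefel-manifold experiments in the paper, with the closed-form sphere retraction replaced by a Stiefel retraction and its inverse.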

Citation

2020-001, College of Mathematics and Physics, Shanghai University of Electric Power, May 2020
