The well-known symmetric rank-one trust-region method---where the Hessian approximation is generated by the symmetric rank-one update---is generalized to the problem of minimizing a real-valued function over a $d$-dimensional Riemannian manifold. The generalization relies on basic differential-geometric concepts, such as tangent spaces, Riemannian metrics, and the Riemannian gradient, as well as on the more recent notions of (first-order) retraction and vector transport. The new method, called RTR-SR1, is shown to converge globally and $(d+1)$-step q-superlinearly to stationary points of the objective function. A limited-memory version, referred to as LRTR-SR1, is also introduced. In this context, novel efficient strategies are presented to construct a vector transport on a submanifold of a Euclidean space. Numerical experiments---Rayleigh quotient minimization on the sphere and a joint diagonalization problem on the Stiefel manifold---illustrate the value of the new methods.
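To make the ingredients named in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of a symmetric rank-one trust-region iteration on the unit sphere, applied to Rayleigh quotient minimization. It assumes the simplest possible choices: the retraction is normalization, the vector transport is orthogonal projection onto the new tangent space, the trust-region subproblem is solved by a basic truncated conjugate gradient, and the Hessian approximation is carried between tangent spaces by projecting the operator. All names (`rtr_sr1`, `tcg`, the acceptance and radius-update thresholds, and the SR1 safeguard constant) are illustrative assumptions, not taken from the paper.

```python
# Sketch only: a plain SR1 trust-region iteration on the sphere S^{n-1} in R^n,
# under the assumptions stated above; the paper's RTR-SR1 uses more refined
# vector transports and convergence safeguards.
import numpy as np

def project(x, v):
    """Orthogonal projection of an ambient vector v onto the tangent space at x."""
    return v - (x @ v) * x

def retract(x, v):
    """First-order retraction on the sphere: step in the tangent direction, then normalize."""
    w = x + v
    return w / np.linalg.norm(w)

def _to_boundary(s, d, delta):
    """Positive tau with ||s + tau*d|| = delta."""
    a, b, c = d @ d, 2 * (s @ d), s @ s - delta**2
    return (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)

def tcg(g, B, delta, tol=1e-10, max_it=100):
    """Truncated CG for min_s g^T s + 0.5 s^T B s subject to ||s|| <= delta."""
    s, r, d = np.zeros_like(g), g.copy(), -g.copy()
    for _ in range(max_it):
        Bd = B @ d
        dBd = d @ Bd
        if dBd <= 0:                         # negative curvature: follow d to the boundary
            return s + _to_boundary(s, d, delta) * d
        alpha = (r @ r) / dBd
        s_new = s + alpha * d
        if np.linalg.norm(s_new) >= delta:   # left the trust region: stop on the boundary
            return s + _to_boundary(s, d, delta) * d
        r_new = r + alpha * Bd
        if np.linalg.norm(r_new) < tol:
            return s_new
        d = -r_new + ((r_new @ r_new) / (r @ r)) * d
        s, r = s_new, r_new
    return s

def rtr_sr1(A, x0, delta=1.0, max_iter=200, tol=1e-8, nu=1e-8):
    """SR1 trust-region sketch for f(x) = x^T A x on the unit sphere."""
    x = x0 / np.linalg.norm(x0)
    B = np.eye(A.shape[0])                    # initial Hessian approximation
    f = lambda z: z @ A @ z
    grad = lambda z: project(z, 2 * A @ z)    # Riemannian gradient of the Rayleigh quotient
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        eta = tcg(g, B, delta)                # approximate trust-region subproblem solution
        x_new = retract(x, eta)
        pred = -(g @ eta + 0.5 * eta @ (B @ eta))   # model decrease
        rho = (f(x) - f(x_new)) / pred if pred > 0 else -1.0
        # SR1 update with transported quantities (vector transport by projection);
        # the operator B is transported crudely by sandwiching with the projector.
        s = project(x_new, eta)
        y = grad(x_new) - project(x_new, g)
        P = np.eye(x.size) - np.outer(x_new, x_new)
        B_t = P @ B @ P
        r_vec = y - B_t @ s
        if abs(r_vec @ s) > nu * np.linalg.norm(s) * np.linalg.norm(r_vec):
            B_t = B_t + np.outer(r_vec, r_vec) / (r_vec @ s)   # rank-one correction
        if rho > 0.1:                         # accept the step
            x, g, B = x_new, grad(x_new), B_t
        delta = 2 * delta if rho > 0.75 else (0.25 * delta if rho < 0.1 else delta)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = (M + M.T) / 2
    x = rtr_sr1(A, rng.standard_normal(50))
    print("Rayleigh quotient:", x @ A @ x)
    print("Smallest eigenvalue:", np.linalg.eigvalsh(A)[0])
```

For the Rayleigh quotient, the stationary points on the sphere are the unit eigenvectors of $A$, so a run of the sketch should drive $x^T A x$ toward the smallest eigenvalue; the limited-memory variant described in the abstract would replace the dense matrix $B$ with a short history of the pairs $(s, y)$.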
Citation
Tech. report UCL-INMA-2013.03-v1, April 18, 2013