A worst-case complexity analysis for Riemannian non-monotone line-search methods

In this paper we deal with non-monotone line-search methods for minimizing a smooth cost function on a Riemannian
manifold. In particular, we study the number of iterations this class of algorithms needs to reach an
ε-approximate stationary point. Specifically, we prove that, under a Lipschitz-type regularity condition
on the pullbacks of the cost function to the tangent spaces of the manifold and other mild assumptions,
Riemannian non-monotone line-search methods generate a point with Riemannian gradient norm smaller than
ε in O(1/ε^2) iterations. Our worst-case complexity analysis covers a wide variety of
non-monotone strategies from the literature. Additionally, we establish global convergence for this
family of methods. The bounds obtained in our analysis agree with those known for line-search methods in
unconstrained nonlinear optimization, and hence our results generalize previous work.
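
To make the setting concrete, the following is a minimal sketch of a Riemannian non-monotone line-search method of the kind analyzed here, not the paper's exact algorithm: it minimizes a quadratic cost on the unit sphere with the projection retraction and a max-type (Grippo-Lampariello-Lucidi style) non-monotone Armijo condition. All specific choices (the test problem, the memory length, the step-size constants) are illustrative assumptions.

# Illustrative sketch only: non-monotone Armijo line search on the unit sphere S^{n-1}.
import numpy as np

def riemannian_grad(x, egrad):
    # Project the Euclidean gradient onto the tangent space at x.
    return egrad - np.dot(egrad, x) * x

def retract(x, v):
    # Projection retraction on the sphere: R_x(v) = (x + v) / ||x + v||.
    y = x + v
    return y / np.linalg.norm(y)

def nonmonotone_linesearch(f, egrad_f, x0, eps=1e-6, max_iter=10000,
                           memory=10, alpha0=1.0, beta=0.5, sigma=1e-4):
    x = x0 / np.linalg.norm(x0)
    f_hist = [f(x)]                      # recent cost values for the max-type test
    for k in range(max_iter):
        g = riemannian_grad(x, egrad_f(x))
        gnorm = np.linalg.norm(g)
        if gnorm <= eps:                 # eps-approximate stationary point reached
            return x, k
        d = -g                           # steepest-descent direction in the tangent space
        f_ref = max(f_hist[-memory:])    # non-monotone reference value
        alpha = alpha0
        # Backtrack until the non-monotone Armijo condition holds.
        while f(retract(x, alpha * d)) > f_ref - sigma * alpha * gnorm**2:
            alpha *= beta
        x = retract(x, alpha * d)
        f_hist.append(f(x))
    return x, max_iter

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    M = rng.standard_normal((n, n))
    A = M + M.T                          # symmetric test matrix
    f = lambda x: x @ A @ x              # quadratic cost restricted to the sphere
    egrad_f = lambda x: 2.0 * A @ x
    x_star, iters = nonmonotone_linesearch(f, egrad_f, rng.standard_normal(n))
    print(f"gradient norm below tolerance after {iters} iterations, f = {f(x_star):.6f}")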
