An Adaptive Riemannian Gradient Method Without Function Evaluations
In this paper we propose an adaptive gradient method for optimization on Riemannian manifolds. The stepsize update rule relies only on gradient evaluations. Assuming that the objective function is bounded from below and that its gradient field is Lipschitz continuous, we establish worst-case complexity bounds for the number of gradient evaluations that the …
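To make the idea concrete, the following is a minimal illustrative sketch of a Riemannian gradient method on the unit sphere whose stepsize is set from gradient information alone, with no function evaluations or line search. The particular stepsize rule below (a secant-type estimate of the local Lipschitz constant from successive gradients) is a generic stand-in, not the paper's exact update; the objective in the usage example and all function names are hypothetical placeholders.

```python
# Sketch only: adaptive Riemannian gradient descent on the unit sphere.
# The stepsize is adapted using gradients only (no objective evaluations).
import numpy as np

def riemannian_grad(x, euclid_grad):
    """Project a Euclidean gradient onto the tangent space of the sphere at x."""
    return euclid_grad - np.dot(euclid_grad, x) * x

def exp_map(x, v):
    """Exponential map on the unit sphere: move from x along the tangent vector v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-16:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def adaptive_rgd(euclid_grad_fn, x0, max_iter=500, tol=1e-8, L0=1.0):
    """Riemannian gradient descent with a stepsize set only from gradient evaluations."""
    x = x0 / np.linalg.norm(x0)
    L = L0                                   # current Lipschitz-type estimate
    g = riemannian_grad(x, euclid_grad_fn(x))
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        step = -(1.0 / L) * g                # stepsize 1/L, no line search
        x_new = exp_map(x, step)
        g_new = riemannian_grad(x_new, euclid_grad_fn(x_new))
        # Transport the old gradient by projection onto the new tangent space and
        # update the Lipschitz estimate from the gradient difference (secant-type rule).
        denom = np.linalg.norm(step)
        if denom > 1e-16:
            L = max(L0, np.linalg.norm(g_new - riemannian_grad(x_new, g)) / denom)
        x, g = x_new, g_new
    return x

# Usage example (hypothetical): leading eigenvector of a symmetric matrix,
# i.e. minimize -x^T A x over the unit sphere.
A = np.diag([3.0, 2.0, 1.0])
x_star = adaptive_rgd(lambda x: -2.0 * A @ x, x0=np.ones(3))
```

The point of the sketch is only the structure of the iteration: each step needs one new gradient, and the stepsize is refreshed from gradients already computed, so the per-iteration cost is measured purely in gradient evaluations, matching the setting of the complexity analysis described above.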