In this paper we extend the nonconvex version of Nesterov's accelerated gradient (AG) method to optimization over the Grassmann and Stiefel manifolds. We propose an exponential-based AG algorithm for the Grassmann manifold and a retraction-based AG algorithm, built on the Cayley transform, for both the Grassmann and Stiefel manifolds. Under mild assumptions, we establish the global rate of convergence of the exponential-based AG algorithm. Under additional but reasonable assumptions on the retraction and vector transport, the same global rate of convergence holds for the retraction-based AG algorithm. We also discuss in detail how to compute the geometric objects required by our AG algorithms. Preliminary numerical results demonstrate the potential efficiency of our AG methods.
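To make the Cayley-transform retraction mentioned above concrete, the following is a minimal numerical sketch, not the paper's implementation. It uses the standard skew-symmetric construction W = G Xᵀ − X Gᵀ (in the style of Wen and Yin's feasible method), under which the update X⁺ = (I + τ/2 W)⁻¹(I − τ/2 W) X stays exactly on the Stiefel manifold; the function name and interface are illustrative assumptions.

```python
import numpy as np

def cayley_retraction(X, G, tau):
    """One Cayley-transform step on the Stiefel manifold St(n, p).

    X   : n-by-p matrix with orthonormal columns (X.T @ X = I_p)
    G   : Euclidean gradient of the objective at X (n-by-p)
    tau : step size

    Illustrative sketch only; builds the skew-symmetric matrix
    W = G X^T - X G^T and applies the Cayley update
        X+ = (I + tau/2 W)^{-1} (I - tau/2 W) X,
    which preserves X^T X = I exactly (up to rounding).
    """
    n = X.shape[0]
    W = G @ X.T - X @ G.T          # skew-symmetric: W.T == -W
    I = np.eye(n)
    # Solve the linear system instead of forming the inverse explicitly.
    return np.linalg.solve(I + 0.5 * tau * W, (I - 0.5 * tau * W) @ X)
```

A quick feasibility check: starting from any X with orthonormal columns, the columns of the returned matrix remain orthonormal, which is the property that makes the Cayley transform attractive as a retraction.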
Technical report, Shanghai University of Electric Power, July 2022