In this paper we extend a nonconvex Nesterov-type accelerated gradient (AG) method to optimization over the Grassmann and Stiefel manifolds. We propose an exponential-based AG algorithm for the Grassmann manifold and a retraction-based AG algorithm, built on the Cayley transform, for both the Grassmann and Stiefel manifolds. Under mild assumptions, we establish the global rate of convergence of the exponential-based AG algorithm. Under additional but reasonable assumptions, the same global rate of convergence is obtained for the retraction-based AG algorithm. The proofs fully exploit the special geometric structures of the two manifolds. We also discuss how to compute the geometric tools that serve as ingredients of our AG algorithms. Preliminary numerical results on three synthetic problems demonstrate the efficiency of our AG methods.
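As context for the retraction-based algorithm, the sketch below illustrates the Cayley-transform retraction on the Stiefel manifold in its standard Wen–Yin form. This is only a minimal illustration of that geometric building block, not the authors' AG algorithm; all variable names and the example dimensions are our own assumptions.

```python
import numpy as np

def cayley_retraction(X, G, tau):
    """Cayley-transform retraction on the Stiefel manifold St(n, p).

    X   : n x p matrix with orthonormal columns (X.T @ X = I_p)
    G   : n x p Euclidean gradient of the objective at X (stand-in here)
    tau : step size

    Returns Y = (I + tau/2 * A)^{-1} (I - tau/2 * A) X with
    A = G X^T - X G^T. Since A is skew-symmetric, the Cayley factor
    is orthogonal, so Y remains on St(n, p).
    """
    n = X.shape[0]
    A = G @ X.T - X @ G.T          # skew-symmetric: A.T == -A
    I = np.eye(n)
    # Solve (I + tau/2 A) Y = (I - tau/2 A) X instead of forming an inverse.
    return np.linalg.solve(I + (tau / 2) * A, (I - (tau / 2) * A) @ X)

# Orthonormality is preserved by the Cayley map:
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((6, 3)))   # random point on St(6, 3)
G = rng.standard_normal((6, 3))                    # stand-in gradient
Y = cayley_retraction(X, G, 0.1)
print(np.allclose(Y.T @ Y, np.eye(3)))  # True
```

Because the update only requires one linear solve, this retraction avoids the matrix exponentials and parallel transports needed by an exponential-based scheme, which is the usual motivation for Cayley-based methods on these manifolds.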
Citation
Technical report, Shanghai University of Electric Power, July 2022