The Alternating Direction Method of Multipliers (ADMM) is widely recognized for its efficiency in solving separable optimization problems. However, its application to optimization on Riemannian manifolds remains a significant challenge. In this paper, we propose a novel inertial Riemannian gradient ADMM (iRG-ADMM) to solve Riemannian optimization problems with nonlinear constraints. Our key contributions are as follows: (i) we introduce an inertial strategy applied to the Riemannian gradient, enabling faster convergence for smooth subproblems constrained to Riemannian manifolds; (ii) for nonsmooth subproblems in Euclidean space, we solve them efficiently using well-established existing algorithms; and (iii) we establish the $\epsilon$-stationarity of iRG-ADMM under mild conditions. Finally, we demonstrate the effectiveness of iRG-ADMM through extensive numerical experiments, including applications to Sparse Principal Component Analysis (SPCA), highlighting its superior performance compared to existing methods.
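To illustrate the inertial Riemannian gradient idea in contribution (i), the following is a minimal sketch of a momentum-accelerated Riemannian gradient step on the unit sphere, one of the simplest manifolds arising in SPCA-type problems. This is an illustrative assumption, not the paper's exact iRG-ADMM subproblem solver: the step size `step`, momentum weight `beta`, and the projection-based transport of the inertial term are all hypothetical choices for the sketch.

```python
import numpy as np

def project_tangent(x, v):
    # Project a Euclidean vector v onto the tangent space of the unit sphere at x.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction: step in the tangent direction, then renormalize onto the sphere.
    y = x + v
    return y / np.linalg.norm(y)

def inertial_riemannian_gd(grad_f, x0, step=0.1, beta=0.5, iters=200):
    """Riemannian gradient descent on the unit sphere with an inertial
    (momentum) term applied to the Riemannian gradient.
    grad_f: Euclidean gradient of the smooth objective f (hypothetical input)."""
    x = x0 / np.linalg.norm(x0)
    m = np.zeros_like(x)  # inertial term
    for _ in range(iters):
        rg = project_tangent(x, grad_f(x))        # Riemannian gradient at x
        m = beta * project_tangent(x, m) + rg     # carry momentum into T_x (by projection)
        x = retract(x, -step * m)
    return x

# Usage: minimize the Rayleigh quotient f(x) = x^T A x over the sphere,
# whose minimizer is the eigenvector of the smallest eigenvalue of A.
A = np.diag([3.0, 1.0, 0.5])
x_star = inertial_riemannian_gd(lambda x: 2.0 * A @ x, np.ones(3))
```

For this quadratic test problem the iterate converges toward the eigenvector associated with the smallest eigenvalue of `A`, so the final Rayleigh quotient approaches 0.5.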