In this paper, we consider a class of optimization problems with orthogonality constraints, whose feasible region is called the Stiefel manifold. Our new framework combines a function value reduction step with a correction step. Unlike existing approaches, the function value reduction step of our algorithmic framework searches along standard Euclidean descent directions rather than vectors in the tangent space of the Stiefel manifold, while the correction step further reduces the function value and at the same time guarantees a symmetric dual variable. We construct two types of algorithms based on this new framework. The first type is based on gradient reduction and includes the gradient reflection (GR) and gradient projection (GP) algorithms. The second adopts a column-wise block coordinate descent (CBCD) scheme with a novel idea for solving the corresponding CBCD subproblem inexactly. We prove that both GR/GP with a fixed stepsize and CBCD fit within our algorithmic framework, and that any cluster point of the iterates generated by the proposed framework is a first-order stationary point. Preliminary experiments illustrate the great potential of our new framework.
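To make the "Euclidean descent step plus correction" structure concrete, the following is a minimal sketch of a gradient-projection-style iteration on the Stiefel manifold. It assumes the correction is taken to be the Frobenius-nearest-point projection computed from a thin SVD; the names `project_to_stiefel`, `gradient_projection_step`, the stepsize `alpha`, and the toy objective are illustrative assumptions, not the exact GR/GP/CBCD updates or the symmetric-dual-variable correction defined in the paper.

```python
import numpy as np

def project_to_stiefel(Y):
    # Frobenius-nearest point on the Stiefel manifold {X : X^T X = I}:
    # if Y = U diag(s) V^T is a thin SVD, the projection is U V^T.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def gradient_projection_step(X, grad_f, alpha):
    # Euclidean descent step (not restricted to the tangent space),
    # followed by a correction that restores feasibility X^T X = I.
    Y = X - alpha * grad_f(X)
    return project_to_stiefel(Y)

# Toy usage: minimize f(X) = -trace(X^T A X) subject to X^T X = I,
# whose minimizers span the leading eigenvectors of the symmetric matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
A = A + A.T                       # symmetrize
grad_f = lambda X: -2.0 * A @ X   # Euclidean gradient of f
X = project_to_stiefel(rng.standard_normal((8, 3)))
for _ in range(500):
    X = gradient_projection_step(X, grad_f, alpha=0.01)
```

This sketch only illustrates the two-step structure (unconstrained descent direction, then a feasibility-restoring correction); the paper's algorithms additionally ensure a further function value reduction and a symmetric dual variable in the correction step.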