A Strengthened SDP Relaxation for Quadratic Optimization Over the Stiefel Manifold

We study semidefinite programming (SDP) relaxations for the NP-hard problem of globally optimizing a quadratic function over the Stiefel manifold. We introduce a strengthened relaxation based on two recent ideas from the literature: (i) a tailored SDP for objectives with a block-diagonal Hessian, and (ii) the use of the Kronecker matrix product to construct SDP relaxations. Using synthetic instances on … Read more
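
For orientation only (this is not the paper's strengthened relaxation), a standard Shor-type SDP lift of the quadratic Stiefel problem min tr(X^T A X B) s.t. X^T X = I_p can be sketched with cvxpy, using the Kronecker identity tr(X^T A X B) = vec(X)^T (B kron A) vec(X) for symmetric B. The matrices A, B, the trace-form objective, and the instance sizes below are illustrative assumptions.

    import numpy as np
    import cvxpy as cp

    n, p = 6, 2
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric problem data
    B = rng.standard_normal((p, p)); B = (B + B.T) / 2

    # Lift y = vec(X) (columns stacked) to Y ~ y y^T, so that
    # tr(X^T A X B) = y^T (B kron A) y becomes linear in Y.
    Y = cp.Variable((n * p, n * p), PSD=True)
    constraints = []
    for i in range(p):
        for j in range(p):
            blk = Y[i * n:(i + 1) * n, j * n:(j + 1) * n]   # block ~ x_i x_j^T
            constraints.append(cp.trace(blk) == (1.0 if i == j else 0.0))  # from X^T X = I_p
    prob = cp.Problem(cp.Minimize(cp.trace(np.kron(B, A) @ Y)), constraints)
    prob.solve()
    print("SDP lower bound on the Stiefel problem:", prob.value)

Dropping the rank-one requirement on Y is what makes this a tractable lower bound; the paper's relaxation adds further valid constraints to tighten it.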

Accelerated gradient methods on the Grassmann and Stiefel manifolds

In this paper we extend a nonconvex version of Nesterov’s accelerated gradient (AG) method to optimization over the Grassmann and Stiefel manifolds. We propose an exponential-based AG algorithm for the Grassmann manifold and a retraction-based AG algorithm that exploits the Cayley transform for both the Grassmann and Stiefel manifolds. Under some mild assumptions, we … Read more
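
For readers unfamiliar with the Cayley transform mentioned above, the minimal numpy sketch below shows how a single Cayley-transform curvilinear step keeps an iterate exactly on the Stiefel manifold. The accelerated schemes of the paper add momentum and extrapolation terms not reproduced here, and the function name cayley_step is ours.

    import numpy as np

    def cayley_step(X, G, tau):
        """One curvilinear step X(tau) that stays on St(n, p) exactly.
        X: n x p with X^T X = I;  G: Euclidean gradient of f at X;  tau: step size."""
        n = X.shape[0]
        W = G @ X.T - X @ G.T                     # skew-symmetric rotation generator
        I = np.eye(n)
        # Cayley transform of a skew matrix is orthogonal, so feasibility is preserved.
        return np.linalg.solve(I + (tau / 2) * W, (I - (tau / 2) * W) @ X)

    # quick feasibility check on a random instance
    rng = np.random.default_rng(1)
    X, _ = np.linalg.qr(rng.standard_normal((8, 3)))
    G = rng.standard_normal((8, 3))
    X_new = cayley_step(X, G, 0.1)
    print(np.linalg.norm(X_new.T @ X_new - np.eye(3)))  # ~ 1e-15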

A Constraint Dissolving Approach for Nonsmooth Optimization over the Stiefel Manifold

This paper focuses on the minimization of a possibly nonsmooth objective function over the Stiefel manifold. Existing approaches either lack efficiency or can only handle prox-friendly objective functions. We propose a constraint dissolving function named NCDF and show that it has the same first-order stationary points and local minimizers as the original problem in … Read more
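
As a hedged illustration of the constraint-dissolving idea (the exact form of NCDF is given in the paper), the related smooth constraint-dissolving literature uses the mapping A(X) = X (3 I_p - X^T X) / 2, which leaves feasible points unchanged so that composing the objective with it does not alter values on the manifold. The snippet below only verifies that fixed-point property numerically.

    import numpy as np

    def dissolve(X):
        """Constraint-dissolving style mapping: A(X) = X (3 I - X^T X) / 2.
        On the Stiefel manifold (X^T X = I) it reduces to the identity, so f(A(X))
        agrees with f(X) at feasible points while being defined on all of R^{n x p}."""
        p = X.shape[1]
        return X @ (3.0 * np.eye(p) - X.T @ X) / 2.0

    rng = np.random.default_rng(0)
    X, _ = np.linalg.qr(rng.standard_normal((10, 3)))   # a feasible point
    print(np.linalg.norm(dissolve(X) - X))              # ~ 0: A(X) = X on the manifold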

Solving Optimization Problems over the Stiefel Manifold by Smooth Exact Penalty Function

In this paper, we present a novel penalty model called ExPen for optimization over the Stiefel manifold. Unlike existing penalty functions for orthogonality constraints, ExPen is a smooth penalty that does not involve any first-order derivative of the objective function. We show that all the first-order stationary points of ExPen with a sufficiently large penalty … Read more
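
To make the flavor of such smooth penalties concrete (the precise ExPen formula and its penalty-parameter threshold are in the paper), a representative construction composes the objective with a dissolving-style substitution, as in the sketch above, and adds a quadratic infeasibility term. Everything below, including the parameter beta = 1.0 and the example objective, is an illustrative assumption.

    import numpy as np

    def smooth_penalty(f, X, beta=1.0):
        """Representative smooth penalty for X^T X = I: evaluate f at a smoothed point
        and add a quadratic measure of infeasibility. Writing the penalty down needs
        no derivative of f; gradients follow by the chain rule when f is smooth."""
        p = X.shape[1]
        Xs = X @ (3.0 * np.eye(p) - X.T @ X) / 2.0
        infeas = X.T @ X - np.eye(p)
        return f(Xs) + 0.25 * beta * np.linalg.norm(infeas, "fro") ** 2

    # example objective: the quadratic trace form tr(X^T A X)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 10)); A = (A + A.T) / 2
    f = lambda X: np.trace(X.T @ A @ X)
    X, _ = np.linalg.qr(rng.standard_normal((10, 3)))
    print(smooth_penalty(f, X), f(X))   # the two values coincide at feasible points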

Completely Positive Factorization by a Riemannian Smoothing Method

Copositive optimization is a special case of convex conic programming in which a linear function is optimized over the cone of all completely positive matrices subject to linear constraints. It provides powerful relaxations of NP-hard quadratic and combinatorial problems, but many open questions about copositive and completely positive matrices remain. In this paper, … Read more
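
As a small illustration of the object being factorized (not of the paper's Riemannian smoothing method), a matrix is completely positive when it can be written as B B^T with an entrywise nonnegative factor B; the instance below is constructed by hand.

    import numpy as np

    # A completely positive matrix admits A = B B^T with B >= 0 entrywise.
    B = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [2.0, 0.0, 1.0]])            # nonnegative factor, chosen by hand
    A = B @ B.T                                # A is therefore completely positive
    print(A)
    print(np.all(np.linalg.eigvalsh(A) >= 0))  # CP matrices are in particular PSD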

Proximal Point Algorithm on the Stiefel Manifold

In this paper, we consider the problem of minimizing a continuously differentiable function on the Stiefel manifold. To solve this problem, we develop a geodesic-free proximal point algorithm, which does not require the use of the Riemannian distance. The proposed method can be regarded as an iterative fixed-point method, which repeatedly applies a proximal operator … Read more
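
For orientation only (the paper's proximal subproblem involves the full objective, not just the manifold constraint), the geodesic-free viewpoint works with the Euclidean distance, under which the nearest point of the Stiefel manifold to a given matrix comes from its polar decomposition. The sketch below computes that projection.

    import numpy as np

    def project_stiefel(Y):
        """Closest point of St(n, p) to Y in the Frobenius norm: if Y = U S V^T (thin SVD),
        the projection is U V^T, i.e. the orthogonal factor of the polar decomposition."""
        U, _, Vt = np.linalg.svd(Y, full_matrices=False)
        return U @ Vt

    rng = np.random.default_rng(0)
    Y = rng.standard_normal((8, 3))
    X = project_stiefel(Y)
    print(np.linalg.norm(X.T @ X - np.eye(3)))   # ~ 1e-15: X is feasible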

A Penalty-free Infeasible Approach for a Class of Nonsmooth Optimization Problems over the Stiefel Manifold

Transforming optimization problems with orthogonality constraints into exact penalty models with convex compact constraints yields efficient infeasible approaches. For smooth and L21-norm regularized cases, these infeasible approaches adopt simple, orthonormalization-free updating schemes and show high efficiency in numerical experiments. However, to avoid orthonormalization while enforcing the feasibility of the final … Read more
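
One well-known convex compact set in this line of work is the convex hull of the Stiefel manifold, namely the set of n x p matrices whose singular values are all at most one. As an illustrative aside (not the paper's algorithm), the Euclidean projection onto this set only requires clipping singular values.

    import numpy as np

    def project_hull(Y):
        """Projection onto conv(St(n, p)) = {X : all singular values <= 1}:
        clip the singular values of Y at one and reassemble."""
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        return U @ np.diag(np.minimum(s, 1.0)) @ Vt

    rng = np.random.default_rng(0)
    Y = 3.0 * rng.standard_normal((8, 3))
    print(np.linalg.svd(project_hull(Y), compute_uv=False))  # all values <= 1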

Multipliers Correction Methods for Optimization Problems over the Stiefel Manifold

We propose a class of multipliers correction methods to minimize a differentiable function over the Stiefel manifold. The proposed methods combine a function value reduction step with a proximal correction step. The former searches along an arbitrary descent direction in the Euclidean space instead of a vector in the tangent space of the Stiefel … Read more
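
To make the multiplier structure behind such corrections concrete (the actual correction step is specified in the paper), first-order stationarity over the Stiefel manifold can be written as G - X sym(X^T G) = 0, where G is the Euclidean gradient and sym(X^T G) plays the role of the Lagrange multiplier for X^T X = I. The snippet below simply evaluates this residual on an eigenvalue-type example.

    import numpy as np

    def stationarity_residual(X, G):
        """Riemannian gradient on St(n, p) in the embedded metric: G - X sym(X^T G)."""
        M = X.T @ G
        return G - X @ (M + M.T) / 2.0

    # example: f(X) = tr(X^T A X); its stationary points are eigenvector bases of A
    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 10)); A = (A + A.T) / 2
    _, V = np.linalg.eigh(A)
    X = V[:, :3]                                  # eigenvectors of the 3 smallest eigenvalues
    print(np.linalg.norm(stationarity_residual(X, 2 * A @ X)))   # ~ 0 at a stationary point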

Exact Penalty Function for L21 Norm Minimization over the Stiefel Manifold

L21-norm minimization with orthogonality constraints, whose feasible region is the Stiefel manifold, has wide applications in statistics and data science. State-of-the-art approaches apply proximal gradient techniques on either the Stiefel manifold or its tangent spaces. The resulting subproblem has no closed-form solution and hence requires an iterative procedure to solve, which is … Read more
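
To see why the unconstrained case is easy, whereas the manifold-constrained subproblem noted above has no closed form, recall that the Euclidean proximal operator of the L21 norm is simple row-wise soft thresholding. The sketch below is illustrative and separate from the paper's exact-penalty construction.

    import numpy as np

    def prox_l21(X, lam):
        """prox_{lam * ||.||_{2,1}}(X): shrink each row of X toward zero by lam in 2-norm.
        ||X||_{2,1} is the sum of the Euclidean norms of the rows of X."""
        norms = np.linalg.norm(X, axis=1, keepdims=True)
        scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
        return scale * X

    rng = np.random.default_rng(0)
    X = rng.standard_normal((6, 3))
    print(prox_l21(X, 0.5))   # rows with small norm are zeroed out entirely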

Riemannian conjugate gradient methods with inverse retraction

We propose a new class of Riemannian conjugate gradient (CG) methods, in which inverse retraction is used instead of vector transport for search direction construction. In existing methods, differentiated retraction is often used for vector transport to move the previous search direction to the current tangent space. However, a different perspective is adopted here, motivated … Read more
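
To fix notation (the inverse-retraction construction itself is the paper's contribution and is not reproduced here), a generic Riemannian CG update on the Stiefel manifold forms d_new = -grad f(X_new) + beta * T(d_old), where T moves the previous direction into the new tangent space. The sketch below uses plain tangent-space projection as T, the simple baseline that an inverse-retraction-based transport would replace; the function names and beta value are illustrative.

    import numpy as np

    def proj_tangent(X, Z):
        """Orthogonal projection of Z onto the tangent space of St(n, p) at X."""
        M = X.T @ Z
        return Z - X @ (M + M.T) / 2.0

    def cg_direction(X_new, G_new, d_old, beta):
        """Generic Riemannian CG direction: negative Riemannian gradient plus the previous
        direction transported (here: projected) to the tangent space at the new iterate."""
        return -proj_tangent(X_new, G_new) + beta * proj_tangent(X_new, d_old)

    rng = np.random.default_rng(0)
    X_new, _ = np.linalg.qr(rng.standard_normal((8, 3)))
    G_new = rng.standard_normal((8, 3))
    d_old = rng.standard_normal((8, 3))
    d_new = cg_direction(X_new, G_new, d_old, beta=0.3)
    print(np.linalg.norm(proj_tangent(X_new, d_new) - d_new))  # ~ 0: d_new is tangent at X_new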