Compact Lifted Relaxations for Low-Rank Optimization

We develop tractable convex relaxations for rank-constrained quadratic optimization problems over $n \times m$ matrices, a setting where tractable relaxations are typically available only when the objective or constraints admit spectral (permutation-invariant) structure. We derive lifted semidefinite relaxations that do not require such spectral terms. Although a direct lifting introduces a large semidefinite constraint in dimension $n^2 + nm + 1$, we prove that many blocks of the moment matrix are redundant and derive an equivalent compact relaxation that involves only two semidefinite constraints, of dimensions $nm + 1$ and $n + m$, respectively. For matrix completion, basis pursuit, and reduced-rank regression problems, we exploit additional structure to obtain even more compact formulations involving semidefinite matrices of dimension at most $2\max(n,m)$. Overall, we obtain scalable semidefinite bounds for a broad class of low-rank quadratic problems.
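As a rough illustration of the scaling benefit, the following sketch compares the sizes of the positive-semidefinite blocks in the direct lifting versus the compact relaxation, using only the dimensions stated in the abstract (the function name is ours, for illustration):

```python
def lifting_dims(n: int, m: int):
    """Return PSD-block dimensions for an n x m low-rank problem.

    direct  -- the single PSD block of the direct lifting, n^2 + nm + 1
    compact -- the two PSD blocks of the compact relaxation, (nm + 1, n + m)
    """
    direct = n**2 + n * m + 1
    compact = (n * m + 1, n + m)
    return direct, compact

direct, compact = lifting_dims(100, 50)
print(direct)   # 15001
print(compact)  # (5001, 150)
```

For a $100 \times 50$ problem, the direct lifting requires a PSD matrix of order 15001, while the compact relaxation needs blocks of order only 5001 and 150, a substantial reduction for interior-point solvers whose cost grows rapidly with block size.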
