Fast First-Order Methods for Stable Principal Component Pursuit

The stable principal component pursuit (SPCP) problem is a non-smooth convex optimization problem whose solution has been shown, both in theory and in practice, to enable recovery of the low-rank and sparse components of a matrix whose entries have been corrupted by Gaussian noise. In this paper, we first show how several existing fast first-order methods can be applied to this problem very efficiently. Specifically, we show that the subproblems that arise when applying Nesterov's optimal gradient methods, alternating linearization methods, and alternating direction augmented Lagrangian methods to the SPCP problem either have closed-form solutions or can be solved with very modest effort. We then develop a new first-order algorithm, NSA, based on partial variable splitting. All but one of the methods analyzed require at least one of the non-smooth terms in the objective function to be smoothed and obtain an ε-optimal solution to the SPCP problem in O(1/ε) iterations. NSA, which works directly with the fully non-smooth objective function, is proved to be convergent under mild conditions on the sequence of parameters it uses. Our preliminary computational tests show that NSA, although its iteration complexity is not known, is the fastest of the four algorithms described and substantially outperforms ASALM, the only previously existing method for the SPCP problem. To the best of our knowledge, this is the first algorithm for the SPCP problem that has O(1/ε) iteration complexity and a per-iteration complexity equal to that of a singular value decomposition.
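For context, a standard formulation of the SPCP problem (stated here with generic symbols $D$, $L$, $S$, $\rho$, and $\delta$, which may differ from the notation used in the paper) is the constrained convex program

$$\min_{L,\,S \in \mathbb{R}^{m \times n}} \; \|L\|_{*} + \rho\,\|S\|_{1} \quad \text{subject to} \quad \|L + S - D\|_{F} \le \delta,$$

where $\|\cdot\|_{*}$ denotes the nuclear norm, $\|\cdot\|_{1}$ the entrywise $\ell_1$ norm, $D$ the observed noisy matrix, and $\delta$ a bound on the Frobenius norm of the noise. Both terms in the objective are non-smooth, which is why all but one of the methods considered smooth one of them, while NSA works with the non-smooth objective directly.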

Citation

IEOR Department, Columbia University, May 2011
