We study algorithms for robust principal component analysis (RPCA) for a partially observed data matrix. The aim is to recover the data matrix as a sum of a low-rank matrix and a sparse matrix so as to eliminate erratic noise (outliers). This problem is known to be NP-hard in general. A classical way to solve RPCA is to consider convex relaxations, e.g., to minimize the sum of the nuclear norm, to obtain a low-rank component, and the ℓ1 norm, to obtain a sparse component. This is a well-structured convex problem that can be efficiently solved by modern first-order methods. However, first-order methods often yield only low-accuracy solutions. In this paper, we propose a novel nonconvex reformulation of the original NP-hard RPCA model. The new model imposes a semidefinite cone constraint and utilizes a facial reduction technique; we are thus able to reduce the size of the problem significantly. This makes the problem amenable to efficient algorithms that obtain high-accuracy solutions. We include numerical results that confirm the efficacy of our approach.
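For concreteness, the convex relaxation referred to above is commonly written in the following form (a sketch in our own notation, not necessarily the paper's: M is the data matrix, Ω the set of observed entries, P_Ω the projection onto those entries, and λ > 0 a trade-off parameter):

\[
\min_{L,\,S}\; \|L\|_{*} + \lambda \|S\|_{1}
\quad \text{subject to} \quad
\mathcal{P}_{\Omega}(L + S) = \mathcal{P}_{\Omega}(M),
\]

where \(\|L\|_{*}\) denotes the nuclear norm (sum of singular values) promoting a low-rank component, and \(\|S\|_{1}\) denotes the entrywise ℓ1 norm promoting a sparse component.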