Global Convergence of Sub-gradient Method for Robust Matrix Recovery: Small Initialization, Noisy Measurements, and Over-parameterization

In this work, we study the performance of the sub-gradient method (SubGM) on a natural nonconvex and nonsmooth formulation of low-rank matrix recovery with $\ell_1$-loss, where the goal is to recover a low-rank matrix from a limited number of measurements, a subset of which may be grossly corrupted with noise. We study a scenario where the …
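The abstract above describes SubGM on a factorized $\ell_1$-loss formulation. A minimal numpy sketch of that idea follows; the problem sizes, corruption model, amount of over-parameterization, and step-size schedule are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# Sketch of the sub-gradient method (SubGM) on an l1-loss, factorized
# (Burer-Monteiro-style) formulation of robust matrix recovery:
#   min_{U,V} (1/m) * sum_k | <A_k, U V^T> - y_k |
# All concrete choices below (sizes, outlier fraction, schedule) are
# hypothetical illustrations.

rng = np.random.default_rng(0)
n, r, m = 10, 2, 200

# Ground-truth rank-r matrix and Gaussian measurement matrices A_k.
X_star = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
A = rng.standard_normal((m, n, n))
y = np.einsum('kij,ij->k', A, X_star)

# Grossly corrupt 10% of the measurements.
bad = rng.choice(m, size=m // 10, replace=False)
y[bad] += 50.0 * rng.standard_normal(bad.size)

# Small random initialization with one extra column (over-parameterized).
U = 0.01 * rng.standard_normal((n, r + 1))
V = 0.01 * rng.standard_normal((n, r + 1))

step = 0.02
for _ in range(3000):
    resid = np.einsum('kij,ij->k', A, U @ V.T) - y
    s = np.sign(resid)                        # sub-gradient of |.|
    G = np.einsum('k,kij->ij', s, A) / m      # sub-gradient w.r.t. X = U V^T
    U, V = U - step * (G @ V), V - step * (G.T @ U)
    step *= 0.9995                            # geometrically decaying steps

rel_err = np.linalg.norm(U @ V.T - X_star) / np.linalg.norm(X_star)
```

Informally, the sign-based sub-gradient ignores the magnitude of each residual, so gross outliers contribute no more than clean samples; this is one intuition for the robustness of the $\ell_1$ formulation.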

Implicit Regularization of Sub-Gradient Method in Robust Matrix Recovery: Don’t be Afraid of Outliers

It is well known that simple, short-sighted algorithms such as gradient descent generalize well on over-parameterized learning tasks, due to their implicit regularization. However, it is unknown whether this implicit regularization extends to robust learning tasks, where a subset of samples may be grossly corrupted with noise. In this work, …

Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence

The task of recovering a low-rank matrix from its noisy linear measurements plays a central role in computational science. Smooth formulations of the problem often exhibit an undesirable phenomenon: the condition number, classically defined, scales poorly with the dimension of the ambient space. In contrast, we here show that in a variety of concrete circumstances, …

Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, system identification, and control. This class of optimization problems is NP-hard, and a popular approach replaces the rank function with the nuclear norm of the …
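The nuclear-norm relaxation mentioned above can be sketched in a few lines of numpy: replace $\mathrm{rank}(X)$ by $\|X\|_*$ and solve the regularized least-squares surrogate $\min_X \tfrac12\|\mathcal{A}(X)-b\|^2 + \lambda\|X\|_*$ by proximal gradient descent, whose prox step is singular-value soft-thresholding. The sizes, $\lambda$, and iteration count are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Prox of tau * ||.||_*: soft-threshold the singular values of M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

rng = np.random.default_rng(1)
n, r, m = 12, 2, 130                       # m < n^2: underdetermined system

# Ground-truth rank-r matrix and a Gaussian measurement operator on vec(X).
X_star = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
A = rng.standard_normal((m, n * n)) / np.sqrt(m)
b = A @ X_star.ravel()

step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L for the smooth term
lam = 0.1                                  # illustrative regularization weight
X = np.zeros((n, n))
for _ in range(1000):
    grad = (A.T @ (A @ X.ravel() - b)).reshape(n, n)
    X = svt(X - step * grad, step * lam)   # gradient step, then SVT prox

rel_err = np.linalg.norm(X - X_star) / np.linalg.norm(X_star)
```

Because the prox step zeroes out small singular values exactly, the iterates are genuinely low-rank, which is the convex surrogate's counterpart to the rank constraint.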

Exact Low-rank Matrix Recovery via Nonconvex Mp-Minimization

Low-rank matrix recovery (LMR) arises in many fields such as signal and image processing, statistics, computer vision, system identification, and control, and it is NP-hard. It is known that under certain restricted isometry property (RIP) conditions, the exact low-rank matrix solution can be obtained by solving the convex relaxation, nuclear norm minimization. In …

New Bounds for Restricted Isometry Constants in Low-rank Matrix Recovery

In this paper, we establish new bounds for restricted isometry constants (RIC) in low-rank matrix recovery. Let $\mathcal{A}$ be a linear transformation from $\mathbb{R}^{m \times n}$ into $\mathbb{R}^p$, and let $r$ be the rank of the recovered matrix $X \in \mathbb{R}^{m \times n}$. Our main result is that if the RIC satisfies $\delta_{2r+k}+2(\frac{r}{k})^{1/2}\delta_{\max\{r+\frac{3}{2}k,\,2k\}}$ …