On Low Rank Matrix Approximations with Applications to Synthesis Problem in Compressed Sensing

We consider the synthesis problem of Compressed Sensing: given $s$ and an $M\times n$ matrix $A$, extract from $A$ an $m\times n$ submatrix $A_m$, with $m$ as small as possible, which is $s$-good, that is, every signal $x$ with at most $s$ nonzero entries can be recovered from the observation $A_m x$ by $\ell_1$ minimization: $x = \argmin_u \{\|u\|_1 : A_m u = A_m x \}$. We show that under reasonable assumptions the synthesis problem can be reformulated as the problem of entry-wise approximation, within a given accuracy, of the $n \times n$ matrix $W = Y^T A$, with $Y \in R^{M\times n}$ given, by a matrix of the form $Y_m^T A_m$, with $A_m$ comprised of $m$ rows of $A$. We propose randomized algorithms for efficiently solving the latter problem with the accuracy guarantee $E\{\|W-Y_m^T A_m\|_\infty\} \leq O(1)\,L(Y;A)\sqrt{\frac{\ln n}{m}}$. Here $L(Y;A)$ is an easy-to-specify quantity which in good cases is a moderate absolute constant (e.g., $L(A;A) = 1$ when $A$ is the Hadamard matrix, and similarly for the matrix of the Fourier transform on any finite Abelian group). We also supply derandomized versions of the approximation algorithms which do not require random sampling of matrices and attain the same accuracy bounds. We further demonstrate that, in terms of approximation accuracy, our algorithms are optimal up to factors logarithmic in $n$. Finally, we provide preliminary numerical results on the performance of our algorithms for the synthesis problem.
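To make the flavor of the randomized approximation concrete, the following minimal Python sketch builds a generic importance-sampling estimator of $W = Y^T A$ from $m$ sampled rows. It is not the paper's algorithm: the sampling distribution (proportional to products of row-wise sup-norms) and the reweighting used below are illustrative assumptions, and the paper's estimators and the quantity $L(Y;A)$ may be defined differently.

```python
import numpy as np

def sampled_product_estimate(Y, A, m, rng=None):
    """Unbiased randomized estimate of W = Y^T A from m sampled rows.

    Row indices are drawn i.i.d. with probabilities p_i proportional to
    ||y_i||_inf * ||a_i||_inf (an illustrative, hypothetical choice), and
    each sampled outer product y_i a_i^T is reweighted by 1/(m * p_i),
    so that E[W_hat] = sum_i y_i a_i^T = Y^T A.
    """
    rng = np.random.default_rng(rng)
    M = A.shape[0]
    weights = np.abs(Y).max(axis=1) * np.abs(A).max(axis=1)
    p = weights / weights.sum()
    idx = rng.choice(M, size=m, p=p)
    # (1/m) * sum_k y_{i_k} a_{i_k}^T / p_{i_k}, written as a matrix product
    W_hat = (Y[idx].T / (m * p[idx])) @ A[idx]
    return W_hat, idx

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M, n, m = 1024, 64, 200
    A = rng.standard_normal((M, n)) / np.sqrt(M)
    Y = A.copy()                      # e.g. Y = A, as in the L(A;A) example
    W = Y.T @ A
    W_hat, _ = sampled_product_estimate(Y, A, m, rng=rng)
    print("entry-wise error:", np.abs(W - W_hat).max())
```

For $Y = A$ the script reports the entry-wise error $\|W - \widehat{W}\|_\infty$; the bound quoted in the abstract suggests that, for well-behaved $A$, estimators of this kind achieve accuracy on the order of $\sqrt{\ln(n)/m}$.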
