Many problems in signal processing and statistical inference involve finding sparse solutions to under-determined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared $\ell_2$) error term combined with a sparseness-inducing ($\ell_1$) regularization term. {\it Basis pursuit}, the {\it least absolute shrinkage and selection operator} (LASSO), wavelet-based deconvolution, and {\it compressed sensing} are a few well-known examples of this approach. This paper proposes gradient projection (GP) algorithms for the bound-constrained quadratic programming (BCQP) formulation of these problems. We test variants of this approach that select the line search parameters in different ways, including techniques based on the Barzilai-Borwein method. Computational experiments show that these GP approaches perform well in a wide range of applications, often being significantly faster (in terms of computation time) than competing methods. Although the performance of GP methods tends to degrade as the regularization term is de-emphasized, we show how they can be embedded in a continuation scheme to recover their efficient practical performance.
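As a brief sketch of the formulation referred to above (assuming the standard notation: $y$ the observed data, $A$ the system matrix, $\tau > 0$ the regularization weight, and the split $x = u - v$ with $u, v \geq 0$), the $\ell_2$-$\ell_1$ objective and its BCQP reformulation can be written as
\[
  \min_{x} \; \tfrac{1}{2}\|y - Ax\|_2^2 + \tau \|x\|_1
  \quad\Longleftrightarrow\quad
  \min_{u \geq 0,\; v \geq 0} \; \tfrac{1}{2}\|y - A(u - v)\|_2^2 + \tau\, \mathbf{1}^{T} u + \tau\, \mathbf{1}^{T} v ,
\]
where the gradient projection iterations operate on the latter problem, projecting onto the nonnegative orthant after each step.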
Citation
Technical Report, University of Wisconsin, September 2007. (Revised version of a technical report of February 2007.)