We propose a novel steplength selection rule in proximal gradient methods for minimizing the sum of a differentiable function and an ℓ1-norm penalty term. The proposed rule modifies one of the classical Barzilai-Borwein steplengths, extending analogous results obtained in the context of gradient projection methods for constrained optimization.
We analyze the spectral properties of the Barzilai-Borwein-like steplength when the differentiable part is quadratic, showing that its reciprocal lies in the spectrum of the submatrix of the Hessian associated with both the nonzero and the nonoptimal zero components of the current iterate. This allows for acceleration effects once the optimal zero components start to be identified.
Furthermore, we embed the modified rule in a proximal gradient method with a nonmonotone linesearch, for which we prove global convergence to a stationary point. Numerical experiments show the ability of the proposed rule to sweep the spectrum of the reduced Hessian on a set of quadratic ℓ1-regularized problems, as well as its effectiveness in recovering the ground truth in a regularized least squares problem arising in image restoration.
Barzilai-Borwein-like rules in proximal gradient schemes for ℓ1-regularized problems
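To make the setting concrete, the following is a minimal sketch of a proximal gradient (ISTA-style) iteration for the quadratic ℓ1-regularized problem min 0.5‖Ax − b‖² + λ‖x‖₁, using the classical BB1 steplength. This is an illustrative baseline only, not the modified rule or the nonmonotone linesearch proposed in the paper; the step clipping bounds and the stopping tolerance are assumptions made for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_bb(A, b, lam, x0, max_iter=200, tol=1e-8):
    """Proximal gradient for 0.5*||Ax - b||^2 + lam*||x||_1 with a
    classical BB1 steplength (NOT the paper's modified rule)."""
    x = x0.copy()
    g = A.T @ (A @ x - b)                       # gradient of the smooth part
    alpha = 1.0 / np.linalg.norm(A.T @ A, 2)    # safe initial step: 1/L
    for _ in range(max_iter):
        x_new = soft_threshold(x - alpha * g, alpha * lam)
        if np.linalg.norm(x_new - x) < tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:
            # BB1 step: reciprocal is a Rayleigh quotient of the Hessian A^T A
            alpha = min(max((s @ s) / sy, 1e-10), 1e10)  # clip to a safe range
        x, g = x_new, g_new
    return x

# Usage on a small synthetic sparse recovery problem
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = prox_grad_bb(A, b, lam=0.1, x0=np.zeros(20))
```

Note that plain BB steps make the objective nonmonotone along the iterates, which is precisely why the full method in the paper pairs the steplength with a nonmonotone linesearch to guarantee global convergence.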