Alternating Iteratively Reweighted \(\ell_1\) and Subspace Newton Algorithms for Nonconvex Sparse Optimization


This paper presents a novel hybrid algorithm for minimizing the sum of a continuously differentiable loss function and a nonsmooth, possibly nonconvex, sparsity-promoting regularizer. The proposed method adaptively switches between solving a reweighted \(\ell_1\)-regularized subproblem and performing an inexact subspace Newton step. The reweighted \(\ell_1\) subproblem admits an efficient closed-form solution via the soft-thresholding operator, thereby avoiding the computational overhead of evaluating the proximity operator of the nonconvex regularizer. As the algorithm approaches an optimal solution, it identifies a smooth active manifold and guarantees that the nonzero components remain uniformly bounded away from zero. On this manifold, the algorithm transitions to a perturbed regularized Newton step, accelerating local convergence. We establish global convergence to a critical point. Under the Kurdyka-\L{}ojasiewicz property, the algorithm converges locally at a linear rate; when the subspace Newton subproblem is solved exactly, the local convergence rate improves to quadratic. Numerical experiments show that the proposed algorithm outperforms existing methods in both efficiency and solution quality across various model prediction problems.
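The two building blocks described above can be illustrated concretely. The following is a minimal sketch, not the paper's implementation: a weighted soft-thresholding step that solves a standard quadratic surrogate of the reweighted \(\ell_1\) subproblem in closed form, and a damped Newton step restricted to the current support (the identified active manifold). The function names, the surrogate model, and the damping parameter are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, tau):
    # Componentwise soft-thresholding: the proximity operator of a weighted l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def reweighted_l1_step(x, grad_f, weights, alpha):
    # Closed-form minimizer of the separable surrogate (an illustrative choice)
    #   <grad_f, y - x> + (1/(2*alpha)) * ||y - x||^2 + sum_i weights[i] * |y_i|,
    # obtained by soft-thresholding the forward gradient step.
    return soft_threshold(x - alpha * grad_f, alpha * weights)

def subspace_newton_step(x, grad_f, hess_f, damping=1e-8):
    # Regularized Newton step restricted to the current support of x,
    # i.e. the smooth active manifold identified by the reweighted l1 stage.
    support = np.flatnonzero(x)
    if support.size == 0:
        return x
    H = hess_f[np.ix_(support, support)] + damping * np.eye(support.size)
    d = np.zeros_like(x)
    d[support] = np.linalg.solve(H, -grad_f[support])
    return x + d
```

In this reading, the reweighted \(\ell_1\) stage drives iterates toward the correct support, and the subspace Newton stage refines the surviving coordinates; the alternation between the two stages, the choice of weights, and the inexactness criteria are governed by the rules developed in the paper.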
