A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer

We consider the problem of minimizing an objective function that is the sum of a convex function and a group sparsity-inducing regularizer. Problems that integrate such regularizers arise in modern machine learning applications, often for the purpose of obtaining models that are easier to interpret and that have higher predictive accuracy. We present a new method for solving such problems that utilizes subspace acceleration, domain decomposition, and support identification. Our analysis shows, under common assumptions, that the iterate sequence generated by our framework is globally convergent, converges to an $\epsilon$-approximate solution in at most $O(\epsilon^{-(1+p)})$ (respectively, $O(\epsilon^{-(2+p)})$) iterations for all $\epsilon$ bounded above and large enough (respectively, all $\epsilon$ bounded above), where $p > 0$ is an algorithm parameter, and exhibits superlinear local convergence. Preliminary numerical results for the task of binary classification based on regularized logistic regression show that our approach is efficient and robust, with the ability to outperform a state-of-the-art method.
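To make the problem class concrete: a standard group sparsity-inducing regularizer is the group $\ell_2$ penalty $r(x) = \lambda \sum_g \|x_g\|_2$, whose proximal operator applies block soft-thresholding and zeroes out entire groups at once. The sketch below is an illustrative example of this operator only, not the paper's method; the function name, the group encoding (lists of index lists), and the test vector are all assumptions for illustration.

```python
import math

def prox_group_l2(v, groups, lam):
    """Block soft-thresholding: the proximal operator of
    r(x) = lam * sum over groups g of ||x_g||_2.
    Each group is scaled by max(0, 1 - lam / ||v_g||); a group whose
    norm is below lam is set to zero entirely (group sparsity).
    NOTE: illustrative sketch only; not the method of the report.
    """
    out = [0.0] * len(v)
    for g in groups:
        norm = math.sqrt(sum(v[i] ** 2 for i in g))
        scale = max(0.0, 1.0 - lam / norm) if norm > 0.0 else 0.0
        for i in g:
            out[i] = scale * v[i]
    return out

# The first group (norm 5) survives but shrinks; the second group
# (norm ~0.14 < lam) is zeroed out as a whole.
v = [3.0, 4.0, 0.1, 0.1]
groups = [[0, 1], [2, 3]]
print(prox_group_l2(v, groups, 1.0))
```

A proximal-gradient method would alternate a gradient step on the smooth convex term (e.g., the logistic loss) with this operator; the zero pattern it produces is what support-identification schemes such as the one in this report aim to detect and exploit.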

Citation

Lehigh University, ISE Technical Report 20T-015
