A Family of Second-Order Methods for Convex L1-Regularized Optimization

This paper is concerned with minimizing an objective that is the sum of a convex function $f$ and an $\ell_1$ regularization term. Our interest is in methods that incorporate second-order information about $f$ to accelerate convergence. We describe a semi-smooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based methods, and a second-order iterative soft-thresholding method. We also propose a new active-set method that performs multiple changes in the active manifold estimate and incorporates a novel mechanism for correcting these estimates when needed. This corrective mechanism is also evaluated in an orthant-based method. Numerical tests comparing the performance of several second-order methods are presented.
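To illustrate the flavor of a second-order iterative soft-thresholding method, the sketch below applies soft-thresholding steps scaled by the diagonal of the Hessian for the common least-squares instance $f(x) = \tfrac{1}{2}\|Ax - b\|^2$. This is a generic illustration under our own assumptions, not the paper's algorithm: the function names, the diagonal curvature model, and the backtracking safeguard are all choices made here for the sake of a runnable example.

```python
import numpy as np

def soft_threshold(z, tau):
    """Componentwise soft-thresholding: the proximal operator of
    tau * ||.||_1 (tau may be a vector, giving a diagonal metric)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def scaled_ista(A, b, lam, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with soft-thresholding
    steps scaled by diag(A^T A), a cheap second-order model of f,
    safeguarded by backtracking so each step decreases the objective.
    A hypothetical sketch, not the paper's method."""
    def F(x):
        r = A @ x - b
        return 0.5 * r @ r + lam * np.abs(x).sum()

    d = (A * A).sum(axis=0) + 1e-12    # diag(A^T A), kept positive
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)          # gradient of the smooth part f
        t = 1.0
        while True:                    # backtracking on the step length
            x_new = soft_threshold(x - t * g / d, t * lam / d)
            if F(x_new) <= F(x) or t < 1e-10:
                break
            t *= 0.5
        x = x_new
    return x
```

Because the scaling matrix is diagonal, the scaled proximal subproblem still has a closed-form soft-thresholding solution; the semi-smooth Newton methods described in the paper use richer curvature information than this diagonal model.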

Citation

Unpublished: Tech. Report, Optimization Center, Northwestern University, June 2012
