In this paper a robust second-order method is developed for the solution of strongly convex l1-regularized problems. The main aim is to make the proposed method as inexpensive as possible, while ensuring that even difficult problems can be solved efficiently. The proposed method is a primal-dual Newton Conjugate Gradients (pdNCG) method. Convergence properties of pdNCG are studied and a worst-case iteration complexity is established. Numerical results are presented on a synthetic sparse least-squares problem and on two real-world machine learning problems.
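For readers unfamiliar with the problem class, the sketch below sets up an l1-regularized least-squares instance of the form min_x 0.5*||Ax - b||^2 + c*||x||_1 and solves a smoothed surrogate with a generic Newton Conjugate Gradients routine. This is only an illustrative sketch, not a reproduction of the pdNCG algorithm: the pseudo-Huber smoothing, the parameter values (c, mu), the problem sizes, and the use of SciPy's Newton-CG solver are all assumptions made for the example.

```python
# A minimal sketch (not the authors' pdNCG implementation): an l1-regularized
# least-squares problem, min_x 0.5*||A x - b||^2 + c*||x||_1, with the l1 term
# replaced by a pseudo-Huber smoothing and solved by SciPy's generic Newton-CG.
# The smoothing parameter mu, the weight c, and the problem sizes are assumed.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n, k = 200, 400, 20          # rows, columns, nonzeros in the true signal
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true + 0.01 * rng.standard_normal(m)

c = 0.05        # l1 regularization weight (assumed value)
mu = 1e-3       # pseudo-Huber smoothing parameter (assumed value)

def objective(x):
    # 0.5*||Ax - b||^2 + c * sum_i (sqrt(mu^2 + x_i^2) - mu)
    r = A @ x - b
    return 0.5 * r @ r + c * np.sum(np.sqrt(mu**2 + x**2) - mu)

def gradient(x):
    return A.T @ (A @ x - b) + c * x / np.sqrt(mu**2 + x**2)

def hess_vec(x, p):
    # Hessian-vector product: A^T A p plus the diagonal term of the smoothing.
    d = mu**2 / (mu**2 + x**2) ** 1.5
    return A.T @ (A @ p) + c * d * p

res = minimize(objective, np.zeros(n), jac=gradient, hessp=hess_vec,
               method="Newton-CG", options={"maxiter": 200, "xtol": 1e-8})
print("final objective:", res.fun)
print("recovered nonzeros (|x_i| > 1e-2):", int(np.sum(np.abs(res.x) > 1e-2)))
```

The Hessian is only ever accessed through matrix-vector products, which is the key property that makes conjugate-gradient-based Newton methods inexpensive for large sparse problems.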
Citation: Technical Report ERGO-13-011