Recently, several methods have been proposed for sparse optimization that make careful use of second-order information [11, 30, 17, 3] to improve local convergence rates. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation using a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here we propose a general framework which includes slightly modified versions of existing algorithms as well as a new algorithm based on limited-memory BFGS Hessian approximations, and we provide a global convergence rate analysis in the spirit of proximal gradient methods, including an analysis of the method based on coordinate descent.
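To make the framework described above concrete, the following is a minimal sketch, not the paper's actual algorithm: it uses a dense BFGS update in place of limited-memory BFGS, a fixed number of coordinate-descent sweeps on the quadratic model, and a simple backtracking line search on the composite objective f(x) + lam*||x||_1. All function names, parameter values, and the sufficient-decrease rule are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Scalar soft-thresholding: argmin_u 0.5*a*(u - v)^2 + t*a*|u| up to scaling."""
    return np.sign(v) * max(abs(v) - t, 0.0)

def cd_subproblem(x, grad, B, lam, sweeps=10):
    """Approximately minimize the quadratic model
       q(d) = grad^T d + 0.5 d^T B d + lam*||x + d||_1
    by cyclic coordinate descent; each coordinate update is a closed-form soft-threshold."""
    n = x.size
    d = np.zeros(n)
    Bd = np.zeros(n)                      # maintain B @ d incrementally
    for _ in range(sweeps):
        for j in range(n):
            a = B[j, j]
            b = grad[j] + Bd[j] - a * d[j]   # linear term in d_j with other coordinates fixed
            u = soft_threshold(x[j] - b / a, lam / a)
            d_new = u - x[j]
            Bd += B[:, j] * (d_new - d[j])
            d[j] = d_new
    return d

def prox_quasi_newton(f, grad_f, x0, lam, max_iter=100, eta=0.5, sigma=1e-4):
    """Sketch of a proximal quasi-Newton loop for min_x f(x) + lam*||x||_1:
    build a quadratic model with a BFGS Hessian approximation, solve it
    approximately by coordinate descent, then backtrack to ensure sufficient descent."""
    x = x0.copy()
    B = np.eye(x.size)                    # Hessian approximation (dense BFGS stand-in for L-BFGS)
    g = grad_f(x)
    for _ in range(max_iter):
        d = cd_subproblem(x, g, B, lam)
        # Backtracking line search on F(x) = f(x) + lam*||x||_1;
        # the sufficient-decrease measure below is one common choice, an assumption here.
        F_x = f(x) + lam * np.linalg.norm(x, 1)
        decrease = g @ d + lam * (np.linalg.norm(x + d, 1) - np.linalg.norm(x, 1))
        t = 1.0
        while f(x + t * d) + lam * np.linalg.norm(x + t * d, 1) > F_x + sigma * t * decrease:
            t *= eta
            if t < 1e-10:
                break
        x_new = x + t * d
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-10:                 # BFGS update, skipped if curvature condition fails
            Bs = B @ s
            B += np.outer(y, y) / (y @ s) - np.outer(Bs, Bs) / (s @ Bs)
        x, g = x_new, g_new
    return x

# Illustrative usage on a LASSO-type problem: min 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b = rng.standard_normal((40, 20)), rng.standard_normal(40)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
x_star = prox_quasi_newton(f, grad_f, np.zeros(20), lam=0.1)
```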
Citation: COR@L Technical Report 13T-02-R1, ISE Department, Lehigh.