Global non-asymptotic super-linear convergence rates of regularized proximal quasi-Newton methods on non-smooth composite problems

In this paper, we propose two regularized proximal quasi-Newton methods with a symmetric rank-1 (SR1) update of the metric to solve non-smooth convex additive composite problems. Both algorithms avoid line search and trust-region strategies. For each of them, we prove a super-linear convergence rate that is independent of the initialization of the algorithm. The cubic regularized method achieves a rate of order $\left(\frac{C}{N^{1/2}}\right)^{N/2}$, where $N$ is the number of iterations and $C$ is some constant, and the gradient regularized method achieves a rate of order $\left(\frac{C}{N^{1/4}}\right)^{N/2}$. To the best of our knowledge, these are the first global non-asymptotic super-linear convergence rates for regularized quasi-Newton methods and regularized proximal quasi-Newton methods. The theoretical properties are also demonstrated on two applications from machine learning.
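
For context, the additive composite setting and the generic form of a regularized proximal quasi-Newton step are sketched below; the symbols $f$, $g$, $B_k$, $\rho_k$ and $M_k$ are generic placeholders, and the precise regularizers and parameter choices are those specified in the paper rather than this sketch.

$$\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + g(x), \qquad f \text{ smooth convex}, \quad g \text{ convex, possibly non-smooth},$$

$$x_{k+1} \in \operatorname*{arg\,min}_{x} \; \langle \nabla f(x_k), x - x_k \rangle + \tfrac{1}{2}\|x - x_k\|_{B_k}^2 + \tfrac{\rho_k}{2}\|x - x_k\|^2 + g(x) \quad \text{(gradient regularized)},$$

$$x_{k+1} \in \operatorname*{arg\,min}_{x} \; \langle \nabla f(x_k), x - x_k \rangle + \tfrac{1}{2}\|x - x_k\|_{B_k}^2 + \tfrac{M_k}{6}\|x - x_k\|^3 + g(x) \quad \text{(cubic regularized)},$$

where $B_k$ is the SR1 quasi-Newton metric; in the gradient regularized variant, $\rho_k$ is driven by a (prox-)gradient-type residual at $x_k$, while $M_k$ controls the cubic term in the other variant.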

Citation

https://arxiv.org/abs/2410.11676
