Finite convergence of the inexact proximal gradient method to sharp minima

Attractive properties of subgradient methods, such as robust stability and linear convergence, have been emphasized when these methods are used to solve nonsmooth optimization problems with sharp minima [12, 13]. In this letter we extend these robustness results to composite convex models and show that the basic proximal gradient algorithm, in the presence of sufficiently low noise, still converges in finite time, even when the noise is persistent.
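
The setting is the inexact proximal gradient iteration x_{k+1} = prox_{t g}(x_k - t(∇f(x_k) + e_k)) applied to a composite objective F(x) = f(x) + g(x), where e_k is a persistent gradient error. Below is a minimal sketch of that iteration in Python, assuming a lasso-type instance with f(x) = ½‖Ax − b‖² and g(x) = λ‖x‖₁; the function names, the specific choice of f and g, and the Gaussian noise model are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inexact_proximal_gradient(A, b, lam, t, noise_level, iters=200, seed=0):
    """Sketch of x_{k+1} = prox_{t g}(x_k - t (grad f(x_k) + e_k)),
    with f(x) = 0.5 ||Ax - b||^2, g(x) = lam * ||x||_1, and a persistent
    bounded gradient error e_k (here drawn as Gaussian noise for illustration).
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                        # exact gradient of the smooth part
        e = noise_level * rng.standard_normal(x.shape)  # persistent error in the gradient
        x = soft_threshold(x - t * (grad + e), t * lam) # inexact proximal gradient step
    return x
```

Under the letter's assumptions (a sharp minimum and a sufficiently small noise bound), iterates of this form reach the minimizer exactly after finitely many steps rather than merely converging asymptotically.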
