Error bounds, quadratic growth, and linear convergence of proximal methods

We show that the error bound property, postulating that the step lengths of the proximal gradient method linearly bound the distance to the solution set, is equivalent to a standard quadratic growth condition. We exploit this equivalence in an analysis of asymptotic linear convergence of the proximal gradient algorithm for structured problems that lack strong convexity. More generally, we analyze local linear convergence guarantees of a proximal method for minimizing compositions of convex functions with smooth mappings.
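To make the objects in the abstract concrete, here is a minimal sketch of the proximal gradient method on an assumed example problem, the lasso (least squares plus an l1 penalty), which is a standard structured problem without strong convexity. The problem data, step size choice, and iteration count are illustrative assumptions, not taken from the paper; the quantity `||x_{k+1} - x_k||` recorded below is the "step length" that the error bound property relates to the distance from the solution set.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by the proximal
    # gradient method: a gradient step on the smooth part followed
    # by the prox of the nonsmooth part.
    x = np.zeros(A.shape[1])
    step_lengths = []          # ||x_{k+1} - x_k||, the "step lengths"
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x_new = prox_l1(x - step * grad, step * lam)
        step_lengths.append(np.linalg.norm(x_new - x))
        x = x_new
    return x, step_lengths
```

Under an error bound (equivalently, quadratic growth on the solution set), the recorded step lengths shrink at a linear rate, mirroring the asymptotic linear convergence analyzed in the paper.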
