On the convergence rate of the Douglas-Rachford splitting algorithm

This work is concerned with the convergence rate analysis of the Douglas–Rachford splitting (DRS) method for finding a zero of the sum of two maximally monotone operators. We obtain an exact rate of convergence for the DRS algorithm and demonstrate its sharpness in the setting of convex feasibility problems. Furthermore, we investigate the … Read more
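
As a concrete illustration of the setting studied above, the sketch below runs the standard DRS iteration on a convex feasibility problem, where the two operators are normal cones and their resolvents reduce to projections. The specific sets (a ball and a halfspace) and the iteration count are illustrative assumptions, not examples taken from the paper.

```python
import numpy as np

def proj_ball(x, center, radius):
    """Euclidean projection onto the ball {x : ||x - center|| <= radius}."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def proj_halfspace(x, a, b):
    """Euclidean projection onto the halfspace {x : <a, x> <= b}."""
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def douglas_rachford_feasibility(z0, proj_A, proj_B, iters=200):
    """Standard DRS iteration z_{k+1} = z_k + P_B(2 P_A(z_k) - z_k) - P_A(z_k).
    The shadow sequence x_k = P_A(z_k) converges to a point of A ∩ B
    whenever the intersection is nonempty."""
    z = z0.copy()
    for _ in range(iters):
        x = proj_A(z)
        y = proj_B(2 * x - z)
        z = z + y - x
    return proj_A(z)

# Illustrative instance: intersect the unit ball with {x : x1 + x2 <= -1}.
a, b = np.array([1.0, 1.0]), -1.0
x_star = douglas_rachford_feasibility(
    np.array([3.0, 4.0]),
    lambda z: proj_ball(z, np.zeros(2), 1.0),
    lambda z: proj_halfspace(z, a, b),
)
print(x_star, a @ x_star <= b + 1e-8, np.linalg.norm(x_star) <= 1 + 1e-8)
```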

Accelerated Gradient Descent via Long Steps

Recently, Grimmer [1] showed that for smooth convex optimization, periodically taking longer steps improves gradient descent’s state-of-the-art O(1/T) convergence guarantee by constant factors, and conjectured that an accelerated rate strictly faster than O(1/T) might be possible. Here we prove such a big-O gain, establishing the first accelerated convergence rate for gradient descent in this setting. Namely, we … Read more
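
For intuition, here is a minimal gradient descent loop that mostly takes short steps of length 1/L but occasionally takes a longer one. The period and long-step factor below are placeholder values chosen purely for illustration; they are not the certified stepsize pattern analyzed in the paper.

```python
import numpy as np

def grad_descent_long_steps(grad, x0, L, iters=100, period=8, long_factor=6.0):
    """Gradient descent with mostly short steps 1/L and a periodic longer step.
    'period' and 'long_factor' are illustrative placeholders, not the
    stepsize schedule from the paper."""
    x = x0.copy()
    for k in range(iters):
        step = (long_factor / L) if (k + 1) % period == 0 else 1.0 / L
        x = x - step * grad(x)
    return x

# Toy smooth convex objective f(x) = 0.5 x^T Q x with L = lambda_max(Q).
Q = np.diag([1.0, 10.0, 100.0])
grad = lambda x: Q @ x
x = grad_descent_long_steps(grad, np.array([1.0, 1.0, 1.0]), L=100.0)
print(0.5 * x @ Q @ x)  # objective value after the run
```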

The exact worst-case convergence rate of the alternating direction method of multipliers

Recently, semidefinite programming performance estimation has emerged as a powerful tool for the worst-case analysis of first-order methods. In this paper, we derive new non-ergodic convergence rates for the alternating direction method of multipliers (ADMM) by using performance estimation. We give examples showing that the derived bounds are exact. We … Read more
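
For reference, the sketch below runs a textbook scaled-form ADMM on a lasso instance, the kind of iteration whose worst case is being bounded; the objective, penalty parameter, and iteration count are illustrative assumptions, not examples from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Scaled-form ADMM for min 0.5||Ax - b||^2 + lam*||z||_1 s.t. x = z.
    A standard textbook instance of ADMM; rho and iters are illustrative."""
    n = A.shape[1]
    x = z = u = np.zeros(n)
    # Prefactor the x-update system (A^T A + rho I) x = A^T b + rho (z - u).
    M = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))   # x-update
        z = soft_threshold(x + u, lam / rho)          # z-update (prox of l1)
        u = u + x - z                                 # scaled dual update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(admm_lasso(A, b, lam=1.0), 2))
```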

Exact Worst-case Performance of First-order Methods for Composite Convex Optimization

We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps. We simultaneously obtain tight worst-case guarantees and explicit instances of optimization problems on which the algorithm reaches this … Read more
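
As an example of a method in the class covered by such a framework, the sketch below implements the projected gradient method, i.e. the composite case in which the nonsmooth term is the indicator of a convex set; the least-squares instance and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_gradient(grad_f, proj_C, x0, L, iters=500):
    """Projected gradient method, one oracle-based first-order scheme of the
    kind analyzed: x_{k+1} = P_C(x_k - (1/L) grad_f(x_k))."""
    x = x0.copy()
    for _ in range(iters):
        x = proj_C(x - grad_f(x) / L)
    return x

# Illustrative instance (not from the paper): nonnegative least squares,
# f(x) = 0.5||Ax - b||^2 with C the nonnegative orthant.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
grad_f = lambda x: A.T @ (A @ x - b)
proj_C = lambda x: np.maximum(x, 0.0)  # projection onto x >= 0
x = projected_gradient(grad_f, proj_C, np.zeros(10), L)
print(np.round(x, 3), 0.5 * np.linalg.norm(A @ x - b) ** 2)
```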