Exact Worst-case Performance of First-order Methods for Composite Convex Optimization

We provide a framework for computing the exact worst-case performance of any algorithm belonging to a broad class of oracle-based first-order methods for composite convex optimization, including those performing explicit, projected, proximal, conditional and inexact (sub)gradient steps. We simultaneously obtain tight worst-case guarantees and explicit instances of optimization problems on which the algorithm reaches this …
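For concreteness, here is a minimal sketch (not the paper's performance-estimation framework) of one method in the analyzed class: a proximal gradient iteration for a composite objective f(x) + g(x), with f a least-squares term and g an l1 penalty. The problem data, regularization weight lam, and step size are illustrative assumptions.

```python
# A minimal sketch: proximal gradient (ISTA) for F(x) = f(x) + g(x),
# with f(x) = 0.5*||Ax - b||^2 (smooth) and g(x) = lam*||x||_1 (nonsmooth).
# Data, lam, and step size are illustrative assumptions.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of grad f
    t = 1.0 / L                                    # explicit gradient step size
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                   # oracle call: gradient of f
        x = soft_threshold(x - t * grad, t * lam)  # proximal step on g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    b = rng.standard_normal(40)
    x = proximal_gradient(A, b, lam=0.1)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.1 * np.abs(x).sum())
```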

Tight global linear convergence rate bounds for operator splitting methods

In this paper we establish necessary and sufficient conditions for linear convergence of operator splitting methods for a general class of convex optimization problems where the associated fixed-point operator is averaged. We also provide a tight bound on the achievable convergence rate. Most existing results establishing linear convergence in such methods require restrictive assumptions regarding …
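As an illustration of the setting, a minimal sketch (not the paper's analysis): Douglas-Rachford splitting written as a relaxed fixed-point iteration on its averaged operator, applied to a least-squares-plus-l1 problem. The data, step size t, and relaxation parameter relax are illustrative assumptions; the printed fixed-point residuals only show the typical linear-looking decay when the smooth term is strongly convex.

```python
# A minimal sketch: Douglas-Rachford splitting for
#   min_x 0.5*||Ax - b||^2 + lam*||x||_1
# run as a relaxed (averaged) fixed-point iteration on the DR operator.
# Data, t, and relax are illustrative assumptions.
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def douglas_rachford(A, b, lam, t=1.0, relax=1.0, iters=200):
    n = A.shape[1]
    M = np.linalg.inv(np.eye(n) + t * (A.T @ A))   # prox of t*f is linear here
    Atb = A.T @ b
    z = np.zeros(n)
    for k in range(iters):
        x = M @ (z + t * Atb)                      # x = prox_{t f}(z)
        y = soft_threshold(2 * x - z, t * lam)     # y = prox_{t g}(2x - z)
        z = z + relax * (y - x)                    # relaxed fixed-point update
        if k % 25 == 0:
            print(f"iter {k:4d}  fixed-point residual {np.linalg.norm(y - x):.3e}")
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((80, 30))              # full column rank: f strongly convex
    b = rng.standard_normal(80)
    douglas_rachford(A, b, lam=0.1)
```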

An accelerated HPE-type algorithm for a class of composite convex-concave saddle-point problems

This article proposes a new algorithm for solving a class of composite convex-concave saddle-point problems. The new algorithm is a special instance of the hybrid proximal extragradient (HPE) framework in which a variant of Nesterov's accelerated method is used to approximately solve the prox subproblems. One of the advantages of the new method is that it works for …
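For orientation only, a minimal sketch (not the proposed HPE-type algorithm): a plain Korpelevich extragradient iteration for a smooth, strongly convex-concave saddle-point problem. The coupling matrix K, step size, and iteration count are illustrative assumptions.

```python
# A minimal sketch: extragradient iteration for the saddle-point problem
#   min_x max_y  L(x, y) = 0.5*||x||^2 + x^T K y - 0.5*||y||^2,
# whose unique saddle point is (0, 0). Each iteration takes a predictor
# gradient step and a corrector step evaluated at the predicted point.
# K, the step size, and the iteration count are illustrative assumptions.
import numpy as np

def extragradient(K, iters=300, step=None):
    n, m = K.shape
    if step is None:
        step = 0.5 / (1.0 + np.linalg.norm(K, 2))  # below 1/Lipschitz of the gradient map
    x, y = np.ones(n), np.ones(m)
    for _ in range(iters):
        # predictor step at (x, y)
        xp = x - step * (x + K @ y)                # descent in x
        yp = y + step * (K.T @ x - y)              # ascent in y
        # corrector step using gradients at the predicted point
        x = x - step * (xp + K @ yp)
        y = y + step * (K.T @ xp - yp)
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    K = rng.standard_normal((20, 15))
    x, y = extragradient(K)
    print("distance to saddle point (0, 0):", np.hypot(np.linalg.norm(x), np.linalg.norm(y)))
```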