RandProx: Primal-Dual Optimization Algorithms with Randomized Proximal Updates

Proximal splitting algorithms are well suited to solving large-scale nonsmooth optimization problems, in particular those arising in machine learning. We propose a new primal-dual algorithm, in which the dual update is randomized; equivalently, the proximity operator of one of the functions in the problem is replaced by a stochastic oracle. For instance, some randomly chosen …
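
As a rough illustration of the idea (and not the RandProx algorithm itself), the sketch below performs a dual proximal step only with some probability p, so the proximity operator is queried like a stochastic oracle; all names and parameters are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def randomized_dual_step(y, prox_h_conj, sigma, direction, p=0.5):
        # Toy sketch: call the (possibly expensive) proximity operator of H*
        # only with probability p; otherwise keep the dual variable unchanged.
        # This only illustrates replacing the prox by a stochastic oracle;
        # the actual RandProx update rule and its variance reduction are in the paper.
        if rng.random() < p:
            return prox_h_conj(y + sigma * direction, sigma)
        return y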

Proximal splitting algorithms: Relax them all!

Convex optimization problems, whose solutions live in very high dimensional spaces, have become ubiquitous. To solve them, proximal splitting algorithms are particularly well suited: they consist of simple operations, handling the terms in the objective function separately. We present several existing proximal splitting algorithms and derive new ones, within a unified framework, which consists …
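
As a concrete example of such a simple operation (a standard one, not specific to the paper), the proximity operator of the l1 norm reduces to componentwise soft-thresholding:

    import numpy as np

    def prox_l1(v, gamma):
        # Proximity operator of gamma * ||.||_1: componentwise soft-thresholding,
        # the kind of cheap elementary step a proximal splitting algorithm repeats.
        return np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)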

Fast Projection onto the Simplex and the l1 Ball

A new algorithm is proposed to project, exactly and in finite time, a vector of arbitrary size onto a simplex or an l1-norm ball. The algorithm is demonstrated to be faster than existing methods. In addition, a wrong statement in a paper by Duchi et al. is corrected and an adversarial sequence for Michelot’s algorithm …
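
For context, the sketch below is the classical sort-based projection onto the simplex, together with the usual reduction of the l1-ball projection to it; this is the standard baseline, not the faster algorithm proposed in the paper.

    import numpy as np

    def project_simplex(v, z=1.0):
        # Classical O(n log n) sort-based projection onto {x : x >= 0, sum(x) = z}.
        u = np.sort(v)[::-1]
        css = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - z)[0][-1]
        theta = (css[rho] - z) / (rho + 1.0)
        return np.maximum(v - theta, 0.0)

    def project_l1_ball(v, z=1.0):
        # Projection onto the l1 ball of radius z, reduced to a simplex projection.
        if np.sum(np.abs(v)) <= z:
            return v
        return np.sign(v) * project_simplex(np.abs(v), z)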

A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms

We propose a new first-order splitting algorithm for jointly solving the primal and dual formulations of large-scale convex minimization problems involving the sum of a smooth function with Lipschitzian gradient, a nonsmooth proximable function, and linear composite functions. This is a full splitting approach, in the sense that the gradient and the linear operators involved …
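
As a rough sketch of what such a full splitting iteration can look like for min_x F(x) + G(x) + H(Lx) (omitting the relaxation step and the exact step-size conditions, which are specified in the paper), each iteration combines a gradient step on F, the prox of G on the primal side, and the prox of H* on the dual side, with L used only through matrix-vector products; all names below are placeholders.

    def primal_dual_sketch(grad_f, prox_g, prox_h_conj, L, L_adj, x, y, tau, sigma, n_iter=100):
        # Minimal sketch for min_x F(x) + G(x) + H(Lx):
        # forward gradient step on the smooth F, prox of G on the primal variable,
        # prox of H* on the dual variable, L and its adjoint applied as black boxes.
        for _ in range(n_iter):
            x_new = prox_g(x - tau * (grad_f(x) + L_adj(y)), tau)
            y = prox_h_conj(y + sigma * L(2.0 * x_new - x), sigma)
            x = x_new
        return x, y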