Local Linear Convergence of Forward–Backward under Partial Smoothness

In this paper, we consider the Forward–Backward proximal splitting algorithm to minimize the sum of two proper closed convex functions, one of which has a Lipschitz-continuous gradient and the other is partly smooth relative to an active manifold $\mathcal{M}$. We propose a unified framework in which we show that the Forward–Backward algorithm (i) correctly identifies the …
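For readers who want to see the iteration concretely, here is a minimal sketch of forward-backward splitting on a lasso-type instance; the quadratic data-fit, the $\ell_1$ regularizer, and all names and step-size choices below are illustrative assumptions, not details taken from the paper.

import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by forward-backward splitting:
    # a gradient (forward) step on the smooth term followed by a proximal
    # (backward) step on the nonsmooth term.
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    gamma = 1.0 / L                    # any step size in (0, 2/L) works in theory
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                           # forward step
        x = soft_threshold(x - gamma * grad, gamma * lam)  # backward step
    return x

For the $\ell_1$ norm the active manifold is the set of vectors with a fixed support, so finite-time identification of $\mathcal{M}$ corresponds to the support of the iterates freezing after finitely many iterations.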

Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions

Splitting schemes are a class of powerful algorithms for solving complicated monotone inclusion and convex optimization problems that are built from many simpler pieces. They give rise to algorithms in which the simple pieces of the decomposition are processed individually. This leads to easily implementable and highly parallelizable algorithms, which often attain nearly state-of-the-art performance. …
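As a rough illustration of the relaxed Peaceman-Rachford iteration the title refers to, here is a sketch on a toy problem with two proximable terms; the specific functions, the relaxation convention, and all parameter names are assumptions made for the example, not details from the paper.

import numpy as np

def prox_f(z, gamma, a):
    # prox of gamma * 0.5*||x - a||^2 (closed form)
    return (z + gamma * a) / (1.0 + gamma)

def prox_g(z, gamma, lam):
    # prox of gamma * lam * ||x||_1 (soft-thresholding)
    return np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)

def relaxed_prs(a, lam, gamma=1.0, relax=1.0, n_iter=200):
    # Relaxed Peaceman-Rachford splitting for min_x f(x) + g(x):
    #   x_k = prox_f(z_k),  z_{k+1} = z_k + 2*relax*(prox_g(2*x_k - z_k) - x_k).
    # relax = 1/2 recovers Douglas-Rachford (equivalently, ADMM applied to the
    # dual problem), relax = 1 the full Peaceman-Rachford iteration.
    z = np.zeros_like(a, dtype=float)
    for _ in range(n_iter):
        x = prox_f(z, gamma, a)
        y = prox_g(2 * x - z, gamma, lam)
        z = z + 2 * relax * (y - x)
    return prox_f(z, gamma, a)

Because the quadratic term in this toy instance is strongly convex, even the full Peaceman-Rachford step (relax = 1) behaves well here; strong convexity is the kind of regularity assumption under which sharper rates are typically established.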

Discrete Approximations of a Controlled Sweeping Process

The paper is devoted to the study of a new class of optimal control problems governed by the classical Moreau sweeping process, with the new feature that the polyhedral moving set is not fixed but is controlled by time-dependent functions. The dynamics of such problems are described by dissipative non-Lipschitzian differential inclusions with state constraints of …
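For orientation, a minimal sketch of a controlled polyhedral sweeping process of this kind, written in illustrative notation (the controls $(u_i, b_i)$ and the exact constraint format are assumptions, not taken from the paper):

$$ \dot{x}(t) \in -N_{C(t)}\bigl(x(t)\bigr) \quad \text{for a.e. } t\in[0,T], \qquad x(0)=x_0\in C(0), $$
$$ C(t)=\bigl\{\, x\in\mathbb{R}^n : \langle u_i(t),x\rangle \le b_i(t),\ i=1,\dots,m \,\bigr\}, $$

where $N_{C}(x)$ denotes the normal cone to $C$ at $x$, the time-dependent functions $u_i(\cdot)$ and $b_i(\cdot)$ act as controls shaping the moving polyhedron, and the state constraint $x(t)\in C(t)$ is built into the inclusion.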

An Accelerated Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization

We consider the problem of minimizing the sum of two convex functions: one is smooth and given by a gradient oracle, and the other is separable over blocks of coordinates and has a simple known structure over each block. We develop an accelerated randomized proximal coordinate gradient (APCG) method for minimizing such convex composite functions. …
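The sketch below shows the basic (non-accelerated) randomized proximal coordinate gradient step on a lasso-type instance; APCG adds a Nesterov-type acceleration on top of steps of this form. The problem choice, names, and parameters are illustrative assumptions.

import numpy as np

def rand_prox_coord_grad(A, b, lam, n_iter=5000, seed=0):
    # Plain randomized proximal coordinate gradient method for
    # 0.5*||Ax - b||^2 + lam*||x||_1 (separable nonsmooth term).
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    Li = np.sum(A * A, axis=0)             # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    r = A @ x - b                          # maintained residual A x - b
    for _ in range(n_iter):
        i = rng.integers(n)                # pick a coordinate uniformly at random
        g = A[:, i] @ r                    # partial gradient with respect to x_i
        z = x[i] - g / Li[i]               # coordinate gradient step
        new_xi = np.sign(z) * max(abs(z) - lam / Li[i], 0.0)  # prox of the block term
        r += A[:, i] * (new_xi - x[i])     # update the residual incrementally
        x[i] = new_xi
    return x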

Global convergence of splitting methods for nonconvex composite optimization

We consider the problem of minimizing the sum of a smooth function $h$ with a bounded Hessian, and a nonsmooth function. We assume that the latter function is a composition of a proper closed function $P$ and a surjective linear map $\mathcal{M}$, with the proximal mappings of $\tau P$, $\tau > 0$, simple to compute. …
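To picture how a splitting method exploits this structure, here is a sketch of an ADMM-style scheme on a convex toy instance of $h(x) + P(\mathcal{M}x)$; the quadratic choice of $h$, the $\ell_1$ choice of $P$, and all names and parameters are illustrative assumptions, and the paper's analysis concerns possibly nonconvex $P$ and its own algorithmic variants.

import numpy as np

def admm_composite(a, M, lam, beta=1.0, n_iter=300):
    # ADMM-type splitting for min_x 0.5*||x - a||^2 + lam*||M x||_1,
    # an instance of h(x) + P(Mx) with h smooth (bounded Hessian) and the
    # prox of tau*P cheap.  Introduce y = M x and alternate:
    #   x-step: minimize h(x) + (beta/2)*||Mx - y + w||^2  (a linear solve here),
    #   y-step: prox of (lam/beta)*||.||_1 at Mx + w,
    #   dual update on the scaled multiplier w.
    m, n = M.shape
    x, y, w = np.zeros(n), np.zeros(m), np.zeros(m)
    K = np.eye(n) + beta * M.T @ M
    for _ in range(n_iter):
        x = np.linalg.solve(K, a + beta * M.T @ (y - w))
        z = M @ x + w
        y = np.sign(z) * np.maximum(np.abs(z) - lam / beta, 0.0)
        w = w + M @ x - y
    return x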

Projection Methods: An Annotated Bibliography of Books and Reviews

Projections onto sets are used in a wide variety of methods in optimization theory, but not every method that uses projections really belongs to the class of projection methods as we mean it here. Here, projection methods are iterative algorithms that use projections onto sets while relying on the general principle that when a family …
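The simplest member of this class is the cyclic (alternating) projection scheme; the following sketch applies it to a family of half-spaces. The half-space representation and all names are illustrative assumptions, not drawn from the bibliography itself.

import numpy as np

def project_halfspace(x, a, b):
    # Projection onto the half-space {x : <a, x> <= b}.
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - (viol / (a @ a)) * a

def cyclic_projections(x0, halfspaces, n_iter=100):
    # Repeatedly project the current point onto each set in turn; when the
    # intersection is nonempty, the iterates approach a common point.
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        for (a, b) in halfspaces:
            x = project_halfspace(x, np.asarray(a, dtype=float), float(b))
    return x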

Steplength Thresholds for Invariance Preserving of Discretization Methods of Dynamical Systems on a Polyhedron

Steplength thresholds for invariance preservation of three types of discretization methods on a polyhedron are considered. For Taylor-approximation-type discretization methods, we prove that a valid steplength threshold can be obtained by finding the first positive zeros of a finite number of polynomial functions. Further, a simple and efficient algorithm is proposed to numerically …
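A minimal numerical sketch of that last computation, assuming the polynomial coefficient arrays have already been assembled from the dynamical system and the polyhedron (that assembly is the substantive step and is not shown here; all names are illustrative):

import numpy as np

def first_positive_root(coeffs):
    # Smallest real positive root of a polynomial given by its coefficients
    # (highest degree first); returns inf if there is none.
    roots = np.roots(coeffs)
    pos = [r.real for r in roots if abs(r.imag) < 1e-10 and r.real > 1e-12]
    return min(pos) if pos else np.inf

def steplength_threshold(polys):
    # Given the finite family of polynomials whose first positive zeros
    # certify invariance, a valid steplength threshold is the smallest
    # of those zeros.
    return min(first_positive_root(p) for p in polys)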

On Lipschitz optimization based on gray-box piecewise linearization

We address the problem of minimizing objectives from the class of piecewise differentiable functions whose nonsmoothness can be encapsulated in the absolute value function. They possess local piecewise linear approximations with a discrepancy that can be bounded by a quadratic proximal term. This overestimating local model is continuous but generally nonconvex. It can be generated …
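As a rough sketch of the local model described above, written with an illustrative constant $q$ and notation that is assumed rather than taken from the paper:

$$ \bigl| f(x+\Delta x) - f(x) - \Delta f(x;\Delta x) \bigr| \;\le\; \frac{q}{2}\,\|\Delta x\|^2, $$

so that $\Delta f(x;\Delta x) + \tfrac{q}{2}\|\Delta x\|^2$ is a continuous, generally nonconvex overestimate of $f(x+\Delta x)-f(x)$, where $\Delta f(x;\cdot)$ denotes the piecewise linear approximation obtained by propagating the absolute value operations of the gray-box representation and $q$ bounds the second-order discrepancy.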

A Class of Randomized Primal-Dual Algorithms for Distributed Optimization

Based on a preconditioned version of the randomized block-coordinate forward-backward algorithm recently proposed in [Combettes, Pesquet, 2014], several variants of block-coordinate primal-dual algorithms are designed in order to solve a wide array of monotone inclusion problems. These methods rely on a sweep of blocks of variables which are activated at each iteration according to a random rule, …
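For context, here is a deterministic, non-preconditioned primal-dual forward-backward iteration of the kind such methods randomize over blocks; the problem instance, step-size choices, and all names are illustrative assumptions, and the paper's algorithms update only a randomly activated block of these primal and dual variables at each iteration.

import numpy as np

def primal_dual_fb(A, a, lam, n_iter=500):
    # Primal-dual forward-backward iteration for
    #   min_x 0.5*||x - a||^2 + lam*||A x||_1,
    # with a full (non-randomized) update of the primal variable x and the
    # dual variable y at every iteration.
    m, n = A.shape
    normA = np.linalg.norm(A, 2)
    sigma = 1.0 / normA                # dual step size
    tau = 1.0 / (normA + 1.0)          # primal step size; satisfies the usual
                                       # condition 1/tau - sigma*||A||^2 >= L/2 with L = 1
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(n_iter):
        grad = x - a                                   # gradient of the smooth term
        x_new = x - tau * (grad + A.T @ y)             # primal forward-backward step
        y = np.clip(y + sigma * (A @ (2 * x_new - x)), -lam, lam)  # dual prox = projection
        x = x_new
    return x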

A Primal-Dual Algorithmic Framework for Constrained Convex Minimization

We present a primal-dual algorithmic framework to obtain approximate solutions to a prototypical constrained convex optimization problem, and rigorously characterize how common structural assumptions affect the numerical efficiency. Our main analysis technique provides a fresh perspective on Nesterov’s excessive gap technique in a structured fashion and unifies it with smoothing and primal-dual methods. For instance, …
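To make the prototype concrete, one standard way such a constrained problem and a smoothed dual are written is sketched below; the linear constraint $Ax=b$, the smoothing center $\bar{x}$, and the notation are illustrative assumptions, not necessarily the paper's exact template:

$$ \min_{x\in\mathcal{X}} f(x) \ \ \text{s.t.}\ Ax=b, \qquad d_\gamma(y) \;=\; \min_{x\in\mathcal{X}} \Bigl\{ f(x) + \langle y,\, Ax-b\rangle + \frac{\gamma}{2}\|x-\bar{x}\|^2 \Bigr\}, $$

where the quadratic term with center $\bar{x}$ smooths the dual function $d_\gamma$, making it differentiable with a Lipschitz-continuous gradient so that fast gradient schemes can be run in the dual; driving $\gamma \to 0$ while controlling the primal-dual gap is, roughly, the role played by smoothing and excessive gap arguments.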