## Local Convergence Properties of Douglas–Rachford and ADMM

The Douglas–Rachford (DR) algorithm and the alternating direction method of multipliers (ADMM) are two proximal splitting algorithms designed to minimize the sum of two proper lower semi-continuous convex functions whose proximity operators are easy to compute. The goal of this work is to understand the local linear convergence behaviour of DR/ADMM when the involved functions are moreover …
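As a minimal sketch of the DR iteration described above (this is an illustration, not the authors' code): for $\min_x f(x) + g(x)$, DR iterates $z \leftarrow z + \mathrm{prox}_{\gamma g}(2\,\mathrm{prox}_{\gamma f}(z) - z) - \mathrm{prox}_{\gamma f}(z)$. The concrete choice $f = \|\cdot\|_1$, $g = \tfrac{1}{2}\|\cdot - b\|^2$ and the step size $\gamma = 1$ are assumptions made here so that both proximity operators have closed forms.

```python
import numpy as np

def soft_threshold(z, t):
    # prox of t * ||.||_1 (componentwise shrinkage)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_quad(z, gamma, b):
    # prox of gamma * 0.5 * ||x - b||^2
    return (z + gamma * b) / (1.0 + gamma)

def douglas_rachford(b, gamma=1.0, iters=200):
    """DR iteration for min ||x||_1 + 0.5*||x - b||^2 (toy splitting)."""
    z = np.zeros_like(b)
    for _ in range(iters):
        x = soft_threshold(z, gamma)           # prox_{gamma f}, f = ||.||_1
        y = prox_quad(2 * x - z, gamma, b)     # prox_{gamma g} at the reflection
        z = z + y - x                          # DR fixed-point update
    return soft_threshold(z, gamma)            # shadow sequence x = prox_{gamma f}(z)
```

For this toy problem the minimizer is the soft-thresholding of `b` at level 1, which the shadow sequence reaches to high accuracy.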

## A Multi-step Inertial Forward–Backward Splitting Method for Non-convex Optimization

In this paper, we propose a multi-step inertial Forward–Backward splitting algorithm for minimizing the sum of two not necessarily convex functions, one of which is proper lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient. We first prove global convergence of the scheme with the help of the Kurdyka–Łojasiewicz property. Then, when the …
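A multi-step inertial Forward–Backward step combines the usual proximal-gradient update with an extrapolation built from several past iterates. The sketch below is illustrative only: the two inertial coefficients `a`, the test objective $\tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, and the step size $\gamma = 1/L$ are assumptions, not values taken from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    # prox of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def multi_step_inertial_fb(grad_f, L, lam, x0, a=(0.3, 0.1), iters=500):
    """Two-step inertial Forward-Backward for min f(x) + lam*||x||_1.

    a -- inertial coefficients on (x_k - x_{k-1}) and (x_{k-1} - x_{k-2});
         illustrative values, chosen small enough for stability.
    """
    gamma = 1.0 / L
    hist = [x0.copy(), x0.copy(), x0.copy()]   # x_k, x_{k-1}, x_{k-2}
    for _ in range(iters):
        xk, xk1, xk2 = hist
        y = xk + a[0] * (xk - xk1) + a[1] * (xk1 - xk2)      # inertial extrapolation
        x_new = soft_threshold(y - gamma * grad_f(y), gamma * lam)  # FB step at y
        hist = [x_new, xk, xk1]
    return hist[0]
```

At a fixed point the iterate satisfies the Forward–Backward optimality condition $x = \mathrm{prox}_{\gamma \lambda \|\cdot\|_1}(x - \gamma \nabla f(x))$, which gives a simple convergence check.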

## Activity Identification and Local Linear Convergence of Forward–Backward-type Methods

In this paper, we consider a class of Forward–Backward (FB) splitting methods that includes several variants (e.g. inertial schemes, FISTA) for minimizing the sum of two proper convex and lower semi-continuous functions, one of which has a Lipschitz continuous gradient, and the other is partly smooth relative to a smooth active manifold $\mathcal{M}$. We propose …
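For the $\ell_1$ norm, the "active manifold" is the set of vectors with a fixed support, so activity identification means the support of the iterates stops changing after finitely many iterations. The sketch below runs FISTA (one of the FB variants named above) on an assumed lasso instance and records when the support stabilizes; the data, regularization level, and iteration budget are all illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    # prox of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista(A, b, lam, iters=500):
    """FISTA for the lasso min 0.5*||Ax - b||^2 + lam*||x||_1.

    Returns the final iterate and the first iteration index after which the
    support (the identified active manifold for the l1 norm) never changes.
    """
    L = np.linalg.norm(A.T @ A, 2)           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    supports = []
    for _ in range(iters):
        x_new = soft_threshold(y - (A.T @ (A @ y - b)) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
        supports.append(tuple(np.flatnonzero(x)))
    # first index from which the support is constant
    ident = next(k for k in range(iters) if len(set(supports[k:])) == 1)
    return x, ident
```

On generic (nondegenerate) data the returned `ident` is well before the iteration budget, illustrating finite identification, after which the method is effectively running on the low-dimensional manifold.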

## Activity Identification and Local Linear Convergence of Douglas–Rachford/ADMM under Partial Smoothness

Proximal splitting algorithms are becoming increasingly popular for solving convex optimization problems in variational image processing. Within this class, Douglas–Rachford (DR) and ADMM are designed to minimize the sum of two proper lower semi-continuous convex functions whose proximity operators are easy to compute. The goal of this work is to understand the local convergence behaviour of …

## Local Linear Convergence of Forward–Backward under Partial Smoothness

In this paper, we consider the Forward–Backward proximal splitting algorithm to minimize the sum of two proper closed convex functions, one of which has a Lipschitz continuous gradient while the other is partly smooth relative to an active manifold $\mathcal{M}$. We propose a unified framework in which we show that the Forward–Backward (i) correctly identifies the …