Local Convergence Properties of Douglas–Rachford and ADMM

The Douglas–Rachford (DR) and alternating direction method of multipliers (ADMM) are two proximal splitting algorithms designed to minimize the sum of two proper lower semi-continuous convex functions whose proximity operators are easy to compute. The goal of this work is to understand the local linear convergence behaviour of DR/ADMM when the involved functions are moreover … Read more
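
For orientation (this recap is not part of the abstract): for $\min_x f(x)+g(x)$ with both functions proper, lower semi-continuous and convex, the standard DR iteration with stepsize $\gamma>0$ can be written as

$$
x^{k+1}=\operatorname{prox}_{\gamma g}(z^{k}),\qquad
y^{k+1}=\operatorname{prox}_{\gamma f}\bigl(2x^{k+1}-z^{k}\bigr),\qquad
z^{k+1}=z^{k}+y^{k+1}-x^{k+1},
$$

and the local linear convergence studied here concerns the behaviour of such iterates near a solution.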

Approximate Versions of the Alternating Direction Method of Multipliers

We present three new approximate versions of the alternating direction method of multipliers (ADMM), all of which require only knowledge of subgradients of the subproblem objectives, rather than bounds on the distance to the exact subproblem solution. One version, which applies only to certain common special cases, is based on combining the operator-splitting analysis of the … Read more
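
To make the distinction concrete, here is a generic illustration (not necessarily the criteria used in the paper): if the $x$-subproblem in scaled form is $\min_x f(x)+\tfrac{c}{2}\|Ax+Bz^k-b+u^k\|^2$, an absolute-error rule bounds the distance from the computed $x^{k+1}$ to the exact minimizer, whereas a subgradient-based rule only requires exhibiting some

$$
s^{k+1}\in\partial f(x^{k+1})+c\,A^{\top}\bigl(Ax^{k+1}+Bz^{k}-b+u^{k}\bigr)
\quad\text{with}\quad \|s^{k+1}\|\le\varepsilon_k,
$$

that is, a small subgradient of the subproblem objective at the approximate solution.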

Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, $\phi(x_1,\ldots,x_p,y)$, subject to linear equality constraints that couple $x_1,\ldots,x_p,y$, where $p\ge 1$ is an integer. Our ADMM sequentially updates the primal variables in the order $x_1,\ldots,x_p,y$, followed by updating the dual … Read more
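
Writing the coupling constraint, say, as $A_1x_1+\cdots+A_px_p+By=b$ (the constraint format and the symbols $A_i$, $B$, $b$, $\beta$, $w$ are chosen here for illustration), with augmented Lagrangian

$$
\mathcal{L}_\beta(x_1,\ldots,x_p,y,w)=\phi(x_1,\ldots,x_p,y)+\Bigl\langle w,\sum_{i=1}^{p}A_ix_i+By-b\Bigr\rangle+\frac{\beta}{2}\Bigl\|\sum_{i=1}^{p}A_ix_i+By-b\Bigr\|^{2},
$$

the iteration described above reads, for $i=1,\ldots,p$,

$$
x_i^{k+1}\in\arg\min_{x_i}\;\mathcal{L}_\beta\bigl(x_1^{k+1},\ldots,x_{i-1}^{k+1},x_i,x_{i+1}^{k},\ldots,x_p^{k},y^{k},w^{k}\bigr),
$$

followed by

$$
y^{k+1}\in\arg\min_{y}\;\mathcal{L}_\beta\bigl(x_1^{k+1},\ldots,x_p^{k+1},y,w^{k}\bigr),\qquad
w^{k+1}=w^{k}+\beta\Bigl(\sum_{i=1}^{p}A_ix_i^{k+1}+By^{k+1}-b\Bigr).
$$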

Inertial Proximal ADMM for Linearly Constrained Separable Convex Optimization

The \emph{alternating direction method of multipliers} (ADMM) is a popular and efficient first-order method that has recently found numerous applications, and the proximal ADMM is an important variant of it. The main contributions of this paper are the proposal and analysis of a class of inertial proximal ADMMs, which unify the basic ideas of … Read more
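
Roughly, the inertial mechanism can be sketched as follows (a generic description, not the specific scheme analyzed in the paper): given extrapolation parameters $\alpha_k\ge 0$, the current iterate is first pushed along the previous direction of travel,

$$
\bar w^{k}=w^{k}+\alpha_k\,(w^{k}-w^{k-1}),
$$

and the proximal ADMM step is then applied from $\bar w^{k}$ instead of $w^{k}$.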

Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Quadratic and Semi-Definite Programming

In this paper, we aim to provide a comprehensive analysis of the linear rate of convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a certain error bound condition, we establish the global linear rate of convergence for a more general semi-proximal ADMM with the dual steplength … Read more
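
For reference, a semi-proximal ADMM for $\min_{x,z} f(x)+g(z)$ subject to $Ax+Bz=c$ (a generic statement of the scheme; the paper's setting and notation are more specific) takes the form

$$
\begin{aligned}
x^{k+1}&=\arg\min_{x}\;\mathcal{L}_\sigma(x,z^{k};w^{k})+\tfrac12\|x-x^{k}\|_{\mathcal{S}}^{2},\\
z^{k+1}&=\arg\min_{z}\;\mathcal{L}_\sigma(x^{k+1},z;w^{k})+\tfrac12\|z-z^{k}\|_{\mathcal{T}}^{2},\\
w^{k+1}&=w^{k}+\tau\sigma\,(Ax^{k+1}+Bz^{k+1}-c),
\end{aligned}
$$

where $\mathcal{L}_\sigma$ is the augmented Lagrangian with penalty $\sigma>0$, $\mathcal{S}$ and $\mathcal{T}$ are self-adjoint positive semidefinite operators, and $\tau>0$ is the dual steplength, classically restricted to $\bigl(0,(1+\sqrt{5})/2\bigr)$.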

Understanding the Convergence of the Alternating Direction Method of Multipliers: Theoretical and Computational Perspectives

The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian algorithm that has experienced a renaissance in recent years due to its applicability to optimization problems arising from “big data” and image processing applications, and the relative ease with which it may be implemented in parallel and distributed computational environments. While it is … Read more
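
For context: with $L_\rho(x,z,\lambda)=f(x)+g(z)+\langle\lambda,Ax+Bz-b\rangle+\tfrac{\rho}{2}\|Ax+Bz-b\|^{2}$ denoting the augmented Lagrangian of $\min f(x)+g(z)$ subject to $Ax+Bz=b$ (the usual textbook presentation, not notation taken from the paper), the classical augmented Lagrangian method minimizes $L_\rho$ jointly over $(x,z)$ before each multiplier update, whereas ADMM replaces that joint minimization by a single alternating sweep,

$$
x^{k+1}=\arg\min_{x}L_\rho(x,z^{k},\lambda^{k}),\qquad
z^{k+1}=\arg\min_{z}L_\rho(x^{k+1},z,\lambda^{k}),\qquad
\lambda^{k+1}=\lambda^{k}+\rho\,(Ax^{k+1}+Bz^{k+1}-b).
$$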

Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

The alternating direction method of multipliers (ADMM) has been successfully applied to solve structured convex optimization problems due to its superior practical performance. The convergence properties of the 2-block ADMM have been studied extensively in the literature. Specifically, it has been proven that the 2-block ADMM globally converges for any penalty parameter $\gamma>0$. In this … Read more
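
Concretely, for $\min f_1(x_1)+f_2(x_2)+f_3(x_3)$ subject to $A_1x_1+A_2x_2+A_3x_3=b$ (notation chosen here for illustration), the unmodified 3-block ADMM with penalty $\gamma>0$ performs

$$
\begin{aligned}
x_1^{k+1}&=\arg\min_{x_1}L_\gamma(x_1,x_2^{k},x_3^{k},\lambda^{k}),\\
x_2^{k+1}&=\arg\min_{x_2}L_\gamma(x_1^{k+1},x_2,x_3^{k},\lambda^{k}),\\
x_3^{k+1}&=\arg\min_{x_3}L_\gamma(x_1^{k+1},x_2^{k+1},x_3,\lambda^{k}),\\
\lambda^{k+1}&=\lambda^{k}+\gamma\,(A_1x_1^{k+1}+A_2x_2^{k+1}+A_3x_3^{k+1}-b),
\end{aligned}
$$

where $L_\gamma$ is the corresponding augmented Lagrangian with penalty $\gamma$; it is this direct extension, without any correction step, whose global convergence is studied for the class of problems considered.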

First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

In this paper we consider a block-structured convex optimization model, where the block variables are nonseparable in the objective and are further linearly coupled in the constraints. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it … Read more
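
A representative 2-block instance of such a model (written generically; the paper's exact formulation may differ) is

$$
\min_{x_1,x_2}\;f(x_1,x_2)+h_1(x_1)+h_2(x_2)\quad\text{subject to}\quad A_1x_1+A_2x_2=b,
$$

where the coupling term $f$ is not separable across the blocks while the linear constraint links them.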

Activity Identification and Local Linear Convergence of Douglas-Rachford/ADMM under Partial Smoothness

Proximal splitting algorithms are becoming popular to solve convex optimization problems in variational image processing. Within this class, Douglas-Rachford (DR) and ADMM are designed to minimize the sum of two proper lower semicontinuous convex functions whose proximity operators are easy to compute. The goal of this work is to understand the local convergence behaviour of … Read more

Parallel Multi-Block ADMM with o(1/k) Convergence

This paper introduces a parallel and distributed extension of the alternating direction method of multipliers (ADMM). The algorithm decomposes the original problem into N smaller subproblems and solves them in parallel at each iteration. This Jacobi-type algorithm is well suited for distributed computing and is particularly attractive for solving certain large-scale problems. This paper introduces … Read more
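
The parallel structure can be sketched as follows (a schematic of the Jacobi-type update only; the paper's algorithm may include additional correction or proximal terms): with $L_\rho$ the augmented Lagrangian of $\min\sum_{i=1}^{N}f_i(x_i)$ subject to $\sum_{i=1}^{N}A_ix_i=b$, a Gauss–Seidel sweep updates block $i$ using the already-refreshed blocks $1,\ldots,i-1$, whereas the Jacobi-type scheme updates every block from the same iterate,

$$
x_i^{k+1}=\arg\min_{x_i}\;L_\rho\bigl(x_1^{k},\ldots,x_{i-1}^{k},x_i,x_{i+1}^{k},\ldots,x_N^{k},\lambda^{k}\bigr),\qquad i=1,\ldots,N,
$$

so the $N$ subproblems are mutually independent and can be solved simultaneously before the multiplier update.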