Non-stationary Douglas-Rachford and alternating direction method of multipliers: adaptive stepsizes and convergence

We revisit the classical Douglas-Rachford (DR) method for finding a zero of the sum of two maximal monotone operators. Since the practical performance of the DR method crucially depends on the stepsizes, we aim at developing an adaptive stepsize rule. To that end, we take a closer look at a linear case of the problem … Read more
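For orientation, the classical DR iteration for finding $x$ with $0 \in A(x)+B(x)$ can be sketched as follows (generic notation, not necessarily that of the paper), with resolvent $J_{\gamma T}=(I+\gamma T)^{-1}$ and stepsize $\gamma>0$, the quantity an adaptive rule would tune:
\[
x^{k}=J_{\gamma A}(z^{k}),\qquad y^{k}=J_{\gamma B}(2x^{k}-z^{k}),\qquad z^{k+1}=z^{k}+\lambda_{k}\,(y^{k}-x^{k}),\qquad \lambda_{k}\in(0,2).
\]
For maximal monotone $A$ and $B$, the sequence $z^{k}$ converges to a fixed point of the DR operator and the shadow sequence $x^{k}$ to a zero of $A+B$.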

The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates

We propose two numerical algorithms for minimizing the sum of a smooth function and the composition of a nonsmooth function with a linear operator in the fully nonconvex setting. The iterative schemes are formulated in the spirit of the proximal and, respectively, proximal linearized alternating direction method of multipliers. The proximal terms are introduced through … Read more
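As background, the structured problem described here has the form $\min_x\, f(x)+g(Ax)$ with $f$ smooth, $g$ nonsmooth and $A$ linear; ADMM-type schemes act on the splitting below (a generic sketch only, since the paper's proximal and proximal linearized variants modify the subproblems):
\[
\min_{x,z}\; f(x)+g(z)\ \ \text{s.t. } Ax-z=0,\qquad
L_\beta(x,z,y)=f(x)+g(z)+\langle y,\,Ax-z\rangle+\tfrac{\beta}{2}\|Ax-z\|^2,
\]
with alternating minimization of $L_\beta$ in $x$ and $z$ followed by the dual update $y^{k+1}=y^{k}+\beta\,(Ax^{k+1}-z^{k+1})$.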

A projection algorithm based on KKT conditions for convex quadratic semidefinite programming with nonnegative constraints

The dual of the convex quadratic semidefinite programming (CQSDP) problem with nonnegative constraints is a 4-block separable convex optimization problem. It is known that the directly extended 4-block alternating direction method of multipliers (ADMM4d) is very efficient for solving the dual, but its convergence is not guaranteed. In this paper, we reformulate the dual as a … Read more
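For context, one common way to write such a CQSDP with nonnegative constraints (the paper's formulation may differ in detail) is
\[
\min_{X\in\mathbb{S}^n}\;\tfrac12\langle X,\mathcal{Q}(X)\rangle+\langle C,X\rangle\quad\text{s.t. }\mathcal{A}(X)=b,\;\; X\succeq 0,\;\; X\ge 0,
\]
with $\mathcal{Q}$ a self-adjoint positive semidefinite operator on $\mathbb{S}^n$ and $\mathcal{A}$ linear; the dual then carries separate blocks for the equality multiplier, the positive semidefinite cone, the nonnegativity constraint, and the quadratic term, which gives the 4-block structure referred to above.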

Convergent Prediction-Correction-based ADMM for multi-block separable convex programming

The direct extension of the classic alternating direction method of multipliers (ADMMe) to the multi-block separable convex optimization problem is not necessarily convergent, though it often performs very well in practice. In order to preserve the numerical advantages of ADMMe and obtain convergence, many modified ADMM variants have been proposed that correct the output of ADMMe or … Read more
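For reference, the multi-block separable problem and the direct extension ADMMe can be sketched as follows (generic notation):
\[
\min\;\sum_{i=1}^{m}\theta_i(x_i)\quad\text{s.t. }\sum_{i=1}^{m}A_i x_i=b,
\]
with augmented Lagrangian $L_\beta(x_1,\dots,x_m,\lambda)=\sum_i\theta_i(x_i)-\langle\lambda,\sum_i A_i x_i-b\rangle+\tfrac{\beta}{2}\|\sum_i A_i x_i-b\|^2$. ADMMe minimizes $L_\beta$ over $x_1,\dots,x_m$ one block at a time in Gauss-Seidel fashion and then sets $\lambda^{k+1}=\lambda^{k}-\beta\,(\sum_i A_i x_i^{k+1}-b)$; for $m\ge 3$ this sweep is not necessarily convergent, which is the issue the prediction-correction schemes address.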

A One-Parameter Family of Middle Proximal ADMM for Constrained Separable Convex Optimization

This work is devoted to studying a family of Middle Proximal Alternating Direction Method of Multipliers (MP-ADM) for solving multi-block constrained separable convex optimization. This one-parameter family of MP-ADM combines the Jacobian and Gauss-Seidel types of alternating direction updates, and proximal point techniques are applied only to the middle subproblems to promote convergence. We … Read more
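To illustrate the two update orders being combined (a generic three-block sketch, not the specific MP-ADM scheme): a Gauss-Seidel update of the middle block uses the freshly computed first block, $x_2^{k+1}\in\arg\min_{x_2}L_\beta(x_1^{k+1},x_2,x_3^{k},\lambda^{k})$, whereas a Jacobian update keeps all other blocks at their previous values, $x_2^{k+1}\in\arg\min_{x_2}L_\beta(x_1^{k},x_2,x_3^{k},\lambda^{k})$, so the block subproblems can be solved in parallel.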

Generalized ADMM with Optimal Indefinite Proximal Term for Linearly Constrained Convex Optimization

We consider the generalized alternating direction method of multipliers (ADMM) for linearly constrained convex optimization. Experience with many practical applications has shown that one of the subproblems in the generalized ADMM is usually hard to solve, so a special proximal term is added. In the literature, the proximal term can be indefinite, which plays … Read more
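In generic terms (the paper's choice of proximal matrix may differ), the device is to replace a hard subproblem, say in the block $y$ of $\min f(x)+g(y)$ s.t. $Ax+By=b$ with augmented Lagrangian $L_\beta(x,y,\lambda)=f(x)+g(y)-\langle\lambda,Ax+By-b\rangle+\tfrac{\beta}{2}\|Ax+By-b\|^{2}$, by
\[
y^{k+1}\in\arg\min_{y}\Big\{L_\beta(x^{k+1},y,\lambda^{k})+\tfrac12\|y-y^{k}\|_{D}^{2}\Big\},\qquad \|y-y^{k}\|_{D}^{2}=(y-y^{k})^{\top}D\,(y-y^{k}),
\]
where $D$ is chosen so that the quadratic coupling through $B$ cancels and the subproblem becomes easy; the point of the indefinite-proximal line of work is that $D$ need not be positive semidefinite, which allows a smaller amount of proximal regularization.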

Optimal Linearized Alternating Direction Method of Multipliers for Convex Programming

The alternating direction method of multipliers (ADMM) is widely used in a variety of areas, and its variants tailored to different application scenarios have been studied in depth in the literature. Among them, the linearized ADMM has received particularly wide attention because of its efficiency and ease of implementation. To theoretically guarantee … Read more
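Concretely, and with the same generic notation as in the proximal-ADMM sketch above, linearized ADMM corresponds to the proximal choice $D=\tau I-\beta B^{\top}B$: the $y$-subproblem then collapses to a single proximal step,
\[
y^{k+1}=\arg\min_{y}\Big\{g(y)+\tfrac{\tau}{2}\big\|y-\big(y^{k}-\tfrac{\beta}{\tau}B^{\top}(Ax^{k+1}+By^{k}-b-\lambda^{k}/\beta)\big)\big\|^{2}\Big\},
\]
which is often available in closed form. Classical analyses require $\tau\ge\beta\lambda_{\max}(B^{\top}B)$, i.e. $D\succeq 0$; how much smaller $\tau$ can be taken while retaining convergence is what the "optimal" threshold in this line of work refers to.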

On Solving the Quadratic Shortest Path Problem

The quadratic shortest path problem is the problem of finding a path in a directed graph such that the sum of interaction costs over all pairs of arcs on the path is minimized. We derive several semidefinite programming relaxations for the quadratic shortest path problem with a matrix variable of order $m+1$, where $m$ is … Read more
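In standard formulations (our own generic notation, which may differ from the paper's), the quadratic shortest path problem reads
\[
\min_{x}\;\sum_{e,f} q_{ef}\,x_{e}x_{f}=x^{\top}Qx\quad\text{s.t. }x\text{ is the }0/1\text{ arc-incidence vector of an }s\text{-}t\text{ path},
\]
and SDP relaxations of the kind mentioned here are obtained by replacing the rank-one lifting $\binom{1}{x}\binom{1}{x}^{\top}$ with a positive semidefinite matrix variable constrained to be consistent with the linear path constraints.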

On Glowinski’s Open Question of Alternating Direction Method of Multipliers

The alternating direction method of multipliers (ADMM) was proposed by Glowinski and Marrocco in 1975, and it has been widely used in a broad spectrum of areas, especially in some sparsity-driven application domains. In 1982, Fortin and Glowinski suggested enlarging the range of the step size for updating the dual variable in ADMM from … Read more
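For context, the step size in question is the factor $\gamma$ in the dual update of ADMM for $\min f(x)+g(y)$ s.t. $Ax+By=b$ (generic notation):
\[
\lambda^{k+1}=\lambda^{k}-\gamma\beta\,(Ax^{k+1}+By^{k+1}-b).
\]
The original scheme uses $\gamma=1$, and the classical Fortin-Glowinski analysis guarantees convergence for $\gamma\in\big(0,\tfrac{1+\sqrt{5}}{2}\big)$; the open question addressed here concerns how far this range can be extended.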

Infeasibility detection in the alternating direction method of multipliers for convex optimization

The alternating direction method of multipliers is a powerful operator splitting technique for solving structured optimization problems. For convex optimization problems, it is well known that the algorithm generates iterates that converge to a solution, provided that one exists. If a solution does not exist, then the iterates diverge. Nevertheless, we show that they yield conclusive … Read more