A Dual Riemannian ADMM Algorithm for Low-Rank SDPs with Unit Diagonal

This paper proposes a dual Riemannian alternating direction method of multipliers (ADMM) for solving low-rank semidefinite programs with unit diagonal constraints. We recast the ADMM subproblem as a Riemannian optimization problem over the oblique manifold by performing the Burer-Monteiro factorization. Global convergence of the algorithm is established assuming that the subproblem is solved to certain … Read more
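
A minimal sketch of the manifold mechanics behind the factorization (assuming NumPy; the function name, step size, and toy cost matrix are illustrative, not the paper's algorithm): after the Burer-Monteiro substitution $X = VV^T$, the unit-diagonal constraint becomes "every row of $V$ has unit norm", i.e. $V$ lives on the oblique manifold, and a Riemannian gradient step only needs a row-wise tangent projection followed by a renormalization retraction.

```python
import numpy as np

def oblique_gradient_step(V, C, step=1e-2):
    """One Riemannian gradient step for f(V) = <C, V V^T> on the oblique
    manifold (each row of V has unit norm), so diag(V V^T) = 1 holds by
    construction.  Illustrative sketch, not the paper's dual ADMM solver."""
    G = 2.0 * C @ V                                        # Euclidean gradient
    rgrad = G - np.sum(G * V, axis=1, keepdims=True) * V   # project onto the tangent space
    W = V - step * rgrad                                   # step in the ambient space
    return W / np.linalg.norm(W, axis=1, keepdims=True)    # retraction: renormalize rows

# Toy usage: random symmetric cost matrix, rank-3 factor.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20)); C = 0.5 * (A + A.T)
V = rng.standard_normal((20, 3)); V /= np.linalg.norm(V, axis=1, keepdims=True)
for _ in range(200):
    V = oblique_gradient_step(V, C)
print(np.allclose(np.diag(V @ V.T), 1.0))                  # unit diagonal preserved
```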

Using Filter Methods to Guide Convergence for ADMM, with Applications to Nonnegative Matrix Factorization Problems

Nonconvex, nonlinear cost functions arise naturally in physical inverse problems and machine learning. The alternating direction method of multipliers (ADMM) has seen extensive use in these applications, despite exhibiting uncertain convergence behavior in many practical nonconvex settings, and struggling with general nonlinear constraints. In contrast, filter methods have proved effective in enforcing convergence for sequential … Read more
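
For readers unfamiliar with the filter mechanism, the following is a minimal sketch of a standard Fletcher-Leyffer-style acceptance test (the margin parameter `gamma` and the helper names are illustrative; the paper's actual filter rules may differ): a trial point is accepted only if it sufficiently improves either the constraint violation or the objective against every pair already stored in the filter.

```python
def filter_acceptable(theta_new, f_new, filter_pairs, gamma=1e-3):
    """Accept a trial point only if, against every stored pair (theta, f)
    of constraint violation and objective value, it improves at least one
    of the two quantities by a small margin."""
    return all(theta_new <= (1.0 - gamma) * theta or f_new <= f - gamma * theta
               for theta, f in filter_pairs)

def filter_add(theta, f, filter_pairs):
    """Insert (theta, f) into the filter and discard entries it dominates."""
    kept = [(t, v) for t, v in filter_pairs if not (theta <= t and f <= v)]
    kept.append((theta, f))
    return kept
```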

Faster Lagrangian-based methods: a unified prediction-correction framework

Motivated by the prediction-correction framework constructed by He and Yuan [SIAM J. Numer. Anal. 50: 700-709, 2012], we propose a unified prediction-correction framework to accelerate Lagrangian-based methods. More precisely, for strongly convex optimization, the general linearized Lagrangian method with an indefinite proximal term, the alternating direction method of multipliers (ADMM) with the step size of the Lagrange multiplier not … Read more
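
As a rough illustration of the prediction-correction template being invoked (a sketch only: the map `predict`, the correction matrix `M`, and the step `alpha` are placeholders rather than the paper's specific choices), the predictor performs one pass of a Lagrangian-based method and the corrector pulls the iterate back along the prediction direction.

```python
import numpy as np

def prediction_correction(w0, predict, M, alpha=1.0, iters=100):
    """Generic prediction-correction loop: `predict` maps w^k to a predictor
    w_tilde^k (e.g. one pass of a Lagrangian-based method), and the corrector
    sets w^{k+1} = w^k - alpha * M (w^k - w_tilde^k)."""
    w = np.array(w0, dtype=float)
    for _ in range(iters):
        w_tilde = predict(w)
        w = w - alpha * (M @ (w - w_tilde))
    return w
```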

Global Complexity Bound of a Proximal ADMM for Linearly-Constrained Nonseparable Nonconvex Composite Programming

This paper proposes and analyzes a dampened proximal alternating direction method of multipliers (DP.ADMM) for solving linearly-constrained nonconvex optimization problems where the smooth part of the objective function is nonseparable. Each iteration of DP.ADMM consists of: (i) a sequence of partial proximal augmented Lagrangian (AL) updates, (ii) an under-relaxed Lagrange multiplier update, and (iii) a … Read more
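
To make the iteration structure concrete, here is a schematic loop covering ingredients (i) and (ii) above (a sketch under assumptions: `argmin_block` stands for the user's partial proximal AL subproblem solver, `theta < 1` is the under-relaxation factor, and the truncated third ingredient is omitted).

```python
def dp_admm_style_loop(x_blocks, argmin_block, A_blocks, b, lam,
                       beta=1.0, theta=0.5, iters=50):
    """Schematic loop: (i) sweep over the blocks, each one solving a partial
    proximal augmented Lagrangian subproblem with the other blocks fixed;
    (ii) take an under-relaxed multiplier step."""
    for _ in range(iters):
        for i in range(len(x_blocks)):                      # (i) partial proximal AL updates
            x_blocks[i] = argmin_block(i, x_blocks, lam, beta)
        residual = sum(A @ x for A, x in zip(A_blocks, x_blocks)) - b
        lam = lam + theta * beta * residual                 # (ii) under-relaxed multiplier update
    return x_blocks, lam
```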

Inertial-relaxed splitting for composite monotone inclusions

In a spirit similar to the extension of the proximal point method developed by Alves et al. \cite{alvegm20}, we propose in this work an inertial-relaxed primal-dual splitting method to address the problem of decomposing the minimization of the sum of three convex functions, one of them being smooth, over a general coupling subspace. A … Read more
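
A bare-bones illustration of the two devices in the title, inertia and relaxation, applied to a generic fixed-point map `T` standing in for one pass of the splitting operator (the parameter values and the map itself are placeholders, not the authors' scheme).

```python
def inertial_relaxed_step(x, x_prev, T, alpha=0.3, rho=1.2):
    """One generic inertial-relaxed iteration: extrapolate with inertia
    alpha, apply the operator T once, then relax with factor rho."""
    y = x + alpha * (x - x_prev)        # inertial extrapolation
    z = T(y)                            # one pass of the splitting/fixed-point map
    return y + rho * (z - y), x         # relaxed update and the new "previous" point
```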

Dual descent ALM and ADMM

Classical primal-dual algorithms attempt to solve $\max_{\mu}\min_{x} \mathcal{L}(x,\mu)$ by alternately minimizing over the primal variable $x$ through primal descent and maximizing over the dual variable $\mu$ through dual ascent. However, when $\mathcal{L}(x,\mu)$ is highly nonconvex with complex constraints in $x$, the minimization over $x$ may not achieve global optimality, and hence the dual ascent step loses … Read more
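
As a concrete baseline for the alternation described above (a sketch: plain gradient descent as the inner primal minimizer, an augmented Lagrangian with penalty `rho`, and equality constraints $Ax = b$ are assumptions made for illustration), the classical loop looks as follows; when $\mathcal{L}$ is nonconvex in $x$, the inner loop may stall at a non-global minimizer, which is exactly where the plain dual ascent step becomes unreliable.

```python
import numpy as np

def primal_descent_dual_ascent(f_grad, A, b, x0, mu0,
                               rho=1.0, lr=1e-2, inner=50, outer=100):
    """Classical primal-dual loop for max_mu min_x L(x, mu) with
    L(x, mu) = f(x) + mu^T (Ax - b) + (rho/2) ||Ax - b||^2."""
    x, mu = np.array(x0, dtype=float), np.array(mu0, dtype=float)
    for _ in range(outer):
        for _ in range(inner):                              # primal descent on L(., mu)
            x -= lr * (f_grad(x) + A.T @ (mu + rho * (A @ x - b)))
        mu += rho * (A @ x - b)                             # dual ascent step
    return x, mu
```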

A Framework of Inertial Alternating Direction Method of Multipliers for Non-Convex Non-Smooth Optimization

In this paper, we propose an algorithmic framework dubbed inertial alternating direction method of multipliers (iADMM) for solving a class of nonconvex nonsmooth multiblock composite optimization problems with linear constraints. Our framework employs the general majorization-minimization (MM) principle to update each block of variables so as to not only unify the convergence analysis of previous … Read more
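
A stripped-down view of a single block update in the MM spirit described above (a sketch; `surrogate_argmin` is an assumed oracle that minimizes whatever majorizing surrogate of the block subproblem is chosen at the extrapolated point, and `alpha` is an illustrative inertial parameter).

```python
def inertial_mm_block_update(x, x_prev, surrogate_argmin, alpha=0.3):
    """Extrapolate the block with an inertial term, then minimize a
    majorizing surrogate of the block subproblem built at that point."""
    y = x + alpha * (x - x_prev)        # inertial extrapolation
    return surrogate_argmin(y), x       # MM step and the new "previous" iterate
```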

Decomposition Methods for Global Solutions of Mixed-Integer Linear Programs

This paper introduces two decomposition-based methods for two-block mixed-integer linear programs (MILPs); both aim to exploit the separable structure of the original problem by solving a sequence of lower-dimensional MILPs. The first method is based on the $\ell_1$-augmented Lagrangian method (ALM), and the second one is based on a modified alternating direction method of … Read more
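
The common skeleton of both methods can be sketched as follows (all three callables are assumed oracles, e.g. wrappers around an off-the-shelf MILP solver, and the $\ell_1$ coupling term and multiplier rule are left abstract because they differ between the two schemes).

```python
def two_block_decomposition(solve_block1, solve_block2, update_multiplier,
                            x1, x2, lam, rho=1.0, iters=20):
    """Alternate between the two lower-dimensional MILP subproblems, each
    minimizing its own objective plus the coupling/penalty term with the
    other block fixed, then apply a scheme-specific multiplier update."""
    for _ in range(iters):
        x1 = solve_block1(x2, lam, rho)          # MILP in the first block
        x2 = solve_block2(x1, lam, rho)          # MILP in the second block
        lam = update_multiplier(x1, x2, lam, rho)
    return x1, x2, lam
```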

ADMM and inexact ALM: the QP case

Embedding randomization procedures in the Alternating Direction Method of Multipliers (ADMM) has recently attracted an increasing amount of interest as a remedy to the fact that the direct n-block generalization of ADMM is not necessarily convergent ($n \geq 3$). Even if, in practice, the introduction of such techniques could mitigate the diverging behaviour of the … Read more
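
One of the simplest randomization devices of this kind is to permute the block update order at every sweep; the sketch below illustrates it for a generic $n$-block splitting (the subproblem oracle `argmin_block` is assumed, and this is not necessarily the specific procedure studied in the paper).

```python
import numpy as np

def randomized_multiblock_admm(argmin_block, A_blocks, b, x_blocks, lam,
                               beta=1.0, iters=100, seed=0):
    """n-block ADMM-type sweep with a randomly permuted block order at each
    iteration, followed by the usual multiplier update."""
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        for i in rng.permutation(len(x_blocks)):            # random update order
            x_blocks[i] = argmin_block(i, x_blocks, lam, beta)
        residual = sum(A @ x for A, x in zip(A_blocks, x_blocks)) - b
        lam = lam + beta * residual                         # multiplier update
    return x_blocks, lam
```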