Convergence rate and iteration complexity on the alternating direction method of multipliers with a substitution procedure for separable convex programming

Recently, in [17] we showed for the first time the possibility of combining the Douglas-Rachford alternating direction method of multipliers (ADMM) with a Gaussian back substitution procedure for solving a convex minimization model with a general separable structure. This paper is a further study of the theoretical aspects of this theme. We first derive a general algorithmic framework that combines ADMM with either a forward or a backward substitution procedure. Then, we show that the convergence of this framework can be easily proved from the contraction perspective, and that its local linear convergence rate is provable if a certain standard error bound condition is assumed. Without such an error bound assumption, we can still estimate the worst-case iteration complexity of this framework in both the ergodic and nonergodic senses.
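For orientation, a minimal sketch of the separable convex minimization model this line of work targets, written in assumed notation (the paper's own formulation, including the number of blocks and the constraint sets, may differ in detail): the objective splits into $m$ block functions $\theta_i$ coupled only through a linear constraint,

\[
\min \Big\{ \sum_{i=1}^{m} \theta_i(x_i) \;\Big|\; \sum_{i=1}^{m} A_i x_i = b, \ x_i \in \mathcal{X}_i, \ i = 1, \dots, m \Big\},
\]

where each $\theta_i$ is a closed proper convex function, each $A_i$ is a given matrix, and each $\mathcal{X}_i$ is a closed convex set. The framework studied here first produces a predictor by an ADMM-type sweep over the blocks and then corrects it by a forward or backward (Gaussian) substitution step.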
