Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Quadratic and Semi-Definite Programming

In this paper, we aim to provide a comprehensive analysis of the linear rate of convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a certain error bound condition, we establish the global linear rate of convergence for a more general semi-proximal ADMM with the dual steplength restricted to the open interval $(0, (1+\sqrt{5})/2)$. Our analysis assumes neither strong convexity nor strict complementarity; the only requirement is an error bound condition, which holds automatically for convex composite quadratic programming. This semi-proximal ADMM, which includes the classic ADMM as a special case, not only resolves the potential non-solvability of the subproblems in the classic ADMM but is also capable of handling multi-block convex optimization problems efficiently. We use convex composite quadratic programming and quadratic semi-definite programming as important applications to demonstrate the significance of the obtained results. As a result of independent interest in second-order variational analysis, we provide a complete characterization of the isolated calmness of the nonlinear convex semi-definite optimization problem in terms of its second-order sufficient optimality condition and the strict Robinson constraint qualification, for the purpose of proving the linear rate of convergence of the semi-proximal ADMM when applied to two- and multi-block convex quadratic semi-definite programming.
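
To make the scheme concrete, the sketch below implements a semi-proximal ADMM iteration for a simple two-block convex quadratic program with linear coupling constraints. The problem data, the proximal terms `S` and `T`, the penalty parameter `sigma`, and the particular dual steplength `tau` (chosen inside $(0, (1+\sqrt{5})/2)$) are illustrative assumptions, not values taken from the paper; setting `S = T = 0` recovers the classic ADMM.

```python
import numpy as np

# Minimal sketch of a semi-proximal ADMM for the two-block convex QP
#   minimize 0.5 x'Px + 0.5 y'Qy   subject to   Ax + By = c,
# with proximal terms S, T (positive semidefinite) and dual steplength
# tau in (0, (1 + sqrt(5)) / 2).  All problem data below are illustrative.

rng = np.random.default_rng(0)
n, m, p = 5, 5, 3
P = np.eye(n)                      # f(x) = 0.5 x'Px
Q = np.eye(m)                      # g(y) = 0.5 y'Qy
A = rng.standard_normal((p, n))
B = rng.standard_normal((p, m))
c = rng.standard_normal(p)

sigma = 1.0                        # penalty parameter of the augmented Lagrangian
tau = 1.618                        # dual steplength, inside (0, (1 + sqrt(5)) / 2)
S = 0.1 * np.eye(n)                # semi-proximal terms; S = T = 0 gives the classic ADMM
T = 0.1 * np.eye(m)

x, y, z = np.zeros(n), np.zeros(m), np.zeros(p)
for _ in range(500):
    # x-subproblem: argmin_x L_sigma(x, y_k; z_k) + 0.5 ||x - x_k||_S^2 (closed form here)
    rhs_x = -A.T @ z - sigma * A.T @ (B @ y - c) + S @ x
    x = np.linalg.solve(P + sigma * A.T @ A + S, rhs_x)
    # y-subproblem: argmin_y L_sigma(x_{k+1}, y; z_k) + 0.5 ||y - y_k||_T^2
    rhs_y = -B.T @ z - sigma * B.T @ (A @ x - c) + T @ y
    y = np.linalg.solve(Q + sigma * B.T @ B + T, rhs_y)
    # dual update with steplength tau * sigma
    z = z + tau * sigma * (A @ x + B @ y - c)

print("primal residual:", np.linalg.norm(A @ x + B @ y - c))
```

Because both subproblems are quadratic, each update reduces to a linear solve; for the composite problems treated in the paper, the subproblems would instead involve proximal mappings of the nonsmooth terms.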
