The alternating direction method of multipliers (ADMM) is widely used in a variety of areas; its variants tailored to different application scenarios have also been studied deeply in the literature. Among them, the linearized ADMM has received particularly wide attention because of its efficiency and ease of implementation. To theoretically guarantee convergence of the linearized ADMM, the step size for the linearized subproblems, i.e., the reciprocal of the linearization parameter, must be sufficiently small. On the other hand, small step sizes decelerate convergence numerically. Hence, it is crucial to determine the optimal (largest) value of the step size for which convergence of the linearized ADMM can still be ensured. Such an analysis seems to be lacking in the literature. In this paper, we show how to find this optimal step size for the linearized ADMM and hence propose the optimal linearized ADMM in the convex programming context. Its global convergence and a worst-case convergence rate measured by the iteration complexity are proved as well.
Optimal Linearized Alternating Direction Method of Multipliers for Convex Programming
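To make the role of the linearization parameter concrete, the following is a minimal sketch of a standard linearized ADMM iteration on an illustrative l1-regularized least-squares instance. The problem, the variable names (`tau`, `beta`), and the conventional choice `tau > beta * ||A^T A||` are assumptions for illustration, not the paper's relaxed (optimal) bound; the paper's contribution is precisely to determine how much smaller `tau` can safely be made.

```python
import numpy as np

# Illustrative problem:  min_x  mu*||x||_1 + 0.5*||A x - b||^2,
# reformulated as        min  mu*||x||_1 + 0.5*||z - b||^2  s.t.  A x - z = 0,
# so the x-subproblem couples through A and is natural to linearize.

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_admm(A, b, mu, beta=1.0, tau=None, iters=2000):
    m, n = A.shape
    if tau is None:
        # Conventional sufficient condition: tau > beta * ||A^T A||_2.
        # A smaller tau means a larger step size 1/tau for the linearized
        # subproblem -- the quantity whose optimal value the paper studies.
        tau = 1.01 * beta * np.linalg.norm(A, 2) ** 2
    x, z, lam = np.zeros(n), np.zeros(m), np.zeros(m)
    for _ in range(iters):
        # Linearized x-subproblem: one proximal-gradient step of size 1/tau.
        grad = beta * A.T @ (A @ x - z + lam / beta)
        x = soft_threshold(x - grad / tau, mu / tau)
        # z-subproblem is solved exactly (closed form).
        z = (beta * (A @ x) + lam + b) / (1.0 + beta)
        # Multiplier (dual) update.
        lam = lam + beta * (A @ x - z)
    return x, z, lam
```

With the conventional `tau` the iteration is provably convergent but conservative; the step size `1/tau` is small whenever `||A||` is large, which is exactly the numerical slowdown the abstract refers to.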