The alternating direction method of multipliers (ADMM) is widely used for various convex minimization models with separable structures arising in a variety of areas. In the literature, the proximal version of ADMM, which allows ADMM's subproblems to be proximally regularized, has been well studied. In particular, the linearized version of ADMM is obtained when the proximal terms are chosen appropriately; for some applications, it can reduce an ADMM subproblem to a task as easy as evaluating the proximity operator of a function in the objective. This feature significantly eases the numerical implementation, and it makes the linearized version of ADMM very popular in a broad spectrum of application domains. To ensure the convergence of the proximal version of ADMM, however, existing results conventionally require the positive definiteness of the corresponding proximal matrix. In some cases, this requirement essentially results in small step sizes (i.e., an over-regularization effect) for the subproblems and thus inevitably decelerates the overall convergence of the linearized version of ADMM. In this paper, we investigate the possibility of relaxing the positive definiteness requirement of the proximal version of ADMM and show affirmatively that it is not necessary to ensure the positive definiteness of the proximal matrix. A new linearized ADMM with larger step sizes is thus proposed by choosing a positive-indefinite proximal regularization term. The global convergence of the new linearized ADMM is proved, and its worst-case convergence rate measured by the iteration complexity is also established. Since ADMM can be regarded as a splitting version of the augmented Lagrangian method (ALM), a byproduct of our analysis is a new linearized version of ALM generated by choosing a positive-indefinite proximal regularization term for its subproblems.
Linearized Alternating Direction Method of Multipliers via Positive-Indefinite Proximal Regularization for Convex Programming
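To make the linearization idea concrete, the following is a minimal illustrative sketch, not the paper's exact method: a linearized ALM applied to the basis-pursuit problem min ||x||_1 s.t. Ax = b. Linearizing the augmented quadratic term and adding the proximal term (tau/2)||x - x^k||^2 turns each subproblem into a single soft-thresholding (proximity-operator) step. The conventional positive-definite rule used here is tau > beta * ||A^T A||; the paper's contribution is to show that a positive-indefinite choice with a smaller tau (hence larger steps) still converges. All function names and parameter values below are assumptions chosen for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1 (component-wise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_alm_basis_pursuit(A, b, beta=1.0, n_iter=5000):
    """Illustrative linearized ALM for: min ||x||_1  s.t.  Ax = b.

    Conventional (positive-definite) step-size rule: tau > beta * ||A^T A||.
    """
    m, n = A.shape
    x = np.zeros(n)
    lam = np.zeros(m)                                   # Lagrange multiplier
    tau = 1.01 * beta * np.linalg.norm(A, 2) ** 2       # spectral-norm bound
    for _ in range(n_iter):
        # Gradient of the linear + augmented quadratic terms at x^k.
        grad = A.T @ (lam + beta * (A @ x - b))
        # Linearized subproblem collapses to one prox (shrinkage) step.
        x = soft_threshold(x - grad / tau, 1.0 / tau)
        # Standard multiplier update.
        lam = lam + beta * (A @ x - b)
    return x
```

The positive-indefinite regularization studied in the paper would, roughly speaking, permit replacing the bound on `tau` above by a strictly smaller multiple of `beta * ||A^T A||`, enlarging the effective step size `1/tau` of the shrinkage step.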