Due to its wide applicability and simple implementation, the Alternating Direction Method of Multipliers (ADMM) has been extensively investigated by researchers from different areas. In this paper, we focus on a linearized symmetric ADMM (LSADMM) for solving the multi-block separable convex minimization model. LSADMM partitions the variables into two groups and updates the Lagrange multiplier twice per iteration, in different forms and with suitable stepsizes: the two groups are updated in a Gauss-Seidel fashion, while the variables within each group are updated in a Jacobi fashion. For the second group of variables, linearization and relaxation techniques are used to handle the quadratic terms of the subproblems and to accelerate the algorithm, respectively. Since the proximal operator may be either positive definite or positive indefinite, the range of the proximal parameter involved in the second-group subproblems is partitioned into the union of three different sets. We establish the global convergence and the sublinear ergodic convergence rate of LSADMM for two of these cases, and give a counterexample showing that convergence of LSADMM cannot be guaranteed in the remaining case. Theoretically, we obtain the optimal lower bound of the proximal parameter. Numerical experiments on the Latent Variable Gaussian Graphical Model Selection (LVGGMS) problem are presented to demonstrate the efficiency of the proposed algorithm and the significant advantage of the optimal lower bound of the proximal parameter.
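For concreteness, the following display is a minimal sketch of the problem class and iteration structure described above, under assumed generic notation: the block count $m$, group split $p$, penalty $\beta$, dual stepsizes $r$ and $s$, and proximal parameter $\tau$ are illustrative symbols, not necessarily the paper's own. The model is

\[
\min_{x_1,\dots,x_m}\ \sum_{i=1}^{m}\theta_i(x_i)
\quad\text{s.t.}\quad \sum_{i=1}^{m}A_i x_i = b,
\]

with the blocks split into a first group $(x_1,\dots,x_p)$ and a second group $(x_{p+1},\dots,x_m)$. Writing $\lambda$ for the Lagrange multiplier, one pass of a linearized symmetric ADMM of the kind described above reads

\begin{align*}
& x_i^{k+1} = \arg\min_{x_i}\ \theta_i(x_i) + \frac{\beta}{2}\Big\|A_i x_i + \textstyle\sum_{j\neq i} A_j x_j^{k} - b - \lambda^{k}/\beta\Big\|^2,
  && i = 1,\dots,p \ \text{(parallel)},\\
& \lambda^{k+1/2} = \lambda^{k} - r\beta\Big(\textstyle\sum_{j\le p} A_j x_j^{k+1} + \sum_{j>p} A_j x_j^{k} - b\Big),
  && \text{(first multiplier update)},\\
& x_i^{k+1} = \arg\min_{x_i}\ \theta_i(x_i) + \beta\,\big\langle A_i^{\top} d^{k},\, x_i\big\rangle + \frac{\tau}{2}\,\|x_i - x_i^{k}\|^2,
  && i = p+1,\dots,m \ \text{(parallel, linearized)},\\
& \lambda^{k+1} = \lambda^{k+1/2} - s\beta\Big(\textstyle\sum_{j=1}^{m} A_j x_j^{k+1} - b\Big),
  && \text{(second multiplier update)},
\end{align*}

where $d^{k} = \sum_{j\le p} A_j x_j^{k+1} + \sum_{j>p} A_j x_j^{k} - b - \lambda^{k+1/2}/\beta$. Linearizing the quadratic term of each second-group subproblem at $x_i^{k}$ leaves only the proximal term $\frac{\tau}{2}\|x_i - x_i^{k}\|^2$, so each such subproblem reduces to a proximal step of $\theta_i$ alone, which often admits a closed form; the analysis then hinges on how small $\tau$ may be chosen, which is the optimal lower bound referred to in the abstract.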
Citation
X. Chang, J. Bai and S. Liu, Optimal linearized symmetric ADMM for separable convex programming, 2018.