In various applications such as signal/image processing, data mining, and statistical learning, the multi-block linearly constrained separable convex optimization problem arises frequently: the objective function is the sum of multiple individual convex functions, and the constraints are linear. A classical method for solving this kind of problem is the alternating direction method of multipliers (ADMM). It decomposes the problem into a series of small-scale subproblems, so its per-iteration cost can be very low. However, ADMM was originally designed for the two-block model, and its convergence cannot be guaranteed for the general multi-block model without additional assumptions. Dai et al. proposed a variant of ADMM called the sequential updating scheme of the Lagrange multiplier (SUSLM), in which the multiplier is updated multiple times at each iteration. To establish its convergence, a correction step is imposed at the end of each iteration. In this paper, we improve the SUSLM by introducing two parameters, thereby relaxing the condition on the step sizes. Preliminary experimental results show that the new algorithm converges faster than the original SUSLM, and that it is empirically effective on both synthetic and real problems compared with several efficient ADMM-based algorithms.
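For concreteness, here is a minimal sketch of the problem class and the directly extended ADMM iteration referred to above, under standard notation; the symbols $f_i$, $A_i$, $b$, $\mathcal{X}_i$, $\beta$, and $\lambda$ are not given in the abstract and are introduced only for illustration:

$$
\min_{x_1,\dots,x_m} \ \sum_{i=1}^{m} f_i(x_i)
\quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b, \qquad x_i \in \mathcal{X}_i, \ i = 1,\dots,m,
$$

where each $f_i$ is convex and each $\mathcal{X}_i$ is a closed convex set. With the augmented Lagrangian

$$
\mathcal{L}_\beta(x_1,\dots,x_m,\lambda)
= \sum_{i=1}^{m} f_i(x_i)
- \lambda^{\top}\Big(\sum_{i=1}^{m} A_i x_i - b\Big)
+ \frac{\beta}{2}\Big\|\sum_{i=1}^{m} A_i x_i - b\Big\|^2,
$$

the directly extended ADMM updates the blocks in Gauss–Seidel fashion and the multiplier once per iteration:

$$
x_i^{k+1} = \operatorname*{arg\,min}_{x_i \in \mathcal{X}_i}
\mathcal{L}_\beta\big(x_1^{k+1},\dots,x_{i-1}^{k+1},\, x_i,\, x_{i+1}^{k},\dots,x_m^{k},\, \lambda^k\big),
\quad i = 1,\dots,m,
$$
$$
\lambda^{k+1} = \lambda^k - \beta\Big(\sum_{i=1}^{m} A_i x_i^{k+1} - b\Big).
$$

In the SUSLM scheme summarized above, by contrast, a multiplier update of this form is interleaved after each block update, so $\lambda$ is updated $m$ times per iteration, and a correction step is applied at the end of each iteration; the precise update rules and step-size conditions are given in the paper itself.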
Citation
Tech report, NUFE, Nanjing, 08/19