The direct extension of ADMM for three-block separable convex minimization models is convergent when one function is strongly convex

The alternating direction method of multipliers (ADMM) is a benchmark for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. Meanwhile, it is known that convergence is not guaranteed if the ADMM is directly extended to a multiple-block convex minimization model whose objective function is the sum of more than two functions. Recently, some authors have actively studied strong convexity conditions on the objective function that suffice to ensure the convergence of the direct extension of ADMM, or the convergence of variants in which the original scheme is appropriately twisted. However, these strong convexity conditions still seem too strict to be satisfied by some applications for which the direct extension of ADMM works well; and the twisted schemes are less efficient or convenient to implement than the original scheme of the direct extension of ADMM. We are thus motivated to understand why the original scheme of the direct extension of ADMM works for some applications and under which realistic conditions its convergence can be guaranteed. We answer this question for the three-block case, where there are three separable functions in the objective, and show that when one of them is strongly convex, the direct extension of ADMM is convergent. Note that the strong convexity of one function does hold for many applications. We further estimate the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses for the direct extension of ADMM, and show that its global linear convergence in an asymptotic sense can be guaranteed under some additional conditions.
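To make the scheme under discussion concrete, the following is a minimal sketch of the direct (Gauss-Seidel) extension of ADMM on a hypothetical toy instance, minimizing the sum of three separable quadratics (each strongly convex, so the abstract's condition holds) subject to a single linear coupling constraint. The problem data (`c`, `b`, the penalty `beta`) are illustrative assumptions, not taken from the paper, and each subproblem is solved in closed form.

```python
import numpy as np

def admm_three_block(c, b, beta=1.0, iters=500):
    """Direct extension of ADMM for the illustrative model
        min  sum_i 0.5*(x_i - c_i)^2   s.t.  x_1 + x_2 + x_3 = b.
    Each f_i(x_i) = 0.5*(x_i - c_i)^2 is strongly convex, so the
    strong-convexity condition from the abstract is satisfied.
    """
    x = np.zeros(3)   # primal blocks x_1, x_2, x_3
    lam = 0.0         # Lagrange multiplier for the coupling constraint
    for _ in range(iters):
        # Gauss-Seidel sweep: update each block with the others fixed.
        for i in range(3):
            rest = x.sum() - x[i]
            # Closed-form minimizer of
            #   0.5*(x_i - c_i)^2 + (beta/2)*(x_i + rest - b + lam/beta)^2
            x[i] = (c[i] + beta * (b - rest) - lam) / (1.0 + beta)
        # Multiplier update on the constraint residual.
        lam += beta * (x.sum() - b)
    return x, lam

x, lam = admm_three_block(np.array([1.0, 2.0, -0.5]), b=1.0)
```

For this instance the KKT conditions give `x_i = c_i - lam` with `lam = (sum(c) - b)/3`, so the iterates can be checked against the exact solution; on such a strongly convex quadratic model the iteration converges linearly, consistent with the asymptotic linear-rate result stated in the abstract.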
