A corrected semi-proximal ADMM for multi-block convex optimization and its application to DNN-SDPs

In this paper we propose a corrected semi-proximal ADMM (alternating direction method of multipliers) for general $p$-block $(p\!\ge 3)$ convex optimization problems with linear constraints, aiming to resolve the dilemma that almost all existing modified versions of the directly extended ADMM, although equipped with a convergence guarantee, often perform substantially worse than the directly extended ADMM itself, which has no such guarantee. Specifically, in each iteration we use the multi-block semi-proximal ADMM with step-size at least $1$ as the prediction step to generate a good prediction point, and then apply a correction, as small as possible, to the middle $(p\!-\!2)$ blocks of the prediction point. In particular, the step-size of the multi-block semi-proximal ADMM is determined adaptively from the ratio of the infeasibility produced by the current semi-proximal ADMM step to that produced by the last correction step. For the proposed corrected semi-proximal ADMM, we establish global convergence under a mild assumption and apply the method to the important class of doubly nonnegative semidefinite programming (DNN-SDP) problems with many linear equality and/or inequality constraints. Our extensive numerical tests show that the corrected semi-proximal ADMM is superior to the directly extended ADMM with step-size $\tau=1.618$ and to the multi-block ADMM with Gaussian back substitution \cite{HTY12,HY13}: it requires the fewest iterations for $70\%$ of the test instances, within computing time comparable to that of the directly extended ADMM, and for about $40\%$ of the tested problems its iteration count is only $67\%$ of that of the multi-block ADMM with Gaussian back substitution \cite{HTY12,HY13}.
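For concreteness, the underlying model is the multi-block linearly constrained problem $\min\{\sum_{i=1}^{p}\theta_i(x_i) : \sum_{i=1}^{p}\mathcal{A}_i x_i = b\}$ with each $\theta_i$ closed proper convex (a generic formulation consistent with the abstract; the paper's exact notation may differ). The sketch below is a minimal illustration of the prediction step only: one Gauss-Seidel sweep of the directly extended $3$-block ADMM with zero semi-proximal terms, on a toy quadratic instance where every subproblem reduces to a ridge linear system. The paper's correction of the middle blocks and its adaptive step-size rule, which supply the convergence guarantee, are not reproduced here; the data `A`, `c`, `b` and the parameters `sigma`, `tau` are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 15
A = [rng.standard_normal((m, n)) for _ in range(3)]  # hypothetical constraint maps A_i
c = [rng.standard_normal(n) for _ in range(3)]       # hypothetical targets c_i
b = rng.standard_normal(m)

sigma, tau = 1.0, 1.618  # penalty parameter and dual step-size (illustrative choices)
x = [np.zeros(n) for _ in range(3)]
z = np.zeros(m)          # multiplier for A_1 x_1 + A_2 x_2 + A_3 x_3 = b

for it in range(500):
    # Gauss-Seidel sweep: block i minimizes
    #   0.5*||x_i - c_i||^2 + <z, A_i x_i> + sigma/2 * ||A_i x_i + r||^2,
    # a ridge linear system because the toy objective is quadratic.
    for i in range(3):
        r = sum(A[j] @ x[j] for j in range(3) if j != i) - b
        lhs = np.eye(n) + sigma * A[i].T @ A[i]
        x[i] = np.linalg.solve(lhs, c[i] - A[i].T @ (z + sigma * r))
    resid = sum(A[j] @ x[j] for j in range(3)) - b   # primal infeasibility
    z = z + tau * sigma * resid                      # multiplier (dual) update
    if np.linalg.norm(resid) < 1e-8:
        break

print(f"iterations: {it + 1}, final infeasibility: {np.linalg.norm(resid):.2e}")
```

In the corrected scheme described in the abstract, a sweep of this kind would be followed by a small correction of the middle block ($x_2$ here, for $p=3$), with $\tau$ chosen adaptively from the ratio of the current infeasibility to that of the last correction step. On this strongly convex toy instance the uncorrected iteration typically converges anyway; the abstract's point is that, in general, it lacks a convergence guarantee without the correction.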
