A fundamental proof of the convergence of the alternating direction method of multipliers for weakly convex optimization

The convergence analysis of the alternating direction method of multipliers (ADMM) for convex and nonconvex composite optimization has been well established in the literature. Considering the extensive applications of weakly convex functions in signal processing and machine learning (see, e.g., the special issue \textit{DC-Theory, Algorithms and Applications}, Mathematical Programming, 169(1):1-336, 2018), in this paper we investigate the convergence of ADMM for the strongly and weakly convex composite optimization (SWCCO) problem. Specifically, we first establish the convergence of the iterative sequence generated by SWCCO-ADMM under a mild regularity condition; we then derive an $o(1/k)$ sublinear convergence rate for the SWCCO-ADMM algorithm under the same condition, and a linear convergence rate under an additional gradient Lipschitz continuity assumption on the objective function. The techniques used in the convergence analysis are fundamental, and we obtain global convergence without resorting to the Kurdyka-Łojasiewicz (KL) inequality, which is commonly used, but involved, in convergence proofs for nonconvex ADMM.
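For context, the following is a minimal sketch of the generic two-block ADMM iteration that such an analysis concerns; the splitting $f$ (strongly convex), $g$ ($\rho$-weakly convex), the linear constraint data $A$, $B$, $c$, and the penalty parameter $\beta>0$ are generic notation assumed here, not taken from the abstract. For the problem $\min_{x,z}\, f(x)+g(z)$ subject to $Ax+Bz=c$, ADMM performs
\begin{align*}
x^{k+1} &= \arg\min_{x}\; f(x) + \tfrac{\beta}{2}\,\bigl\|Ax + Bz^{k} - c + y^{k}/\beta\bigr\|^{2},\\
z^{k+1} &= \arg\min_{z}\; g(z) + \tfrac{\beta}{2}\,\bigl\|Ax^{k+1} + Bz - c + y^{k}/\beta\bigr\|^{2},\\
y^{k+1} &= y^{k} + \beta\,\bigl(Ax^{k+1} + Bz^{k+1} - c\bigr),
\end{align*}
where $g$ being $\rho$-weakly convex means that $g + \tfrac{\rho}{2}\|\cdot\|^{2}$ is convex; in the SWCCO setting the strong convexity of one summand compensates for the weak convexity of the other.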
