Linear Convergence Rate of the Generalized Alternating Direction Method of Multipliers for a Class of Convex Optimization Problems

Recently, the generalized alternating direction method of multipliers (GADMM) proposed by Eckstein and Bertsekas has received intensive attention from a broad spectrum of areas. In this paper, we consider the convergence rate of GADMM when applied to convex optimization problems in which the subdifferentials of the underlying functions are piecewise linear multifunctions; this class includes LASSO, a well-known regression model in statistics, as a special case. We first prove some important inequalities for the sequence generated by GADMM for solving such problems. We then establish both the local linear convergence rate and the global linear convergence rate of GADMM for this class of problems. The main results in this paper extend and improve some well-known results in the literature.
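As a brief sketch of the setting (the notation below is ours, not quoted from the paper), the GADMM of Eckstein and Bertsekas applies to problems of the form

\[
\min_{x,z} \; f(x) + g(z) \quad \text{s.t.} \quad Ax + Bz = c,
\]

and, with penalty parameter $\beta > 0$ and relaxation factor $\alpha \in (0,2)$, iterates

\begin{align*}
x^{k+1} &\in \arg\min_x \; f(x) + \tfrac{\beta}{2}\bigl\|Ax + Bz^k - c + \lambda^k/\beta\bigr\|^2, \\
z^{k+1} &\in \arg\min_z \; g(z) + \tfrac{\beta}{2}\bigl\|\alpha Ax^{k+1} - (1-\alpha)(Bz^k - c) + Bz - c + \lambda^k/\beta\bigr\|^2, \\
\lambda^{k+1} &= \lambda^k + \beta\bigl(\alpha Ax^{k+1} - (1-\alpha)(Bz^k - c) + Bz^{k+1} - c\bigr),
\end{align*}

which reduces to the classical ADMM when $\alpha = 1$. LASSO fits this template with $f(x) = \tfrac{1}{2}\|Dx - d\|^2$, $g(z) = \mu\|z\|_1$, $A = I$, $B = -I$, $c = 0$; both subdifferentials are then piecewise linear (polyhedral) multifunctions.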
