The alternating direction method of multipliers (ADMM) has been well studied and effectively used in various application fields. The classical ADMM must solve two subproblems exactly at each iteration. To overcome the difficulty of computing exact solutions of these subproblems, proximal terms are often added to the subproblems. Recently, Gu and Yamashita studied a special proximal ADMM whose regularization matrix in the proximal term is generated by the BFGS update (or the limited-memory BFGS update) at every iteration for a structured quadratic optimization problem, and reported that the number of iterations was almost the same as that of the exact ADMM in their numerical experiments. In this paper, we propose such a proximal ADMM for more general convex optimization problems and extend the proximal term to the Broyden family of updates. We also show convergence of the proposed method under standard assumptions.
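As a rough illustration (the notation below is assumed, not taken from the paper), a proximal ADMM for a problem of the form min f(x) + g(z) subject to Ax + Bz = c replaces the exact x-subproblem of the classical ADMM by one regularized with a matrix P_k:

\begin{align*}
  x^{k+1} &\in \operatorname*{argmin}_{x}\;
      f(x) + \frac{\rho}{2}\Bigl\|Ax + Bz^{k} - c + \tfrac{1}{\rho}y^{k}\Bigr\|^{2}
      + \frac{1}{2}\bigl\|x - x^{k}\bigr\|^{2}_{P_k}, \\
  z^{k+1} &\in \operatorname*{argmin}_{z}\;
      g(z) + \frac{\rho}{2}\Bigl\|Ax^{k+1} + Bz - c + \tfrac{1}{\rho}y^{k}\Bigr\|^{2}, \\
  y^{k+1} &= y^{k} + \rho\,\bigl(Ax^{k+1} + Bz^{k+1} - c\bigr),
\end{align*}

where \(\rho > 0\) is the penalty parameter and \(\|v\|^{2}_{P} = v^{\top} P v\). In the approach described above, P_k would be generated at each iteration by the BFGS update or, more generally, by a member of the Broyden family, so that the regularized x-subproblem becomes easier to solve than the exact one.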
Citation: Graduate School of Informatics, Kyoto University, April 2019
A proximal ADMM with the Broyden family for Convex Optimization Problems