Acceleration of Primal-Dual Methods by Preconditioning and Simple Subproblem Procedures

The Primal-Dual Hybrid Gradient (PDHG) method and the Alternating Direction Method of Multipliers (ADMM) are two widely used first-order optimization methods. They reduce a difficult problem to simple subproblems, so they are easy to implement and have many applications. As first-order methods, however, they are sensitive to problem conditioning and can struggle to reach high accuracy. To improve their performance, researchers have proposed techniques such as diagonal preconditioning and inexact subproblem solves. This paper realizes an additional speedup of about one order of magnitude. Specifically, we choose non-diagonal preconditioners that are much more effective than diagonal ones. Doing so sacrifices closed-form solutions to some subproblems, but we replace them with simple procedures, such as a few proximal-gradient iterations or a few epochs of proximal block-coordinate descent, whose individual steps do have closed forms. We establish global convergence even when the number of these inner steps is fixed at every outer iteration, which makes the method reliable and straightforward to use. Our approach broadens the choice of preconditioners while maintaining both low per-iteration cost and global convergence. Consequently, on several typical applications of primal-dual first-order methods, we obtain speedups of 4–95× over the existing state of the art.
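To make the idea concrete, here is a minimal Python sketch of preconditioned PDHG on a LASSO-type problem, min_x lam*||x||_1 + 0.5*||Ax - b||^2. With a non-diagonal primal preconditioner T, the x-subproblem becomes an l1 prox in the T^{-1}-norm and loses its closed form, so the sketch approximates it with a fixed number of proximal-gradient inner steps, each of which is in closed form. This is not the authors' implementation; the example problem, the function name preconditioned_pdhg_lasso, and parameters such as T, sigma, and n_inner are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form prox of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def preconditioned_pdhg_lasso(A, b, lam, T, sigma, n_outer=300, n_inner=5):
    """Sketch: preconditioned PDHG for min_x lam*||x||_1 + 0.5*||Ax - b||^2.

    T is a (possibly non-diagonal) SPD primal preconditioner. With a
    non-diagonal T, the x-subproblem
        min_z lam*||z||_1 + <A^T y, z> + 0.5*||z - x_old||_{T^{-1}}^2
    has no closed form, so it is approximated by a fixed number n_inner of
    proximal-gradient steps, each of which is a closed-form soft-threshold.
    """
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    T_inv = np.linalg.inv(T)
    step = 1.0 / np.linalg.norm(T_inv, 2)  # 1/L for the smooth subproblem part
    for _ in range(n_outer):
        x_old = x
        # Inexact x-update: a few prox-grad steps, warm-started at x_old.
        g_lin = A.T @ y
        z = x_old.copy()
        for _ in range(n_inner):
            grad = g_lin + T_inv @ (z - x_old)
            z = soft_threshold(z - step * grad, step * lam)
        x = z
        # Exact y-update: prox of sigma*g* with g(w) = 0.5*||w - b||^2,
        # i.e. g*(y) = 0.5*||y||^2 + <b, y>.
        y = (y + sigma * (A @ (2 * x - x_old)) - sigma * b) / (1.0 + sigma)
    return x

# Example usage (illustrative choices, not from the paper):
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
# A non-diagonal preconditioner adapted to the data; with this T and
# sigma = 1, the standard PDHG step-size condition sigma*||A T^{1/2}||^2 <= 1
# holds, since A T A^T has all eigenvalues below 1.
T = np.linalg.inv(A.T @ A + 0.1 * np.eye(100))
x = preconditioned_pdhg_lasso(A, b, lam=0.1, T=T, sigma=1.0)
```

Setting n_inner to a small fixed value (here 5) mirrors the paper's setup of fixing the number of inner steps in every outer iteration; the precise preconditioner and step-size choices above are assumptions made for this sketch.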

Citation

Liu, Yanli, Yunbei Xu, and Wotao Yin. "Acceleration of Primal-Dual Methods by Preconditioning and Simple Subproblem Procedures." arXiv preprint arXiv:1811.08937 (2018).
