The primal-dual hybrid gradient (PDHG) algorithm, which is essentially the classical Arrow-Hurwicz method, has been widely used in image processing. However, the convergence of PDHG has been established in the literature only under rather restrictive conditions, and it is still missing for the case without extra conditions. In this paper, from the perspective of variational inequalities, we show that the original PDHG provides descent directions toward the solution set. This fact naturally relates PDHG to line search methods in optimization. Furthermore, inspired by Newton's method, we present a reversible PDHG for solving saddle point problems. Compared with the original PDHG, each iteration of the new method requires only an additional simple correction step. Moreover, we establish its global convergence and a worst-case $\mathcal{O}(1/t)$ convergence rate. Extensive numerical experiments, including several image inpainting examples, demonstrate the efficiency of the new method.
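For context, a minimal sketch of the classical PDHG (Arrow-Hurwicz) iteration for a generic saddle point problem is recalled below; the notation ($\theta_1$, $\theta_2$, $A$, step sizes $r$, $s$) is ours for illustration and may differ from the paper's. For the problem $\min_{x}\max_{y}\; \theta_1(x) + y^{\top} A x - \theta_2(y)$ with parameters $r, s > 0$, the iteration reads
\begin{align*}
  x^{k+1} &= \arg\min_{x}\Big\{ \theta_1(x) + (y^{k})^{\top} A x + \tfrac{r}{2}\,\|x - x^{k}\|^{2} \Big\}, \\
  y^{k+1} &= \arg\max_{y}\Big\{ -\theta_2(y) + y^{\top} A x^{k+1} - \tfrac{s}{2}\,\|y - y^{k}\|^{2} \Big\}.
\end{align*}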