In this paper, we focus on the convergence analysis of the Peaceman-Rachford splitting method applied to a convex minimization model whose objective function is the sum of a smooth and a nonsmooth convex function. A sublinear convergence rate, in terms of the worst-case O(1/t) iteration complexity, is established when the gradient of the smooth function is assumed to be Lipschitz continuous; a linear convergence rate is derived when this smooth function is further assumed to be strongly convex. A key technique for obtaining these rates is to measure the accuracy of the iterates by the primal-dual gap rather than by the objective function value. We also propose a modified Peaceman-Rachford splitting method for this model that does not require knowledge of the Lipschitz constant, and we conduct a convergence analysis for this modified method.
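To illustrate the setting, the sketch below applies a generic Peaceman-Rachford splitting iteration, z ← (2 prox_{γg} − I)(2 prox_{γf} − I) z, to a small instance min_x ½‖x − a‖² + λ‖x‖₁, where the smooth part f(x) = ½‖x − a‖² is strongly convex with a Lipschitz-continuous gradient and g(x) = λ‖x‖₁ is the nonsmooth part. This is a minimal, hypothetical example of the problem class studied in the paper, not the paper's own algorithmic variant; the step size γ and the proximal formulas are standard textbook choices.

```python
import numpy as np

def prox_quad(z, a, gamma):
    # Proximal map of f(x) = 0.5*||x - a||^2:
    # argmin_x 0.5*||x - a||^2 + (1/(2*gamma))*||x - z||^2
    return (gamma * a + z) / (gamma + 1.0)

def prox_l1(z, lam, gamma):
    # Proximal map of g(x) = lam*||x||_1 (soft-thresholding)
    return np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)

def peaceman_rachford(z, a, lam, gamma=1.0, iters=200):
    # Fixed-point iteration z <- (2*prox_g - I)(2*prox_f - I) z
    for _ in range(iters):
        x = prox_quad(z, a, gamma)           # prox step on the smooth part f
        y = prox_l1(2 * x - z, lam, gamma)   # prox step on the nonsmooth part g
        z = z + 2.0 * (y - x)                # reflected update
    return prox_quad(z, a, gamma)            # recover the primal iterate

a = np.array([3.0, -0.2, 1.5])
x_star = peaceman_rachford(np.zeros(3), a, lam=1.0)
# The minimizer of 0.5*||x - a||^2 + ||x||_1 is the soft-threshold of a.
print(x_star)  # -> [2.0, 0.0, 0.5]
```

Because f here is strongly convex, the iteration converges at a linear rate, consistent with the regime analyzed in the paper.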
Convergence analysis of the Peaceman-Rachford splitting method for nonsmooth convex optimization