He and Yuan's prediction-correction framework [SIAM J. Numer. Anal. 50: 700-709, 2012] provides convergent algorithms for solving separable convex optimization problems at a rate of $O(1/t)$ (where $t$ denotes the number of iterations) in both the ergodic sense (the average of the iterates) and the pointwise sense. This paper presents a faster prediction-correction framework with a rate of $O(1/t)$ in the non-ergodic sense (the last iterate) and $O(1/t^2)$ in the pointwise sense. Based on this faster prediction-correction framework, we give three faster algorithms, each enjoying an $O(1/t)$ rate for the primal-dual gap in the non-ergodic sense and an $O(1/t^2)$ rate in the pointwise sense. The first algorithm updates the dual variable twice when solving two-block separable convex optimization problems with linear equality constraints. The second algorithm solves multi-block separable convex optimization problems with linear equality constraints in a Gauss-Seidel fashion. The third algorithm solves min-max problems with larger step sizes.
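For concreteness, a minimal sketch of the two-block problem class referred to above is given below; the symbols $f$, $g$, $A$, $B$, $b$, and $\lambda$ are generic placeholders for illustration, not the paper's own notation:
$$\min_{x\in\mathcal{X},\,y\in\mathcal{Y}}\; f(x)+g(y) \quad \text{s.t.}\quad Ax+By=b,$$
whose saddle-point (min-max) reformulation via the Lagrange multiplier $\lambda$ is
$$\min_{x,\,y}\,\max_{\lambda}\; \mathcal{L}(x,y,\lambda)=f(x)+g(y)-\lambda^{\top}(Ax+By-b).$$
The primal-dual gap measured at an iterate of such a saddle-point reformulation is the quantity for which the $O(1/t)$ non-ergodic rate is stated.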