Motivated by the prediction-correction framework of He and Yuan [SIAM J. Numer. Anal. 50: 700-709, 2012], we propose a unified prediction-correction framework to accelerate Lagrangian-based methods. More precisely, for strongly convex optimization, an $O(1/k^2)$ convergence rate in the ergodic sense can be achieved by the general linearized Lagrangian method with an indefinite proximal term; the alternating direction method of multipliers (ADMM) with a step size for the Lagrangian multiplier not larger than $(1+\sqrt{5})/2$ (or $2$ when the objective of the composite convex optimization is the sum of a strongly convex function and a linear function); the linearized ADMM with an indefinite proximal term; the symmetric ADMM; and multi-block ADMM-type methods (assuming the gradient of one block is Lipschitz continuous). Non-ergodic convergence rates are also established.
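For readers unfamiliar with the baseline scheme being accelerated, the following is a minimal sketch of classical (scaled-form) ADMM on a toy strongly convex instance; the problem data `a`, `b` and the penalty `rho` are illustrative choices, not quantities from the paper, and no acceleration or prediction-correction step is included.

```python
# Scaled-form ADMM sketch for the toy problem
#   min_{x,z} 0.5*(x - a)^2 + 0.5*(z - b)^2   s.t.   x - z = 0,
# whose optimum is x = z = (a + b) / 2.
def admm(a, b, rho=1.0, iters=200):
    x = z = u = 0.0  # u is the scaled dual variable (multiplier / rho)
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)  # x-update: prox of 0.5*(x-a)^2
        z = (b + rho * (x + u)) / (1.0 + rho)  # z-update: prox of 0.5*(z-b)^2
        u = u + x - z                          # multiplier (dual) update
    return x, z

x, z = admm(1.0, 3.0)
print(x, z)  # both approach (a + b) / 2 = 2.0
```

The accelerated variants discussed in the abstract modify this basic loop (e.g. by linearization, indefinite proximal terms, or symmetric dual updates) while retaining the same primal-dual structure.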


Faster Lagrangian-based methods: a unified prediction-correction framework