Improved Analysis of Restarted Accelerated Gradient and Augmented Lagrangian Methods via Inexact Proximal Point Frameworks

This paper studies a class of double-loop (inner-outer) algorithms for convex composite optimization. For unconstrained problems, we develop a restarted accelerated composite gradient method that attains the optimal first-order complexity in both the convex and strongly convex settings. For linearly constrained problems, we introduce inexact augmented Lagrangian methods, including a basic method and an outer-accelerated …
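For context, the inexact proximal point outer loop underlying such double-loop frameworks can be sketched as follows; the notation here is illustrative and not taken from the paper:

```latex
% Inexact proximal point outer iteration (illustrative notation):
% each outer step approximately solves a regularized subproblem
\[
  x_{k+1} \approx \operatorname*{argmin}_{x}
    \Big\{ f(x) + \tfrac{1}{2\lambda_k}\,\|x - x_k\|^2 \Big\},
\]
% where the inner solver (e.g., a restarted accelerated gradient method)
% is run until a prescribed inexactness criterion is met, such as
\[
  \operatorname{dist}\!\Big(0,\; \partial f(x_{k+1})
    + \tfrac{1}{\lambda_k}\,(x_{k+1} - x_k)\Big) \le \varepsilon_k .
\]
```

The overall complexity then depends on how cheaply the inner solver can satisfy the tolerance $\varepsilon_k$ at each outer step.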

ADMM and inexact ALM: the QP case

Embedding randomization procedures in the Alternating Direction Method of Multipliers (ADMM) has recently attracted increasing interest as a remedy to the fact that the direct $n$-block generalization of ADMM is not necessarily convergent ($n \geq 3$). Even though, in practice, the introduction of such techniques can mitigate the divergent behaviour of the …
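For reference, the direct $n$-block ADMM iteration mentioned above (shown here for $n = 3$, with illustrative notation) applies sequential minimization of the augmented Lagrangian for $\min \sum_i f_i(x_i)$ subject to $\sum_i A_i x_i = b$, followed by a dual update:

```latex
% Augmented Lagrangian with penalty parameter \beta > 0 (illustrative):
\[
  \mathcal{L}_\beta(x_1,x_2,x_3;\lambda)
    = \sum_{i=1}^{3} f_i(x_i)
      - \lambda^\top\!\Big(\sum_{i=1}^{3} A_i x_i - b\Big)
      + \tfrac{\beta}{2}\Big\|\sum_{i=1}^{3} A_i x_i - b\Big\|^2 .
\]
% Direct 3-block ADMM: cyclic block minimization, then a dual step.
\begin{align*}
  x_1^{k+1} &= \operatorname*{argmin}_{x_1}
      \mathcal{L}_\beta(x_1,\, x_2^{k},\, x_3^{k};\, \lambda^k), \\
  x_2^{k+1} &= \operatorname*{argmin}_{x_2}
      \mathcal{L}_\beta(x_1^{k+1},\, x_2,\, x_3^{k};\, \lambda^k), \\
  x_3^{k+1} &= \operatorname*{argmin}_{x_3}
      \mathcal{L}_\beta(x_1^{k+1},\, x_2^{k+1},\, x_3;\, \lambda^k), \\
  \lambda^{k+1} &= \lambda^k - \beta\Big(\sum_{i=1}^{3} A_i x_i^{k+1} - b\Big).
\end{align*}
```

It is this fixed cyclic update order that can fail to converge for $n \geq 3$; the randomization procedures the abstract refers to typically permute or sample the block update order at each iteration.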