A double-accelerated proximal augmented Lagrangian method with applications in signal reconstruction

The Augmented Lagrangian Method (ALM), proposed in 1969, remains a vital tool in large-scale optimization. In this paper, we consider a linearly constrained composite convex minimization problem and propose a general proximal ALM that not only enjoys a general Nesterov acceleration and the popular relaxed acceleration, but can also exploit an indefinite proximal term. Under proper assumptions, and without requiring prior knowledge of the strong convexity modulus of the objective function, we show that the proposed method is globally convergent and attains an $\mathcal{O}(1/k^2)$ nonergodic convergence rate in terms of the Lagrangian residual, the objective gap, and the constraint violation, where $k$ denotes the iteration number. The performance of our method is verified against several well-established methods on large-scale sparse signal reconstruction problems.
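To fix ideas, the prototypical sparse-reconstruction instance of the linearly constrained problem above is basis pursuit, $\min \|x\|_1$ s.t. $Ax = b$. The sketch below implements a plain (unaccelerated) linearized proximal ALM for this instance as a baseline; it is not the double-accelerated scheme of the paper, and the step-size rule $\tau \ge \beta\|A\|^2$ and parameter choices are standard assumptions rather than taken from this work.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_proximal_alm(A, b, beta=1.0, iters=2000):
    """Baseline linearized proximal ALM for min ||x||_1 s.t. Ax = b.

    A hypothetical, unaccelerated sketch: the augmented quadratic is
    linearized at the current iterate, so the primal update reduces to
    one soft-thresholding step.
    """
    m, n = A.shape
    x = np.zeros(n)
    lam = np.zeros(m)
    # Proximal weight; tau >= beta * ||A||^2 is the usual convergence condition.
    tau = beta * np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        # Gradient of the augmented Lagrangian's smooth part at x.
        grad = A.T @ (lam + beta * (A @ x - b))
        # Linearized primal step: prox of (1/tau)*||.||_1.
        x = soft_threshold(x - grad / tau, 1.0 / tau)
        # Dual ascent on the constraint residual.
        lam = lam + beta * (A @ x - b)
    return x
```

On a small random instance with a sparse ground truth, the constraint violation $\|Ax - b\|$ shrinks steadily with the iteration count, which is the quantity whose nonergodic decay rate the paper improves to $\mathcal{O}(1/k^2)$.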
