A double-accelerated proximal augmented Lagrangian method with applications in signal reconstruction

The Augmented Lagrangian Method (ALM), first proposed in 1969, remains a vital framework for large-scale constrained optimization. This paper addresses a linearly constrained composite convex minimization problem and presents a general proximal ALM that incorporates both Nesterov acceleration and relaxed acceleration, while also allowing indefinite proximal terms. Under mild assumptions (in particular, without requiring prior knowledge of the objective function’s strong convexity modulus), we establish global convergence and derive an $\mathcal{O}(1/k^2)$ nonergodic convergence rate for the Lagrangian residual, the objective gap, and the constraint violation, where $k$ denotes the iteration number. Numerical experiments on large-scale sparse signal reconstruction tasks demonstrate the method’s superior performance against several well-established baselines.
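To make the setting concrete, the sketch below applies a linearized proximal ALM with a fixed inertial extrapolation step to the basis-pursuit model $\min_x \|x\|_1$ s.t. $Ax = b$, a standard sparse signal reconstruction formulation. This is an illustrative sketch only: the update rules, the adaptive Nesterov parameter schedule, and the indefinite proximal terms of the paper's actual method differ, and the fixed extrapolation weight `theta` and step size `tau` here are assumptions chosen for stability.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_proximal_alm(A, b, rho=1.0, theta=0.3, iters=4000):
    """Linearized proximal ALM with a fixed inertial extrapolation for
    min ||x||_1  s.t.  Ax = b.
    Illustrative sketch; not the accelerated scheme proposed in the paper."""
    m, n = A.shape
    # Step size for the linearized (proximal-gradient) primal step:
    # 1 / Lipschitz constant of the smooth part of the augmented Lagrangian.
    tau = 1.0 / (rho * np.linalg.norm(A, 2) ** 2)
    x = np.zeros(n)
    x_bar = x.copy()          # extrapolated primal point
    lam = np.zeros(m)         # Lagrange multiplier
    for _ in range(iters):
        # Primal step: proximal-gradient update on the augmented Lagrangian,
        # linearized at the extrapolated point x_bar.
        grad = A.T @ (lam + rho * (A @ x_bar - b))
        x_new = soft_threshold(x_bar - tau * grad, tau)
        # Dual ascent on the multiplier.
        lam = lam + rho * (A @ x_new - b)
        # Fixed-weight extrapolation (the paper adapts this per iteration).
        x_bar = x_new + theta * (x_new - x)
        x = x_new
    return x
```

A typical usage is to draw a random sensing matrix `A`, synthesize `b` from a sparse ground-truth signal, and check that the constraint residual `||Ax - b||` shrinks as the iterates approach a minimum-`l1`-norm solution.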
