Lipschitz-free Projected Subgradient Method with Time-varying Step-size

We introduce a novel time-varying step-size for the classical projected subgradient method that achieves the optimal ergodic convergence rate. Importantly, this approach does not require the objective function to be Lipschitz continuous, thereby extending the convergence theory of the projected subgradient method to the non-Lipschitz case.
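To make the setting concrete, here is a minimal sketch of the projected subgradient iteration with a time-varying step-size and its ergodic (step-size weighted) average. The paper's novel step-size is not reproduced here; the classical diminishing choice eta_k = 1/sqrt(k+1) stands in for it, and the test problem (an l1 loss over the unit ball) is purely illustrative.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, steps, n_iters=1000):
    """Projected subgradient method with a time-varying step-size:

        x_{k+1} = P_C(x_k - eta_k * g_k),  g_k in the subdifferential of f at x_k.

    Returns the ergodic (step-size weighted) average of the iterates,
    the quantity for which ergodic convergence rates are stated.
    """
    x = x0.copy()
    weighted_sum = np.zeros_like(x0)
    total_weight = 0.0
    for k in range(n_iters):
        eta = steps(k)                  # time-varying step-size eta_k
        g = subgrad(x)                  # any subgradient at the current point
        x = project(x - eta * g)        # projected subgradient step
        weighted_sum += eta * x
        total_weight += eta
    return weighted_sum / total_weight  # ergodic average of the iterates

# Illustrative example: minimize f(x) = ||Ax - b||_1 over the unit Euclidean ball.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
subgrad = lambda x: A.T @ np.sign(A @ x - b)          # an element of the subdifferential
project = lambda x: x / max(1.0, np.linalg.norm(x))   # projection onto the unit ball
x_bar = projected_subgradient(subgrad, project, np.zeros(5),
                              steps=lambda k: 1.0 / np.sqrt(k + 1))
```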

Convergence Rate of Projected Subgradient Method with Time-varying Step-sizes

We establish the optimal ergodic convergence rate for the classical projected subgradient method with time-varying step-sizes. The rate is unchanged even when the weights are slightly tilted toward the most recent iterates, thereby relaxing the ergodic sense.
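The relaxed ergodic sense can be illustrated by a weighted average that mildly favors later iterates. The polynomial weights w_k = (k+1)^p below are an illustrative choice, not the weighting scheme analyzed in the paper.

```python
import numpy as np

def weighted_ergodic_average(iterates, power=1.0):
    """Ergodic average of a sequence of iterates with weights w_k = (k+1)^power.

    power = 0 recovers the plain (uniform) ergodic average; power > 0 puts
    slightly more weight on the most recent points, in the spirit of the
    relaxed ergodic sense described above.
    """
    weights = np.array([(k + 1) ** power for k in range(len(iterates))])
    return np.average(np.asarray(iterates), axis=0, weights=weights)

# Usage: average a toy sequence of iterates, emphasizing recent ones.
iterates = [np.array([1.0 / (k + 1), 0.0]) for k in range(100)]
x_bar = weighted_ergodic_average(iterates, power=1.0)
```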

Faster Lagrangian-based methods: a unified prediction-correction framework

Motivated by the prediction-correction framework constructed by He and Yuan [SIAM J. Numer. Anal. 50: 700-709, 2012], we propose a unified prediction-correction framework to accelerate Lagrangian-based methods. More precisely, for strongly convex optimization, the general linearized Lagrangian method with an indefinite proximal term, the alternating direction method of multipliers (ADMM) with the step size of the Lagrange multiplier not …
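For orientation, the sketch below shows standard two-block ADMM (in scaled dual form) applied to a lasso-type problem, with the Lagrange multiplier step size exposed as a parameter tau. It is a baseline illustration of where tau enters the method, not the paper's prediction-correction framework, which is not reproduced here.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, tau=1.0, n_iters=200):
    """Two-block ADMM for  min 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z.

    tau is the step size of the Lagrange multiplier update; accelerated
    Lagrangian-based methods study choices of tau beyond the standard tau = 1.
    """
    n = A.shape[1]
    x = z = u = np.zeros(n)
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))  # cached factor for every x-update
    Atb = A.T @ b
    for _ in range(n_iters):
        x = M @ (Atb + rho * (z - u))             # x-subproblem: regularized least squares
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-thresholding
        u = u + tau * (x - z)                     # multiplier update with step size tau
    return z

# Usage on random data: a sparse solution emerges for moderate lam.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 10)), rng.standard_normal(50)
z = admm_lasso(A, b, lam=1.0)
```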