Convergence Rate of Projected Subgradient Method with Time-varying Step-sizes

We establish the optimal ergodic convergence rate for the classical projected subgradient method with time-varying step-sizes. This convergence rate remains the same even if we slightly increase the weight of the most recent points, thereby relaxing the ergodic sense.
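
As a point of reference, the sketch below implements a generic projected subgradient iteration with diminishing step sizes and returns a step-size-weighted ergodic average; the toy problem, function names, and the 1/sqrt(k) step-size rule are illustrative choices, not taken from the paper.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, num_iters=1000):
    """Projected subgradient method with time-varying (diminishing) step sizes.

    Returns the last iterate and the step-size-weighted ergodic average,
    which attains the O(1/sqrt(K)) rate for convex Lipschitz objectives.
    """
    x = np.asarray(x0, dtype=float)
    weighted_sum = np.zeros_like(x)
    weight_total = 0.0
    for k in range(num_iters):
        t_k = 1.0 / np.sqrt(k + 1)            # time-varying step size
        x = project(x - t_k * subgrad(x))     # subgradient step, then projection
        weighted_sum += t_k * x               # accumulate the ergodic average
        weight_total += t_k
    return x, weighted_sum / weight_total

# Toy example: minimize ||x - c||_1 over the Euclidean unit ball.
c = np.array([2.0, -1.0, 0.5])
subgrad = lambda x: np.sign(x - c)                   # a subgradient of ||x - c||_1
project = lambda x: x / max(1.0, np.linalg.norm(x))  # projection onto the unit ball
x_last, x_avg = projected_subgradient(subgrad, project, np.zeros(3))
print(x_avg)
```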

On Glowinski’s Open Question of Alternating Direction Method of Multipliers

The alternating direction method of multipliers (ADMM) was proposed by Glowinski and Marrocco in 1975, and it has since been widely used across a broad spectrum of areas, especially in sparsity-driven application domains. In 1982, Fortin and Glowinski suggested enlarging the range of the step size for updating the dual variable in ADMM from …
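
For context, here is a minimal ADMM sketch on a lasso-type problem in which the dual variable is updated with a step-size factor `gamma`; the classical Fortin-Glowinski analysis allows `gamma` in (0, (1+sqrt(5))/2). The problem instance, parameter names, and default values are illustrative assumptions, not the paper's setting.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (element-wise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def admm_lasso(A, b, lam, beta=1.0, gamma=1.6, num_iters=200):
    """ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0.

    `gamma` is the step size used in the dual update; values in
    (0, (1 + sqrt(5)) / 2) are covered by the Fortin-Glowinski result.
    """
    m, n = A.shape
    x, z, y = np.zeros(n), np.zeros(n), np.zeros(n)
    M = A.T @ A + beta * np.eye(n)
    Atb = A.T @ b
    for _ in range(num_iters):
        x = np.linalg.solve(M, Atb + beta * z - y)    # x-subproblem
        z = soft_threshold(x + y / beta, lam / beta)  # z-subproblem
        y = y + gamma * beta * (x - z)                # dual update with step size gamma
    return z

# Example usage on random data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
print(admm_lasso(A, b, lam=0.1))
```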

Barzilai-Borwein Step Size for Stochastic Gradient Descent

One of the major issues in stochastic gradient descent (SGD) methods is how to choose an appropriate step size while running the algorithm. Since traditional line search techniques do not apply to stochastic optimization algorithms, the common practice in SGD is either to use a diminishing step size or to tune a fixed step …
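
The sketch below illustrates one way a Barzilai-Borwein-type ratio can supply the step size in SGD, recomputing it once per epoch from differences of per-epoch averaged iterates and averaged stochastic gradients; the update rule, safeguards, and names here are rough assumptions in the spirit of such schemes, not the precise method proposed in the paper.

```python
import numpy as np

def sgd_bb(grad_batch, x0, n_samples, epochs=20, batch_size=32, eta0=0.05, seed=0):
    """SGD whose step size is reset once per epoch by a BB-type ratio.

    After each epoch, form s (difference of per-epoch average iterates) and
    y (difference of per-epoch average stochastic gradients), and set
    eta = ||s||^2 / (m * |s^T y|), a BB1-type ratio scaled by the number
    of batches m.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    eta = eta0
    prev_avg_x = prev_avg_g = None
    for _ in range(epochs):
        perm = rng.permutation(n_samples)
        sum_x, sum_g, n_batches = np.zeros_like(x), np.zeros_like(x), 0
        for start in range(0, n_samples, batch_size):
            idx = perm[start:start + batch_size]
            g = grad_batch(x, idx)
            x = x - eta * g                    # plain SGD step with current eta
            sum_x += x
            sum_g += g
            n_batches += 1
        avg_x, avg_g = sum_x / n_batches, sum_g / n_batches
        if prev_avg_x is not None:
            s, y = avg_x - prev_avg_x, avg_g - prev_avg_g
            denom = abs(s @ y)
            if denom > 1e-12:
                eta = (s @ s) / (n_batches * denom)   # BB-type step size for next epoch
        prev_avg_x, prev_avg_g = avg_x, avg_g
    return x

# Toy least-squares example: minibatch gradient of 0.5*||A x - b||^2.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 10))
b = A @ np.ones(10) + 0.01 * rng.standard_normal(500)
grad_batch = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
print(sgd_bb(grad_batch, np.zeros(10), n_samples=500))
```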

On the Step Size of Symmetric Alternating Directions Method of Multipliers

The alternating direction method of multipliers (ADMM) is an application of the Douglas-Rachford splitting method, and the symmetric version of ADMM, which updates the Lagrange multiplier twice at each iteration, is an application of the Peaceman-Rachford splitting method. The symmetric ADMM sometimes works well empirically, but its convergence is not guaranteed in theory. It was recently found …
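
For illustration, a minimal symmetric ADMM sketch follows, with the Lagrange multiplier updated twice per iteration using step-size factors `r` and `s` (r = s = 1 corresponds to the Peaceman-Rachford splitting, and shrinking the factors is one way to recover guaranteed convergence); the proximal-form problem and the particular default factors are illustrative assumptions, not results from the paper.

```python
import numpy as np

def symmetric_admm(prox_f, prox_g, z0, beta=1.0, r=0.9, s=0.9, num_iters=200):
    """Symmetric ADMM for min f(x) + g(z)  s.t.  x - z = 0.

    The multiplier y is updated twice per iteration with step-size
    factors r and s; r = s = 1 gives the Peaceman-Rachford splitting.
    """
    z = np.asarray(z0, dtype=float)
    y = np.zeros_like(z)
    for _ in range(num_iters):
        x = prox_f(z - y / beta, 1.0 / beta)   # x-subproblem
        y = y + r * beta * (x - z)             # first multiplier update
        z = prox_g(x + y / beta, 1.0 / beta)   # z-subproblem
        y = y + s * beta * (x - z)             # second multiplier update
    return x, z, y

# Example: f(x) = 0.5*||x - c||^2, g(z) = ||z||_1
c = np.array([3.0, -2.0, 0.5])
prox_f = lambda v, t: (v + t * c) / (1.0 + t)                      # prox of t*f
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*||.||_1
print(symmetric_admm(prox_f, prox_g, np.zeros(3)))
```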