Block Coordinate Proximal Gradient Method for Nonconvex Optimization Problems: Convergence Analysis

We propose a block coordinate proximal gradient method for a composite minimization problem whose objective is the sum of two nonconvex functions, only one of which is assumed to be differentiable. Under per-block Lipschitz-like conditions based on Bregman distances, but without global Lipschitz continuity of the gradient of the differentiable function, we prove …
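The abstract is truncated here; as a rough illustration of this family of methods (not the paper's algorithm), below is a minimal sketch of a cyclic block coordinate proximal gradient iteration on a LASSO-type model problem. It uses the ordinary Euclidean proximal step with per-block step sizes rather than the paper's Bregman-distance machinery, and the problem instance, block partition, and soft-thresholding prox are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1, used as the per-block prox here."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_coordinate_prox_grad(A, b, lam, block_sizes, n_iters=200):
    """Cyclic block coordinate proximal gradient on the model problem
        min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    updating one block of coordinates at a time with a per-block step size."""
    n = A.shape[1]
    x = np.zeros(n)
    # Partition the coordinates into consecutive blocks.
    blocks, start = [], 0
    for size in block_sizes:
        blocks.append(slice(start, start + size))
        start += size
    # Per-block step size = 1 / Lipschitz constant of that block's gradient.
    steps = [1.0 / np.linalg.norm(A[:, blk].T @ A[:, blk], 2) for blk in blocks]
    for _ in range(n_iters):
        for blk, step in zip(blocks, steps):
            grad_blk = A[:, blk].T @ (A @ x - b)   # block gradient of the smooth term
            x[blk] = soft_threshold(x[blk] - step * grad_blk, step * lam)
    return x

# Usage on a small synthetic instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x_hat = block_coordinate_prox_grad(A, b, lam=0.1, block_sizes=[10, 10])
```

Each inner update touches only one block of coordinates, which is why per-block (rather than global) Lipschitz-type constants are the natural quantities to work with in this setting.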

Implementing the ADMM to Big Datasets: A Case Study of LASSO

The alternating direction method of multipliers (ADMM) has been widely used for a broad range of applications in the literature. When big datasets with high-dimensional variables are considered, the subproblems arising in the ADMM must be solved inexactly even though, in theory, they may have closed-form solutions. Such a scenario immediately poses mathematical ambiguities, such as how …
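To make the inexact-subproblem issue concrete, here is a minimal sketch (not the paper's scheme) of ADMM for LASSO in which the x-subproblem, despite having a closed form, is solved approximately by a few conjugate gradient iterations; the splitting x = z, the penalty parameter rho, and the CG budget are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso_inexact(A, b, lam, rho=1.0, n_iters=100, cg_iters=5):
    """ADMM for  min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0,
    with the x-subproblem solved inexactly by a few CG iterations on
    (A^T A + rho I) x = A^T b + rho (z - u), instead of using the
    closed-form solution that requires factorizing A^T A + rho I."""
    m, n = A.shape
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    Atb = A.T @ b
    # Matrix-free operator for A^T A + rho*I (never formed explicitly).
    op = LinearOperator((n, n), matvec=lambda v: A.T @ (A @ v) + rho * v,
                        dtype=np.float64)
    for _ in range(n_iters):
        rhs = Atb + rho * (z - u)
        x, _ = cg(op, rhs, x0=x, maxiter=cg_iters)  # inexact x-update
        z = soft_threshold(x + u, lam / rho)        # exact z-update (l1 prox)
        u = u + x - z                               # scaled dual update
    return z

# Usage on a small synthetic instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 500))
x_true = rng.standard_normal(500) * (rng.random(500) < 0.05)
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = admm_lasso_inexact(A, b, lam=0.1)
```

For high-dimensional A, forming or factorizing A^T A + rho I is the expensive step, which is why an inexact, matrix-free solve of this subproblem is attractive even when an exact formula exists.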

Block-wise Alternating Direction Method of Multipliers with Gaussian Back Substitution for Multiple-block Convex Programming

We consider the linearly constrained convex minimization model with a separable objective function that is the sum of m functions without coupled variables, and discuss how to design an efficient algorithm based on the fundamental technique of splitting the augmented Lagrangian method (ALM). Our focus is the specific big-data scenario where m is huge. A …
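The abstract is truncated here; for reference, the standard form of the m-block linearly constrained separable model and its augmented Lagrangian, on which the splitting mentioned above operates (the notation θ_i, A_i, b, β, λ is assumed here rather than quoted from the paper):

```latex
\min_{x_1,\dots,x_m}\ \sum_{i=1}^{m}\theta_i(x_i)
\quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b, \qquad x_i \in \mathcal{X}_i,

\mathcal{L}_{\beta}(x_1,\dots,x_m,\lambda)
  = \sum_{i=1}^{m}\theta_i(x_i)
  - \lambda^{\top}\Bigl(\sum_{i=1}^{m} A_i x_i - b\Bigr)
  + \frac{\beta}{2}\Bigl\|\sum_{i=1}^{m} A_i x_i - b\Bigr\|^{2}.
```

Splitting the ALM alternates minimization of the augmented Lagrangian over (groups of) the blocks x_i with a multiplier update for λ; the Gaussian back substitution step that corrects the resulting iterate is specified in the paper and is not reproduced here.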