A Novel Approach for Solving Convex Problems with Cardinality Constraints

In this paper we consider the problem of minimizing a convex differentiable function subject to sparsity constraints. Such constraints are non-convex and the resulting optimization problem is known to be hard to solve. We propose a novel generalization of this problem and demonstrate that it is equivalent to the original sparsity-constrained problem if a certain …
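For concreteness, the sparsity-constrained problem referred to in this abstract has the standard form below; the symbols f, n, and s are generic notation for this note rather than the paper's own, and the generalization the paper proposes is not reproduced here.

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad \|x\|_0 \le s
```

Here f is convex and differentiable, \|x\|_0 counts the nonzero entries of x, and s is the prescribed sparsity level; the feasible set defined by the cardinality constraint is non-convex, which is what makes the problem hard.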

On the convergence of a regularized Jacobi algorithm for convex optimization

In this paper we consider the regularized version of the Jacobi algorithm, a block coordinate descent method for convex optimization with a differentiable objective function and block-separable constraints that was recently proposed in the literature. Under certain regularity assumptions on the objective function, this algorithm has been shown to satisfy the so-called sufficient decrease condition, …
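As a point of reference, the sketch below implements a generic regularized Jacobi-type iteration in Python for a box-constrained quadratic program: all coordinates are updated in parallel, each minimizing the objective along its own coordinate plus a proximal term weighted by rho. This is an illustrative instance under simplifying assumptions (scalar blocks, quadratic objective, names such as regularized_jacobi_qp chosen for the example), not the algorithm or analysis of the paper.

```python
import numpy as np

def regularized_jacobi_qp(Q, b, lo, hi, rho=1.0, iters=1000, tol=1e-10):
    """Parallel (Jacobi-type) coordinate updates with proximal regularization
    (rho/2) * (z - x_i)^2 for f(x) = 0.5 * x.T @ Q @ x - b @ x subject to
    lo <= x <= hi. With scalar blocks each regularized subproblem is a 1-D
    quadratic over an interval, so its closed-form minimizer followed by
    clipping to the interval is exact."""
    x = np.clip(np.zeros_like(b, dtype=float), lo, hi)
    d = np.diag(Q) + rho                       # curvature of each 1-D subproblem
    for _ in range(iters):
        grad = Q @ x - b                       # gradient of f at the current iterate
        x_new = np.clip(x - grad / d, lo, hi)  # all coordinates updated in parallel
        if np.linalg.norm(x_new - x) <= tol:
            break
        x = x_new
    return x_new

# Small strongly convex example. The regularization weight rho damps the
# simultaneous updates; this proximal term is typically the ingredient
# behind sufficient-decrease arguments for methods of this type.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(regularized_jacobi_qp(Q, b, lo=0.0, hi=1.0))
```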

Tight global linear convergence rate bounds for operator splitting methods

In this paper we establish necessary and sufficient conditions for linear convergence of operator splitting methods for a general class of convex optimization problems where the associated fixed-point operator is averaged. We also provide a tight bound on the achievable convergence rate. Most existing results establishing linear convergence for such methods require restrictive assumptions regarding …
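For illustration, the Python sketch below runs a Douglas-Rachford iteration, one operator splitting method whose fixed-point map is firmly nonexpansive and hence averaged, on a small strongly convex problem where linear convergence is known to hold. The problem instance and function names are chosen here for the example and are not taken from the paper.

```python
import numpy as np

def douglas_rachford(a, lo, hi, step=1.0, iters=200):
    """Douglas-Rachford splitting for min_x 0.5*||x - a||^2 + I_[lo,hi](x),
    i.e. projecting a onto the box [lo, hi]. The fixed-point update
    z -> z + prox_g(2*prox_f(z) - z) - prox_f(z) is 1/2-averaged; strong
    convexity and smoothness of the quadratic term make it contract
    linearly toward its fixed point."""
    prox_f = lambda z: (z + step * a) / (1.0 + step)  # prox of 0.5*||. - a||^2
    prox_g = lambda z: np.clip(z, lo, hi)             # projection onto the box
    z = np.zeros_like(a, dtype=float)
    for _ in range(iters):
        x = prox_f(z)                 # proximal step on the quadratic term
        y = prox_g(2.0 * x - z)       # projection applied at the reflected point
        z = z + y - x                 # averaged fixed-point update
    return prox_f(z)                  # primal estimate; exact answer is np.clip(a, lo, hi)

print(douglas_rachford(np.array([2.0, -0.3, 0.5]), lo=0.0, hi=1.0))
```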