Inexact Coordinate Descent: Complexity and Preconditioning

In this paper we consider the problem of minimizing a convex function using a randomized block coordinate descent method. One of the key steps at each iteration of the algorithm is determining the update to a block of variables. Existing algorithms assume that in order to compute the update, a particular subproblem is solved exactly. …
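
As a sketch of the inexactness idea only (not this paper's algorithm or its stopping rule), the loop below applies randomized block coordinate descent to a quadratic and solves each block subproblem approximately, using a fixed handful of inner gradient steps in place of an exact linear solve; blocks, inner_steps, and the iteration counts are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def inexact_rbcd(Q, c, blocks, inner_steps=3, iters=2000):
        # Randomized block coordinate descent on f(x) = 0.5 x'Qx - c'x;
        # each block subproblem is solved only approximately.
        x = np.zeros(Q.shape[0])
        for _ in range(iters):
            idx = blocks[rng.integers(len(blocks))]       # sample a block uniformly
            Lb = np.linalg.norm(Q[np.ix_(idx, idx)], 2)   # block Lipschitz constant
            for _ in range(inner_steps):                  # inexact inner solve
                g = Q[idx] @ x - c[idx]                   # block gradient
                x[idx] -= g / Lb
        return x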

A Second-Order Method for Strongly Convex L1-Regularization Problems

In this paper a robust second-order method is developed for the solution of strongly convex l1-regularized problems. The main aim is to make the proposed method as inexpensive as possible while still solving even difficult problems efficiently. The proposed method is a primal-dual Newton Conjugate Gradients (pdNCG) method. Convergence properties of pdNCG are studied …
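
A rough sketch of the smoothing idea underlying such second-order approaches (and only that; this is not the authors' primal-dual pdNCG iteration): replace lam*||x||_1 by the pseudo-Huber term lam * sum(sqrt(mu^2 + x_i^2) - mu), which is twice differentiable, and hand the result to a Newton-CG solver. The least-squares data term and all parameters are assumptions for illustration.

    import numpy as np
    from scipy.optimize import minimize

    def smoothed_l1(A, b, lam, mu=1e-3):
        # min 0.5*||Ax - b||^2 + lam * sum(sqrt(mu^2 + x_i^2) - mu),
        # a C^2 approximation of the l1-regularized problem.
        def fun(x):
            r = A @ x - b
            return 0.5 * r @ r + lam * np.sum(np.sqrt(mu**2 + x**2) - mu)

        def grad(x):
            return A.T @ (A @ x - b) + lam * x / np.sqrt(mu**2 + x**2)

        def hessp(x, v):
            d = lam * mu**2 / (mu**2 + x**2) ** 1.5   # curvature of the smooth term
            return A.T @ (A @ v) + d * v

        x0 = np.zeros(A.shape[1])
        return minimize(fun, x0, jac=grad, hessp=hessp, method='Newton-CG').x

A smaller mu tracks the l1 norm more closely but worsens conditioning, which is exactly the regime where Hessian-free CG inner solves pay off.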

On the Complexity Analysis of Randomized Block-Coordinate Descent Methods

In this paper we analyze the randomized block-coordinate descent (RBCD) methods for minimizing the sum of a smooth convex function and a block-separable convex function. In particular, we extend Nesterov’s technique (SIOPT 2012) for analyzing the RBCD method for minimizing a smooth convex function over a block-separable closed convex set to the aforementioned more general …
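
In the composite setting, the per-iteration work is a single proximal block update. A minimal sketch (quadratic smooth part, with lam*||x||_1 standing in for the block-separable nonsmooth part; all names illustrative):

    import numpy as np

    rng = np.random.default_rng(1)

    def rbcd_composite(Q, c, lam, blocks, iters=5000):
        # RBCD on F(x) = 0.5 x'Qx - c'x + lam*||x||_1: one randomly chosen
        # block is updated per iteration by a proximal gradient step.
        x = np.zeros(Q.shape[0])
        L = [np.linalg.norm(Q[np.ix_(b, b)], 2) for b in blocks]  # block constants
        for _ in range(iters):
            i = rng.integers(len(blocks))
            idx = blocks[i]
            g = Q[idx] @ x - c[idx]                   # gradient of the smooth part
            z = x[idx] - g / L[i]                     # forward (gradient) step
            x[idx] = np.sign(z) * np.maximum(np.abs(z) - lam / L[i], 0.0)  # prox
        return x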

The proximal-proximal gradient algorithm

We consider the problem of minimizing a convex objective which is the sum of a smooth part, with Lipschitz continuous gradient, and a nonsmooth part. Inspired by various applications, we focus on the case when the nonsmooth part is a composition of a proper closed convex function P and a nonzero affine map, with the …
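
The affine map is what makes this problem class hard: the prox of P and the prox of P composed with an affine map are very different objects. In the special case A A^T = alpha*I the composition does admit a closed form, via a standard identity (sketched below for P = ||.||_1); outside such cases no closed form is available, which is the situation the paper targets.

    import numpy as np

    def prox_l1(z, t):
        # prox of t*||.||_1: componentwise soft-thresholding
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def prox_l1_affine(x, A, t, alpha):
        # prox of t*||A.||_1 assuming the tight-frame condition A @ A.T = alpha*I:
        #   prox_{g o A}(x) = x + A^T (prox_{alpha*t*g}(Ax) - Ax) / alpha
        Ax = A @ x
        return x + A.T @ (prox_l1(Ax, alpha * t) - Ax) / alpha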

Convergence analysis of the Peaceman-Rachford splitting method for nonsmooth convex optimization

In this paper, we focus on the convergence analysis of the Peaceman-Rachford splitting method applied to a convex minimization model whose objective function is the sum of a smooth and a nonsmooth convex function. The sublinear convergence rate, in terms of the worst-case O(1/t) iteration complexity, is established if the gradient of the smooth …
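
For reference, the bare Peaceman-Rachford iteration on a concrete instance of this model, 0.5*||Ax - b||^2 + lam*||x||_1 (the instance, the step size gamma, and the iteration count are illustrative; the paper's analysis covers the general smooth-plus-nonsmooth case):

    import numpy as np

    def prox_quad(z, A, b, gamma):
        # prox of f(x) = 0.5*||Ax - b||^2: solve (I + gamma*A'A) x = z + gamma*A'b
        n = A.shape[1]
        return np.linalg.solve(np.eye(n) + gamma * A.T @ A, z + gamma * A.T @ b)

    def prox_l1(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def peaceman_rachford(A, b, lam, gamma=1.0, iters=500):
        z = np.zeros(A.shape[1])
        for _ in range(iters):
            x = prox_quad(z, A, b, gamma)           # resolvent of the smooth part
            y = prox_l1(2 * x - z, gamma * lam)     # resolvent at the reflected point
            z = z + 2 * (y - x)                     # full (unrelaxed) PR reflection
        return x

Unlike Douglas-Rachford, the update takes the full reflection rather than averaging it with the identity, which is why convergence needs extra assumptions of the kind this paper analyzes.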

Parallel Coordinate Descent Methods for Big Data Optimization

In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially separable smooth convex function and a simple separable convex function. The theoretical speedup, as compared to the serial method, and referring to the number of iterations needed …
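
A sketch of the parallel update (not the paper's exact method): tau coordinates are drawn at random and updated simultaneously, with the steps damped by a factor beta that grows with the degree of partial separability omega. The particular formula for beta below is indicative of the tau-nice-sampling constants used in this line of work; treat it, and the quadratic test problem, as assumptions.

    import numpy as np

    rng = np.random.default_rng(2)

    def parallel_cd(Q, c, tau, omega, iters=2000):
        # Parallel coordinate descent on f(x) = 0.5 x'Qx - c'x.
        n = Q.shape[0]
        x = np.zeros(n)
        L = np.diag(Q).copy()                           # coordinate Lipschitz constants
        beta = 1 + (tau - 1) * (omega - 1) / max(1, n - 1)
        for _ in range(iters):
            S = rng.choice(n, size=tau, replace=False)  # tau-nice sampling
            g = Q[S] @ x - c[S]
            x[S] -= g / (beta * L[S])                   # simultaneous updates
        return x

With tau = 1 the damping disappears (beta = 1) and the method reduces to serial coordinate descent, the baseline against which the speedup is measured.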

Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC

We study the computational complexity certification of inexact gradient augmented Lagrangian methods for solving convex optimization problems with complicated constraints. We solve the augmented Lagrangian dual problem, which arises from the relaxation of the complicating constraints, with gradient and fast gradient methods based on inexact first-order information. Moreover, since the exact solution of the augmented …
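
A hedged sketch of the overall structure (generic inexact augmented Lagrangian, not the paper's certified schedule of tolerances): an outer loop of multiplier updates for the complicating constraints Ax = b, with each inner problem solved by a first-order method only to a loose tolerance. f_grad, the fixed step size, and the tolerances are illustrative.

    import numpy as np

    def inexact_al(f_grad, A, b, x0, rho=10.0, outer=50,
                   inner_tol=1e-3, step=1e-2, inner_max=10000):
        # Augmented Lagrangian: L(x, y) = f(x) + y'(Ax-b) + 0.5*rho*||Ax-b||^2
        x, y = x0.copy(), np.zeros(A.shape[0])
        for _ in range(outer):
            for _ in range(inner_max):                 # inexact inner solve
                g = f_grad(x) + A.T @ (y + rho * (A @ x - b))
                if np.linalg.norm(g) <= inner_tol:
                    break
                x -= step * g                          # gradient step on L(., y)
            y += rho * (A @ x - b)                     # multiplier (dual) update
        return x, y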

Convergence rate and iteration complexity of the alternating direction method of multipliers with a substitution procedure for separable convex programming

Recently, in [17] we showed for the first time that the Douglas-Rachford alternating direction method of multipliers (ADMM) can be combined with a Gaussian back substitution procedure for solving a convex minimization model with a general separable structure. This paper is a further study of the theoretical aspects of this theme. We first derive a general algorithmic framework …
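
For orientation, here is the basic two-block ADMM that the Gaussian back substitution scheme extends to m >= 3 blocks (the correction step itself is not shown; the lasso splitting and parameters below are illustrative):

    import numpy as np

    def admm_lasso(A, b, lam, rho=1.0, iters=500):
        # Two-block ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z.
        n = A.shape[1]
        x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
        M = np.linalg.inv(A.T @ A + rho * np.eye(n))   # factor reused every iteration
        Atb = A.T @ b
        for _ in range(iters):
            x = M @ (Atb + rho * (z - u))              # x-subproblem (least squares)
            w = x + u
            z = np.sign(w) * np.maximum(np.abs(w) - lam / rho, 0.0)  # z-subproblem
            u = u + x - z                              # scaled multiplier update
        return z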

Iteration Complexity of Randomized Block-Coordinate Descent Methods for Minimizing a Composite Function

In this paper we develop a randomized block-coordinate descent method for minimizing the sum of a smooth and a simple nonsmooth block-separable convex function and prove that it obtains an $\epsilon$-accurate solution with probability at least $1-\rho$ in at most $O(\tfrac{n}{\epsilon} \log \tfrac{1}{\rho})$ iterations, where $n$ is the number of blocks. For strongly convex functions …
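
Dropping the hidden constant, the bound is easy to evaluate for a feel of the scaling; e.g. $n = 10^6$ blocks, $\epsilon = 10^{-3}$, and confidence $1-\rho = 0.99$ (numbers chosen purely for illustration):

    from math import ceil, log

    def rbcd_bound(n, eps, rho):
        # O((n/eps) * log(1/rho)) with the hidden constant set to 1:
        # a scaling estimate, not a usable stopping criterion.
        return ceil(n / eps * log(1 / rho))

    print(rbcd_bound(1_000_000, 1e-3, 1e-2))   # about 4.6e9 iterations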

A Simple Variant of the Mizuno-Todd-Ye Predictor-Corrector Algorithm and its Objective-Function-Free Complexity

In this paper, we propose a simple variant of the Mizuno-Todd-Ye predictor-corrector algorithm for linear programming (LP). Our variant executes a natural finite-termination procedure at each iteration and is easy to implement. Our algorithm admits an objective-function-free polynomial-time complexity bound when applied to LPs whose dual feasible region …
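
In generic predictor-corrector notation (the standard Mizuno-Todd-Ye setup, not necessarily the exact variant proposed here), both steps solve the same Newton system
$$A\,\Delta x = 0, \qquad A^{T}\Delta y + \Delta s = 0, \qquad S\,\Delta x + X\,\Delta s = \sigma\mu e - XSe,$$
where $X = \mathrm{diag}(x)$, $S = \mathrm{diag}(s)$ and $\mu = x^{T}s/n$: the predictor takes $\sigma = 0$ to push toward optimality, while the corrector takes $\sigma = 1$ to recenter the iterate within a neighborhood of the central path.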