Projection: A Unified Approach to Semi-Infinite Linear Programs and Duality in Convex Programming

Fourier-Motzkin elimination is a projection algorithm for solving finite linear programs. We extend Fourier-Motzkin elimination to semi-infinite linear programs, which are linear programs with finitely many variables and infinitely many constraints. Applying projection leads to new characterizations of important properties for primal-dual pairs of semi-infinite programs, such as zero duality gap, feasibility, boundedness, and solvability. … Read more
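
As a concrete illustration of the projection operation the method is built on, here is a minimal Python sketch (with illustrative names, not the semi-infinite extension developed in the paper) of one Fourier-Motzkin step eliminating a single variable from a finite system $Ax \le b$:

import numpy as np

def fm_eliminate(A, b, j):
    # Eliminate variable x_j from the finite system A x <= b by pairing every
    # constraint with a positive coefficient on x_j against every constraint
    # with a negative coefficient on x_j; constraints not involving x_j are kept.
    pos  = [i for i in range(len(b)) if A[i, j] > 0]
    neg  = [i for i in range(len(b)) if A[i, j] < 0]
    zero = [i for i in range(len(b)) if A[i, j] == 0]
    rows = [np.delete(A[i], j) for i in zero]
    rhs  = [b[i] for i in zero]
    for p in pos:
        for q in neg:
            combo = A[p] / A[p, j] - A[q] / A[q, j]   # coefficient of x_j cancels
            rows.append(np.delete(combo, j))
            rhs.append(b[p] / A[p, j] - b[q] / A[q, j])
    return np.array(rows), np.array(rhs)              # describes the projection onto the remaining variables

Repeating the step projects the feasible region onto ever fewer variables, which is exactly the viewpoint the paper extends to systems with infinitely many constraints.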

On the convergence rate improvement of a primal-dual splitting algorithm for solving monotone inclusion problems

We present two modified versions of the primal-dual splitting algorithm relying on forward-backward splitting proposed in [21] for solving monotone inclusion problems. Under strong monotonicity assumptions on some of the operators involved, we obtain, for the sequences of iterates that approach the solution, orders of convergence of ${\cal {O}}(\frac{1}{n})$ and ${\cal {O}}(\omega^n)$, for $\omega \in … Read more
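
For orientation (this is the generic step the modified versions build on, not the algorithms of the paper themselves): for an inclusion $0 \in Ax + Bx$ with $A$ maximally monotone and $B$ single-valued, the basic forward-backward iteration reads $x_{n+1} = J_{\gamma A}(x_n - \gamma B x_n) = (\mathrm{Id} + \gamma A)^{-1}(x_n - \gamma B x_n)$, and the stated orders of convergence are to be read as $\|x_n - x^\star\| = {\cal O}(\frac{1}{n})$ and $\|x_n - x^\star\| = {\cal O}(\omega^n)$, the latter being linear convergence.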

An Augmented Lagrangian Method for Conic Convex Programming

We propose a new first-order augmented Lagrangian algorithm ALCC for solving convex conic programs of the form $\min\{\rho(x)+\gamma(x):\ Ax-b \in K,\ x \in \chi\}$, where $\rho$ and $\gamma$ are closed convex functions, $\gamma$ has a Lipschitz continuous gradient, $A$ is an $m \times n$ real matrix, $K$ is a closed convex cone, and $\chi$ is a “simple” … Read more
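
For context (a generic reformulation, not necessarily ALCC's exact update rule): introducing a slack variable $s \in K$ for the conic constraint, the augmented Lagrangian of the problem reads
$$L_\mu(x,s;y) = \rho(x) + \gamma(x) + \langle y,\, Ax - b - s\rangle + \tfrac{\mu}{2}\|Ax - b - s\|_2^2, \qquad x \in \chi,\ s \in K,$$
and a basic augmented Lagrangian scheme alternates an (approximate) minimization in $(x,s)$ with the multiplier update $y \leftarrow y + \mu\,(Ax - b - s)$.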

Proximal bundle methods in depth: a unified analysis for inexact oracles

The last few years have seen the advent of a new generation of bundle methods, capable of handling inexact oracles polluted by “noise”. Proving convergence of a bundle method is never simple, and coping with inexact oracles substantially increases the technicalities. Besides, several variants exist to deal with noise, each one needing an ad hoc proof … Read more
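
For readers new to the setting, the core iteration of a proximal bundle method is a proximal step on a cutting-plane model built from the (possibly inexact) oracle answers $(f_i, g_i)$ collected at trial points $x_i$:
$$\check f_k(x) = \max_{i \le k}\{ f_i + \langle g_i, x - x_i\rangle \}, \qquad x_{k+1} = \arg\min_x\ \check f_k(x) + \tfrac{1}{2 t_k}\|x - \hat x_k\|^2,$$
where $\hat x_k$ is the current stability center. When the oracle is noisy, $\check f_k$ need no longer underestimate the true objective, which is precisely the kind of difficulty a unified analysis for inexact oracles has to cope with.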

Augmented Lagrangian and Alternating Direction Methods for Convex Optimization: A Tutorial and Some Illustrative Computational Results

The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian algorithm that has experienced a renaissance in recent years due to its applicability to optimization problems arising from “big data” and image processing applications, and the relative ease with which it may be implemented in parallel and distributed computational environments. This paper aims … Read more
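
To make the iteration concrete, here is a minimal Python sketch of the scaled-form ADMM applied to the lasso problem $\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$ (a standard textbook instance chosen for illustration, not an example taken from the paper; names are illustrative):

import numpy as np

def soft_threshold(v, k):
    # proximal operator of k * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)    # u is the scaled dual variable
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))  # factor once, reuse in every iteration
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: minimize (1/2)||Ax - b||^2 + (rho/2)||x - z + u||^2
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal step on lam*||.||_1, i.e. soft thresholding
        z = soft_threshold(x + u, lam / rho)
        # dual (multiplier) update
        u = u + x - z
    return z

The pattern of a smooth subproblem, a proximal subproblem, and a multiplier update is the same one the tutorial develops in the general augmented Lagrangian setting.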

A Douglas-Rachford type primal-dual method for solving inclusions with mixtures of composite and parallel-sum type monotone operators

In this paper we propose two different primal-dual splitting algorithms for solving inclusions involving mixtures of composite and parallel-sum type monotone operators. Both rely on an inexact Douglas-Rachford splitting method, applied, however, in different underlying Hilbert spaces. Most importantly, the algorithms make it possible to process the bounded linear operators and the set-valued operators occurring in the … Read more
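
For reference (the classical recursion the proposed algorithms build on, not their full primal-dual form): for an inclusion $0 \in Ax + Bx$ with $A$ and $B$ maximally monotone, the Douglas-Rachford iteration reads
$$x_n = J_{\gamma A}(y_n), \qquad z_n = J_{\gamma B}(2x_n - y_n), \qquad y_{n+1} = y_n + \lambda_n (z_n - x_n),$$
and only the resolvents $J_{\gamma A}$ and $J_{\gamma B}$ need to be evaluated; the two proposed methods apply an inexact version of this recursion in different underlying Hilbert spaces.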

Convergence analysis of the Peaceman-Rachford splitting method for nonsmooth convex optimization

In this paper, we focus on the convergence analysis of the Peaceman-Rachford splitting method applied to a convex minimization model whose objective function is the sum of a smooth and a nonsmooth convex function. A sublinear convergence rate, in terms of the worst-case O(1/t) iteration complexity, is established if the gradient of the smooth … Read more
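
As background (the classical operator form, not necessarily the precise variant analyzed in the paper): for $\min_x f(x) + g(x)$ the Peaceman-Rachford recursion applies the full reflections of both proximal maps,
$$y_{k+1} = (2\,\mathrm{prox}_{\gamma g} - \mathrm{Id})(2\,\mathrm{prox}_{\gamma f} - \mathrm{Id})\, y_k, \qquad x_k = \mathrm{prox}_{\gamma f}(y_k),$$
in contrast with Douglas-Rachford, which averages this reflected map with the identity; the O(1/t) bound quantifies the worst-case accuracy reached after t such iterations.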

Parallel Coordinate Descent Methods for Big Data Optimization

In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially separable smooth convex function and a simple separable convex function. The theoretical speedup, as compared to the serial method, and referring to the number of iterations needed … Read more
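
The following Python sketch conveys the flavor of such a method (the sampling scheme, the damping parameter beta, and all names are illustrative assumptions, not the paper's exact step-size rules):

import numpy as np

def parallel_coordinate_descent(grad, prox, x0, L, tau, beta, iters=1000, rng=None):
    # grad(x)          : gradient of the partially separable smooth part
    # prox(v, step, i) : coordinate-wise prox of the simple separable part (hypothetical interface)
    # L[i]             : coordinate-wise Lipschitz constants of grad
    # tau              : number of coordinates updated per iteration (in parallel)
    # beta             : step-size damping reflecting the degree of coupling in the smooth part
    if rng is None:
        rng = np.random.default_rng(0)
    x = x0.copy()
    n = x.size
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)   # random subset of coordinates
        g = grad(x)
        for i in S:                                  # the updates are independent, hence parallelizable
            step = 1.0 / (beta * L[i])
            x[i] = prox(x[i] - step * g[i], step, i)
    return x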

Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC

We study the computational complexity certification of inexact gradient augmented Lagrangian methods for solving convex optimization problems with complicated constraints. We solve the augmented Lagrangian dual problem that arises from the relaxation of the complicating constraints using gradient and fast gradient methods based on inexact first-order information. Moreover, since the exact solution of the augmented … Read more
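
Schematically (written here for equality-type complicating constraints $Ax = b$ and objective $f$; the paper's setting may differ in detail), the outer loop is (fast) gradient ascent on the augmented Lagrangian dual:
$$\tilde x_k \approx \arg\min_x\ f(x) + \langle \lambda_k, Ax - b\rangle + \tfrac{\rho}{2}\|Ax - b\|^2, \qquad \lambda_{k+1} = \lambda_k + \rho\,(A\tilde x_k - b),$$
where each inner minimization is carried out only to a prescribed accuracy; the complexity certification must therefore account for the resulting inexactness of the first-order information used by the outer scheme.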

Convergence rate and iteration complexity on the alternating direction method of multipliers with a substitution procedure for separable convex programming

Recently, in [17] we showed for the first time the possibility of combining the Douglas-Rachford alternating direction method of multipliers (ADMM) with a Gaussian back substitution procedure for solving a convex minimization model with a general separable structure. This paper is a further study of the theoretical aspects of this theme. We first derive a general algorithmic framework … Read more
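
The general separable structure in question is of the form (notation generic)
$$\min\Big\{ \sum_{i=1}^m \theta_i(x_i)\ :\ \sum_{i=1}^m A_i x_i = b,\ \ x_i \in \mathcal{X}_i,\ i = 1,\dots,m \Big\},$$
where an ADMM-type sweep over the blocks $x_1,\dots,x_m$ is followed by the Gaussian back substitution step, and the rate and iteration-complexity results quantify the convergence of this combined scheme.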