Infeasibility detection in the alternating direction method of multipliers for convex optimization

The alternating direction method of multipliers is a powerful operator splitting technique for solving structured optimization problems. For convex optimization problems, it is well-known that the algorithm generates iterates that converge to a solution, provided that one exists. If a solution does not exist, then the iterates diverge. Nevertheless, we show that they yield conclusive … Read more
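As a toy illustration of this phenomenon (a minimal sketch, not the paper's formal certificates): on an infeasible feasibility problem, the ADMM dual iterate diverges, but its successive differences settle to a nonzero vector that certifies infeasibility. All problem data below are made up for the example.

```python
import numpy as np

# Toy infeasible problem: find x with A x = b and x >= 0.
# Here x1 + x2 = -1 has no nonnegative solution, so the set is empty.
A = np.array([[1.0, 1.0]])
b = np.array([-1.0])

# Projection onto the affine set {x : A x = b}.
AAt_inv = np.linalg.inv(A @ A.T)
def proj_affine(x):
    return x - A.T @ (AAt_inv @ (A @ x - b))

# ADMM in consensus form for the two-set feasibility problem.
x = z = u = np.zeros(2)
for k in range(2000):
    x = proj_affine(z - u)
    z = np.maximum(x + u, 0.0)   # projection onto the orthant
    delta_u = x - z              # increment added to the dual iterate
    u = u + delta_u

# The dual iterate diverges, yet its successive differences converge
# to a nonzero vector, signalling infeasibility.
print("||u|| =", np.linalg.norm(u), " delta_u =", delta_u)
```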

Regularized Nonlinear Acceleration

We describe a convergence acceleration technique for generic optimization problems. Our scheme computes estimates of the optimum from a nonlinear average of the iterates produced by any optimization method. The weights in this average are computed via a simple linear system, whose solution can be updated online. This acceleration scheme runs in parallel to the … Read more
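A minimal sketch of the weighted-average idea described above, under assumed details: the weights minimize a regularized least-squares objective over combinations summing to one, with an illustrative regularization constant and toy quadratic problem.

```python
import numpy as np

def rna(X, lam=1e-8):
    """Regularized nonlinear acceleration sketch: combine iterates
    X = [x_0, ..., x_k] (one per column) into a weighted average whose
    weights solve a small regularized linear system."""
    R = np.diff(X, axis=1)            # residuals r_i = x_{i+1} - x_i
    RtR = R.T @ R
    RtR /= np.linalg.norm(RtR)        # normalize for a scale-free lambda
    k = RtR.shape[0]
    # Minimize ||R c||^2 + lam ||c||^2 subject to sum(c) = 1:
    # solve (R^T R + lam I) z = 1, then rescale so the weights sum to 1.
    z = np.linalg.solve(RtR + lam * np.eye(k), np.ones(k))
    c = z / z.sum()
    return X[:, :-1] @ c              # extrapolated estimate of the optimum

# Illustration: accelerate plain gradient descent on a quadratic.
Q = np.diag([1.0, 10.0, 100.0])
x = np.ones(3)
iterates = [x]
for _ in range(10):
    x = x - 0.01 * (Q @ x)            # gradient step for f(x) = x^T Q x / 2
    iterates.append(x)
X = np.stack(iterates, axis=1)
print("last iterate:", np.linalg.norm(X[:, -1]))
print("RNA estimate:", np.linalg.norm(rna(X)))   # much closer to the optimum 0
```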

Sharpness, Restart and Acceleration

The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically. Here, we show that sharpness directly controls the performance of restart schemes. The constants quantifying sharpness are of course unobservable, but we show that optimal restart strategies are fairly robust, and searching for the best scheme only increases … Read more
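For reference, one common form of such a sharpness bound (the exact assumptions here are illustrative, not quoted from the paper): on a compact set K, with f* the minimum and X* the set of minimizers,

```latex
% Sharpness (Lojasiewicz-type) lower bound, for some mu > 0 and
% exponent r >= 1:
\frac{\mu}{r}\, d(x, X^\ast)^r \;\le\; f(x) - f^\ast, \qquad x \in K .
```

Larger exponents r correspond to flatter functions near the minimum, which is what restart schedules must adapt to.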

Integration Methods and Accelerated Optimization Algorithms

We show that accelerated optimization methods can be seen as particular instances of multi-step integration schemes from numerical analysis, applied to the gradient flow equation. In comparison with recent advances in this vein, the differential equation considered here is the basic gradient flow and we show that multi-step schemes allow integration of this differential equation … Read more
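To make the connection concrete (an illustrative recap of standard numerical analysis, not the paper's exact scheme): explicit Euler on the gradient flow recovers plain gradient descent, while a linear multi-step scheme combines several past iterates and gradients.

```latex
% Gradient flow and its explicit Euler discretization with step h
% (i.e., gradient descent):
\dot{x}(t) = -\nabla f(x(t)), \qquad x_{k+1} = x_k - h\,\nabla f(x_k).
% A general linear s-step integration scheme:
\sum_{i=0}^{s} \rho_i\, x_{k+i} \;=\; -\,h \sum_{i=0}^{s} \sigma_i\, \nabla f(x_{k+i}).
```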

A simplicial decomposition framework for large scale convex quadratic programming

In this paper, we analyze in depth a simplicial decomposition-like algorithmic framework for large scale convex quadratic programming. In particular, we first propose two tailored strategies for handling the master problem. Then, we describe a few techniques for speeding up the solution of the pricing problem. We report extensive numerical experiments on both real … Read more
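For orientation, here is a bare-bones simplicial decomposition loop on a toy quadratic program (all data and tolerances are invented for illustration; the paper's tailored master and pricing strategies are not reproduced here):

```python
import numpy as np
from scipy.optimize import linprog, minimize

# Toy problem: min 1/2 x^T Q x + c^T x over {x : G x <= h}.
Q = np.diag([2.0, 1.0]); c = np.array([-2.0, -1.0])
G = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]]); h = np.array([1.0, 0.0, 0.0])

f = lambda x: 0.5 * x @ Q @ x + c @ x
grad = lambda x: Q @ x + c

V = [np.zeros(2)]                        # initial vertex set
x_bar = V[0]
for _ in range(20):
    # Pricing problem: minimize the linearized objective over the polytope.
    lp = linprog(grad(x_bar), A_ub=G, b_ub=h, bounds=[(None, None)] * 2)
    v = lp.x
    if grad(x_bar) @ (v - x_bar) > -1e-9:
        break                            # no improving vertex: x_bar optimal
    V.append(v)
    # Master problem: minimize f over the convex hull of the vertices in V.
    M = np.stack(V, axis=1)
    obj = lambda w: f(M @ w)
    w0 = np.ones(len(V)) / len(V)
    res = minimize(obj, w0, bounds=[(0, 1)] * len(V),
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1})
    x_bar = M @ res.x

print("approximate solution:", x_bar)
```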

Dual Dynamic Programming with cut selection: convergence proof and numerical experiments

We consider convex optimization problems formulated using dynamic programming equations. Such problems can be solved using the Dual Dynamic Programming (DDP) algorithm combined with the Level 1 cut selection strategy or the Territory algorithm to select the most relevant Benders cuts. We propose a limited memory variant of Level 1 and show the convergence of DDP … Read more
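A sketch of the Level 1 selection idea as commonly described (my reading of the strategy, not the paper's implementation): keep a cut only if it attains the maximum among all cuts at at least one stored trial point.

```python
import numpy as np

def level1_select(alphas, betas, trial_points):
    """Keep a cut theta >= alpha_i + beta_i^T x only if it is the
    highest cut at some stored trial point (illustrative sketch)."""
    keep = set()
    for x in trial_points:
        vals = [a + b @ x for a, b in zip(alphas, betas)]
        keep.add(int(np.argmax(vals)))   # cut active at this trial point
    return sorted(keep)

# Three cuts in 1-D; the middle cut is dominated everywhere.
alphas = [0.0, -1.0, -2.0]
betas = [np.array([1.0]), np.array([1.0]), np.array([-1.0])]
trials = [np.array([-3.0]), np.array([2.0])]
print(level1_select(alphas, betas, trials))   # [0, 2]: dominated cut pruned
```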

A New Use of Douglas-Rachford Splitting and ADMM for Identifying Infeasible, Unbounded, and Pathological Conic Programs

In this paper, we present a method for identifying infeasible, unbounded, and pathological conic programs based on Douglas-Rachford splitting, or equivalently ADMM. When an optimization program is infeasible, unbounded, or pathological, the iterates of Douglas-Rachford splitting diverge. Somewhat surprisingly, such divergent iterates still provide useful information, which our method uses for identification. In addition, for strongly … Read more
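A classical fact that underlies identification schemes of this kind (stated here for context; the paper's precise tests are not reproduced): Douglas-Rachford splitting is a fixed-point iteration z^{k+1} = T z^k, and even when T has no fixed point, the successive differences converge to the minimal displacement vector.

```latex
% Douglas-Rachford as a fixed-point iteration with
%   T z = z + \mathrm{prox}_{g}\bigl(2\,\mathrm{prox}_{f}(z) - z\bigr) - \mathrm{prox}_{f}(z).
% By a classical result (Pazy; Baillon-Bruck-Reich), for averaged T,
z^{k+1} - z^k \;\longrightarrow\; v := \operatorname*{argmin}_{w \in \overline{\operatorname{ran}}(T - \mathrm{Id})} \|w\| ,
% and a nonzero limit v can serve as a certificate of infeasibility
% or unboundedness.
```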

Oracle Complexity of Second-Order Methods for Smooth Convex Optimization

Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth and convex functions, we derive … Read more
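For concreteness, the standard second-order oracle model referenced here (a generic textbook formulation, not specific to this paper):

```latex
% Each query at a point x returns the value, gradient, and Hessian,
\mathcal{O}_f(x) \;=\; \bigl(f(x),\, \nabla f(x),\, \nabla^2 f(x)\bigr),
% and the oracle complexity is the smallest number of queries k
% guaranteeing an epsilon-accurate point:
f(x_k) - \min_x f(x) \;\le\; \varepsilon .
```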

Iteration-complexity of a Jacobi-type non-Euclidean ADMM for multi-block linearly constrained nonconvex programs

This paper establishes the iteration-complexity of a Jacobi-type non-Euclidean proximal alternating direction method of multipliers (ADMM) for solving multi-block linearly constrained nonconvex programs. The subproblems of this ADMM variant can be solved in parallel and hence the method has great potential to solve large scale multi-block linearly constrained nonconvex programs. Moreover, our analysis allows the … Read more
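An illustrative form of a Jacobi-type update of this flavor (generic notation assumed here, not the paper's exact scheme): every block is updated from the previous iterate, so the p subproblems decouple and can run in parallel, with a Bregman term D_i supplying the non-Euclidean proximal regularization.

```latex
% Augmented Lagrangian L_beta for min sum_i f_i(x_i) s.t. sum_i A_i x_i = b.
x_i^{k+1} \;=\; \operatorname*{argmin}_{x_i}\;
  \mathcal{L}_\beta\bigl(x_1^k, \ldots, x_{i-1}^k,\, x_i,\, x_{i+1}^k, \ldots, x_p^k;\, \lambda^k\bigr)
  \;+\; D_i\bigl(x_i, x_i^k\bigr), \qquad i = 1, \ldots, p,
% followed by the multiplier step
\lambda^{k+1} \;=\; \lambda^k + \theta\beta\Bigl(\textstyle\sum_{i=1}^p A_i x_i^{k+1} - b\Bigr).
```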

Bad semidefinite programs with short proofs, and the closedness of the linear image of the semidefinite cone

Semidefinite programs (SDPs) — some of the most useful and pervasive optimization problems of the last few decades — often behave pathologically: the optimal values of the primal and dual problems may differ and may not be attained. Such SDPs are theoretically interesting and often impossible to solve. Yet, the pathological SDPs in the literature … Read more
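A textbook instance of such pathological behavior (a standard example, not taken from this paper): the optimal value is finite but unattained.

```latex
% The constraint forces x >= 0, y >= 0, xy >= 1, so the infimum 0 is
% approached as y -> infinity but never achieved.
\inf_{x,\,y} \; x \quad \text{s.t.} \quad
\begin{pmatrix} x & 1 \\ 1 & y \end{pmatrix} \succeq 0 .
```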