Augmented Lagrangian and Alternating Direction Methods for Convex Optimization: A Tutorial and Some Illustrative Computational Results

The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian algorithm that has experienced a renaissance in recent years due to its applicability to optimization problems arising from “big data” and image processing applications, and the relative ease with which it may be implemented in parallel and distributed computational environments. This paper aims …
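
As a concrete illustration of the kind of iteration the tutorial covers, here is a minimal scaled-form ADMM sketch for the lasso problem min 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z; the problem choice, parameter values, and names are ours, not the paper's.

import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Scaled-form ADMM sketch for min 0.5*||Ax-b||^2 + lam*||z||_1 s.t. x = z."""
    n = A.shape[1]
    x = z = u = np.zeros(n)
    # Factor once: the x-update is a ridge-regularized least-squares solve.
    M = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M.T, np.linalg.solve(M, Atb + rho * (z - u)))
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-thresholding
        u = u + x - z  # scaled dual (multiplier) update
    return z

The three steps mirror the augmented Lagrangian structure: an x-minimization, a z-minimization (here a closed-form prox), and a multiplier ascent step.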

A Douglas-Rachford type primal-dual method for solving inclusions with mixtures of composite and parallel-sum type monotone operators

In this paper we propose two different primal-dual splitting algorithms for solving inclusions involving mixtures of composite and parallel-sum type monotone operators, both of which rely on an inexact Douglas-Rachford splitting method, albeit applied in different underlying Hilbert spaces. Most importantly, the algorithms allow one to process the bounded linear operators and the set-valued operators occurring in the …
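
The paper's algorithms operate on general monotone inclusions in product Hilbert spaces; below is only a minimal sketch of the classical Douglas-Rachford iteration on which they build. The callables prox_f and prox_g are assumed user-supplied resolvents, not an API from the paper.

import numpy as np

def douglas_rachford(prox_f, prox_g, z0, iters=100):
    """Douglas-Rachford: z <- z + prox_g(2x - z) - x with x = prox_f(z)."""
    z = z0
    for _ in range(iters):
        x = prox_f(z)           # resolvent step for f
        y = prox_g(2 * x - z)   # reflected resolvent step for g
        z = z + y - x           # averaged update
    return prox_f(z)            # this point solves the underlying problem

# Toy usage: find a point in the intersection of [0, 2] and [1, 3].
p1 = lambda v: np.clip(v, 0.0, 2.0)
p2 = lambda v: np.clip(v, 1.0, 3.0)
print(douglas_rachford(p1, p2, np.array([10.0])))  # -> [2.], a point of [1, 2]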

Convergence analysis of the Peaceman-Rachford splitting method for nonsmooth convex optimization

In this paper, we focus on the convergence analysis for the application of the Peaceman-Rachford splitting method to a convex minimization model whose objective function is the sum of a smooth and a nonsmooth convex function. The sublinear convergence rate in terms of the worst-case O(1/t) iteration complexity is established if the gradient of the smooth …
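
For reference, the Peaceman-Rachford operator composes the two reflected proximal maps without the averaging present in Douglas-Rachford, which is why stronger assumptions are typically needed for convergence. A minimal sketch, with a toy problem of our choosing:

import numpy as np

def peaceman_rachford(prox_f, prox_g, z0, iters=200):
    """Peaceman-Rachford: z <- (2 prox_g - I)(2 prox_f - I) z (no averaging)."""
    z = z0
    for _ in range(iters):
        x = prox_f(z)
        w = 2 * x - z      # reflection through prox_f
        y = prox_g(w)
        z = 2 * y - w      # reflection through prox_g
    return prox_f(z)

# Toy usage: min 0.5*(v - 3)^2 + |v|, whose minimizer is v = 2.
p_f = lambda v: (v + 3.0) / 2.0                                # prox of 0.5*(v-3)^2
p_g = lambda v: np.sign(v) * np.maximum(np.abs(v) - 1.0, 0.0)  # prox of |v|
print(peaceman_rachford(p_f, p_g, np.array([0.0])))            # -> [2.]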

Parallel Coordinate Descent Methods for Big Data Optimization

In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially separable smooth convex function and a simple separable convex function. The theoretical speedup, as compared to the serial method, and referring to the number of iterations needed …
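
A serial simulation of one parallel step is sketched below under the simplifying assumption that the naive per-coordinate stepsizes 1/L_i are safe; the paper's analysis is what supplies stepsizes that remain valid for a given degree of partial separability and number of processors.

import numpy as np

def parallel_cd_l1(A, b, lam, tau=4, iters=500, seed=0):
    """Simulated parallel coordinate descent for 0.5*||Ax-b||^2 + lam*||x||_1.

    Each iteration samples tau coordinates and updates them independently,
    as tau processors would in the parallel method.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b                 # residual, maintained incrementally
    L = (A ** 2).sum(axis=0)      # per-coordinate Lipschitz constants
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)
        g = A[:, S].T @ r                                    # partial gradient
        v = x[S] - g / L[S]
        new = np.sign(v) * np.maximum(np.abs(v) - lam / L[S], 0.0)
        r += A[:, S] @ (new - x[S])                          # keep residual exact
        x[S] = new
    return x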

Variational Properties of Value Functions

Regularization plays a key role in a variety of optimization formulations of inverse problems. A recurring question in regularization approaches is the selection of regularization parameters, and its effect on the solution and on the optimal value of the optimization problem. The sensitivity of the value function to the regularization parameter can be linked directly …
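
That sensitivity link can be checked numerically. For v(lam) = min_x 0.5*||Ax - b||^2 + lam*||x||_1, a Danskin-type argument gives v'(lam) = ||x*(lam)||_1. The sketch below (our construction, with a plain proximal-gradient inner solver) compares a finite difference of the value function against that quantity.

import numpy as np

def ista(A, b, lam, iters=3000):
    """Proximal gradient solver for min_x 0.5*||Ax-b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        v = x - A.T @ (A @ x - b) / L
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)
    return x

def value(A, b, lam):
    x = ista(A, b, lam)
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum(), x

rng = np.random.default_rng(1)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.5
v0, x0 = value(A, b, lam)
v1, _ = value(A, b, lam + 1e-5)
print((v1 - v0) / 1e-5, np.abs(x0).sum())  # the two numbers should agree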

Clarke subgradients for directionally Lipschitzian stratifiable functions

Using a geometric argument, we show that under a reasonable continuity condition, the Clarke subdifferential of a semi-algebraic (or more generally stratifiable) directionally Lipschitzian function admits a simple form: the normal cone to the domain and limits of gradients generate the entire Clarke subdifferential. The characterization formula we obtain unifies various apparently disparate results that …
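
Schematically, and in our notation (the precise hypotheses and closure operations are those of the paper), such a representation reads:

\partial_c f(\bar{x}) \;=\; \overline{\mathrm{conv}}\Big( \Big\{ \lim_{k\to\infty} \nabla f(x_k) \,:\, x_k \to \bar{x},\ f \text{ differentiable at } x_k \Big\} \;+\; N_{\mathrm{dom}\, f}(\bar{x}) \Big).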

Convergence analysis for a primal-dual monotone + skew splitting algorithm with applications to total variation minimization

In this paper we investigate the convergence behavior of a primal-dual splitting method for solving monotone inclusions involving mixtures of composite, Lipschitzian and parallel sum type operators proposed by Combettes and Pesquet in [7]. Firstly, in the particular case of convex minimization problems, we derive convergence rates for the sequence of objective function values by …
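
One representative instance (our choice, for illustration; the paper treats the general inclusion of Combettes and Pesquet [7]) is total variation denoising, whose optimality condition already mixes a Lipschitzian single-valued operator with a composite set-valued one:

\min_{x} \tfrac{1}{2}\|x - b\|^2 + \lambda\|Dx\|_1
\quad\Longleftrightarrow\quad
0 \in (x - b) + D^{*}\,\partial\big(\lambda\|\cdot\|_1\big)(Dx),

i.e. an inclusion of the form 0 \in Cx + L^{*}B(Lx) with C Lipschitzian, L = D linear, and B maximally monotone.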

An Inexact Perturbed Path-Following Method for Lagrangian Decomposition in Large-Scale Separable Convex Optimization

This paper studies an inexact perturbed path-following algorithm in the framework of Lagrangian dual decomposition for solving large-scale separable convex programming problems. Unlike the exact versions considered in the literature, we propose to solve the primal subproblems inexactly up to a given accuracy. This leads to an inexactness of the gradient vector and the Hessian …
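
The decomposition structure can be sketched generically. The code below is plain dual decomposition with deliberately truncated inner solves, not the paper's barrier-smoothed path-following scheme, and all problem data and names are ours; it only shows where the inexactness of the dual gradient enters.

import numpy as np

def dual_decomposition(Qs, cs, As, b, step=0.05, outer=300, inner=5):
    """Dual decomposition for min sum_i 0.5*x_i'Q_i x_i + c_i'x_i, sum_i A_i x_i = b.

    Each subproblem is solved by only a few gradient steps, so the dual
    gradient sum_i A_i x_i - b is computed inexactly.
    """
    y = np.zeros(b.shape[0])
    xs = [np.zeros(Q.shape[0]) for Q in Qs]
    for _ in range(outer):
        g = -b.astype(float)
        for i, (Q, c, A) in enumerate(zip(Qs, cs, As)):
            x, lin = xs[i], c + A.T @ y
            for _ in range(inner):                    # inexact subproblem solve
                x = x - (Q @ x + lin) / np.linalg.norm(Q, 2)
            xs[i] = x
            g += A @ x                                # inexact dual gradient
        y = y + step * g                              # dual ascent step
    return xs, y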

Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC

We study the computational complexity certification of inexact gradient augmented Lagrangian methods for solving convex optimization problems with complicated constraints. We solve the augmented Lagrangian dual problem that arises from the relaxation of complicating constraints with gradient and fast gradient methods based on inexact first-order information. Moreover, since the exact solution of the augmented …
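
A minimal sketch of the mechanism for equality constraints (everything below is our illustrative simplification; the paper certifies complexity for the constrained-MPC setting and also uses fast gradient inner solvers): the multiplier step uses the constraint residual at an approximate inner minimizer, i.e. an inexact dual gradient.

import numpy as np

def inexact_augmented_lagrangian(f_grad, A, b, x0, rho=10.0, outer=50, inner=20):
    """Augmented Lagrangian for min f(x) s.t. Ax = b, inner solves truncated."""
    x, y = x0.copy(), np.zeros(A.shape[0])
    for _ in range(outer):
        for _ in range(inner):                         # inexact inner solve
            g = f_grad(x) + A.T @ (y + rho * (A @ x - b))
            x = x - 2e-2 * g                           # fixed-step gradient descent
        y = y + rho * (A @ x - b)                      # multiplier (dual) step
    return x, y

# Toy usage: min 0.5*||x||^2 s.t. sum(x) = 1, whose solution is x_i = 1/4.
A, b = np.ones((1, 4)), np.array([1.0])
x, y = inexact_augmented_lagrangian(lambda x: x, A, b, np.zeros(4))
print(x)  # -> approximately [0.25, 0.25, 0.25, 0.25]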

A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints

In this paper we present a variant of a random coordinate descent method for solving linearly constrained convex optimization problems with composite objective function. If the smooth part has a Lipschitz continuous gradient, then the method terminates with an ϵ-optimal solution in O(N²/ϵ) iterations, where N is the number of blocks. For the class of problems with …
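
For a single coupling constraint of the form sum_i x_i = const (our simplification; the paper handles general linear coupling and a composite objective), feasibility is preserved by moving a random pair of coordinates along e_i - e_j, as sketched below.

import numpy as np

def random_pair_cd(grad, L, x0, iters=5000, seed=0):
    """Random 2-coordinate descent preserving sum(x) = const.

    Each step picks a pair (i, j) and moves along e_i - e_j, the direction
    that keeps the single linear coupling constraint satisfied.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        i, j = rng.choice(x.size, size=2, replace=False)
        g = grad(x)
        t = -(g[i] - g[j]) / (L[i] + L[j])   # exact line minimizer for quadratics
        x[i] += t
        x[j] -= t
    return x

# Toy usage: min 0.5*||x - c||^2 s.t. sum(x) = 1, starting feasible.
c = np.array([3.0, -1.0, 2.0, 0.0])
x = random_pair_cd(lambda x: x - c, np.ones(4), np.full(4, 0.25))
print(x)  # -> projection of c onto the hyperplane sum(x) = 1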