On the Proximal Jacobian Decomposition of ALM for Multiple-block Separable Convex Minimization Problems and its Relationship to ADMM

The augmented Lagrangian method (ALM) is a benchmark for solving convex minimization problems with linear constraints. When the objective function of the model under consideration can be represented as the sum of several functions without coupled variables, a Jacobian or Gauss-Seidel decomposition is often applied to decompose the ALM subproblems so that the functions’ properties could … Read more
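
As a rough illustration of the Jacobian decomposition idea (the notation $\theta_i$, $A_i$, $\beta$, $\lambda$ is assumed here for exposition, and the proximal regularization terms of the paper's scheme are omitted), consider the two-block separable model $$ \min_{x_1,x_2}\; \theta_1(x_1)+\theta_2(x_2) \quad \text{s.t.}\quad A_1x_1+A_2x_2=b. $$ The Jacobian-decomposed ALM step solves the two subproblems in parallel, each with the other block frozen at its current value, $$ x_i^{k+1}\in\arg\min_{x_i}\;\theta_i(x_i)+\frac{\beta}{2}\Bigl\Vert A_ix_i+\textstyle\sum_{j\neq i}A_jx_j^{k}-b-\lambda^{k}/\beta\Bigr\Vert^2,\qquad i=1,2, $$ followed by the multiplier update $\lambda^{k+1}=\lambda^{k}-\beta\,(A_1x_1^{k+1}+A_2x_2^{k+1}-b)$. A Gauss-Seidel decomposition instead uses the freshly computed $x_1^{k+1}$ in the $x_2$-subproblem, which is exactly the ADMM sweep alluded to in the title.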

Practical Inexact Proximal Quasi-Newton Method with Global Complexity Analysis

Recently, several methods have been proposed for sparse optimization that make careful use of second-order information [11, 30, 17, 3] to improve local convergence rates. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation using a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here … Read more
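
For orientation only, a generic proximal (quasi-)Newton step of the kind described above can be sketched for an $\ell_1$-regularized objective $f(x)+\lambda\Vert x\Vert_1$ (the symbols $f$, $H_k$, $t_k$ are assumed here, not taken from the paper): $$ d^{k}\approx\arg\min_{d}\;\nabla f(x^{k})^{T}d+\tfrac12\,d^{T}H_{k}d+\lambda\Vert x^{k}+d\Vert_{1},\qquad x^{k+1}=x^{k}+t_{k}\,d^{k}, $$ where $H_{k}$ is a (quasi-)Newton approximation of the Hessian of $f$, the inner subproblem is solved inexactly by a first-order method such as coordinate descent, and the stepsize $t_{k}$ is chosen by a line search enforcing sufficient descent.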

A feasible active set method for strictly convex problems with simple bounds

A primal-dual active set method for quadratic problems with bound constraints is presented that extends the infeasible active set approach of [K. Kunisch and F. Rendl. An infeasible active set method for convex problems with simple bounds. SIAM Journal on Optimization, 14(1):35-52, 2003]. Based on a guess of the active set, a primal-dual pair (x,α) … Read more
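
To indicate the flavor of such a step (in assumed notation, for a strictly convex quadratic $\tfrac12 x^{T}Qx+q^{T}x$ with lower bounds $x\ge\ell$), a guessed active set $A$ determines the primal-dual pair by fixing $$ x_{i}=\ell_{i}\ (i\in A),\qquad \alpha_{i}=0\ (i\notin A),\qquad Qx+q-\alpha=0, $$ after which the guess is revised according to which of the sign conditions $\alpha\ge 0$ and $x\ge\ell$ are violated; "feasible" in the title presumably refers to keeping the primal iterates within the bounds, in contrast to the infeasible method being extended.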

Iteration-Complexity of a Generalized Forward-Backward Splitting Algorithm

In this paper, we analyze the iteration-complexity of a generalized forward-backward (GFB) splitting algorithm, recently proposed in~\cite{gfb2011}, for minimizing the large class of composite objectives $f + \sum_{i=1}^n h_i$ on a Hilbert space, where $f$ has a Lipschitz-continuous gradient and the $h_i$’s are simple (i.e., their proximity operators are easily computable). We derive iteration-complexity … Read more
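
As a reminder of the terminology (standard definitions, not specific to the paper), "simple" means that the proximity operator $$ \operatorname{prox}_{\gamma h_i}(x)=\arg\min_{u}\;h_i(u)+\tfrac{1}{2\gamma}\Vert u-x\Vert^{2} $$ is cheap to evaluate, and for a single such term ($n=1$) the forward-backward iteration that the GFB scheme generalizes reads $$ x^{k+1}=\operatorname{prox}_{\gamma h}\bigl(x^{k}-\gamma\nabla f(x^{k})\bigr),\qquad 0<\gamma<2/L, $$ with $L$ the Lipschitz constant of $\nabla f$; the generalized scheme handles $n>1$ proximable terms through auxiliary variables.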

Conic separation of finite sets: The homogeneous case

This work addresses the issue of separating two finite sets in $\mathbb{R}^n$ by means of a suitable revolution cone $$ \Gamma (z,y,s)= \{x \in \mathbb{R}^n : s\,\Vert x-z\Vert - y^T(x-z)=0\}.$$ The specific challenge at hand is to determine the aperture coefficient $s$, the axis $y$, and the apex $z$ of the cone. These parameters … Read more
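
In the homogeneous case treated here, the apex can be taken at the origin ($z=0$), so separating finite sets $\{a_i\}$ and $\{b_j\}$ amounts, roughly speaking, to sign conditions of the form $$ s\,\Vert a_{i}\Vert-y^{T}a_{i}\le 0\quad\text{for all }i,\qquad s\,\Vert b_{j}\Vert-y^{T}b_{j}\ge 0\quad\text{for all }j, $$ i.e., one set lies inside the revolution cone and the other outside; the exact form of the inequalities (strict, relaxed, or with a margin) is as specified in the paper, and the display above is only meant to fix intuition.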

Gauge optimization, duality, and applications

Gauge functions significantly generalize the notion of a norm, and gauge optimization, as defined by Freund (1987), seeks the element of a convex set that is minimal with respect to a gauge function. This conceptually simple problem can be used to model a remarkable array of useful problems, including a special case of conic optimization, … Read more
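
In Freund's formulation (a standard statement, though the notation here is ours), the gauge optimization problem is $$ \min_{x}\;\kappa(x)\quad\text{subject to}\quad x\in\mathcal{C}, $$ where $\mathcal{C}$ is a closed convex set and $\kappa$ is a gauge, i.e., a nonnegative, convex, positively homogeneous function with $\kappa(0)=0$; every norm is a gauge, which is the sense in which gauges generalize norms.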

Conic Geometric Programming

We introduce and study conic geometric programs (CGPs), which are convex optimization problems that unify geometric programs (GPs) and conic optimization problems such as linear programs (LPs) and semidefinite programs (SDPs). A CGP consists of a linear objective function that is to be minimized subject to affine constraints, convex conic constraints, and upper bound constraints … Read more
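
Purely to indicate how the two ingredients can coexist (the display below is an assumed illustrative form, not necessarily the paper's exact constraint format), a CGP-type problem couples a conic feasible region with GP-style sum-of-exponentials bounds: $$ \min_{x}\;c^{T}x\quad\text{s.t.}\quad Fx=g,\quad x\in\mathcal{K},\quad \sum_{i}\exp\bigl(a_{i}^{T}x+b_{i}\bigr)\le 1, $$ where $\mathcal{K}$ is a convex cone (e.g., the nonnegative orthant or the positive semidefinite cone) and the last constraint is the convex-form version of a posynomial constraint from geometric programming.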

The Direct Extension of ADMM for Multi-block Convex Minimization Problems is Not Necessarily Convergent

The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternately updated. It is strongly desirable and practically valuable to extend ADMM directly to the case of a multi-block convex minimization problem whose objective function is the sum of … Read more
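
For reference (in assumed notation), the direct three-block extension whose convergence is at issue applies the usual ADMM sweep to $\min\,\theta_1(x_1)+\theta_2(x_2)+\theta_3(x_3)$ subject to $A_1x_1+A_2x_2+A_3x_3=b$: $$ \begin{aligned} x_1^{k+1}&\in\arg\min_{x_1}\;\mathcal{L}_\beta(x_1,x_2^{k},x_3^{k},\lambda^{k}),\\ x_2^{k+1}&\in\arg\min_{x_2}\;\mathcal{L}_\beta(x_1^{k+1},x_2,x_3^{k},\lambda^{k}),\\ x_3^{k+1}&\in\arg\min_{x_3}\;\mathcal{L}_\beta(x_1^{k+1},x_2^{k+1},x_3,\lambda^{k}),\\ \lambda^{k+1}&=\lambda^{k}-\beta\,(A_1x_1^{k+1}+A_2x_2^{k+1}+A_3x_3^{k+1}-b), \end{aligned} $$ where $\mathcal{L}_\beta$ denotes the augmented Lagrangian; the paper's point is that, unlike the two-block case, this scheme is not necessarily convergent.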

Smooth minimization of nonsmooth functions with parallel coordinate descent methods

We study the performance of a family of randomized parallel coordinate descent methods for minimizing the sum of a nonsmooth convex function and a separable convex function. The problem class includes as special cases L1-regularized L1 regression and the minimization of the exponential loss (the “AdaBoost problem”). We assume the input data defining the loss function is contained … Read more
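
To fix the structure being exploited (the symbols here are ours; per the title, the nonsmooth loss is first smoothed before coordinate descent is applied), the objective is block separable in its regularizer, $$ \min_{x\in\mathbb{R}^{N}}\;F(x)=f(x)+\sum_{i=1}^{N}\Psi_{i}(x_{i}), $$ and one parallel iteration updates a randomly sampled set of coordinates $S_k$ via $$ x_{i}^{k+1}=\arg\min_{t\in\mathbb{R}}\;\nabla_{i}f(x^{k})\,(t-x_{i}^{k})+\tfrac{\beta w_{i}}{2}\,(t-x_{i}^{k})^{2}+\Psi_{i}(t),\qquad i\in S_{k}, $$ with coordinates outside $S_k$ left unchanged and the stepsize parameters $\beta$, $w_{i}$ tied to the degree of separability of $f$ and to the number of coordinates updated in parallel.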

Primal-dual methods for solving infinite-dimensional games

In this paper we show that infinite-dimensional differential games with a simple objective functional can be solved in a finite-dimensional dual form in the space of dual multipliers for the constraints related to the end points of the trajectories. The primal solutions can be easily reconstructed by appropriate dual subgradient schemes. The suggested schemes … Read more