Strong duality in conic linear programming: facial reduction and extended duals

The facial reduction algorithm of Borwein and Wolkowicz and the extended dual of Ramana provide a strong dual for the conic linear program (P) $\sup \{\, \langle c, x \rangle \mid Ax \leq_K b \,\}$ in the absence of any constraint qualification. The facial reduction algorithm solves a sequence of auxiliary optimization problems to obtain such a dual. Ramana’s dual … Read more
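
For orientation, a minimal statement of the primal-dual pair in question (the pairing $\langle \cdot, \cdot \rangle$, the adjoint $A^*$, and the dual cone $K^*$ follow the usual conventions; this display is a sketch, not quoted from the abstract):

$$
(P)\;\; \sup_x \{\, \langle c, x \rangle \mid Ax \leq_K b \,\}, \qquad
(D)\;\; \inf_y \{\, \langle b, y \rangle \mid A^* y = c,\ y \in K^* \,\}.
$$

Without a constraint qualification such as Slater's condition, the optimal values of (P) and (D) may differ or fail to be attained; this is exactly the gap that facial reduction and Ramana's extended dual are designed to close.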

Distributionally Robust Convex Optimization

Distributionally robust optimization is a paradigm for decision-making under uncertainty where the uncertain problem data is governed by a probability distribution that is itself subject to uncertainty. The distribution is then assumed to belong to an ambiguity set comprising all distributions that are compatible with the decision maker’s prior information. In this paper, we propose … Read more
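
In its generic form (notation assumed here for illustration, not taken from the abstract), a distributionally robust program hedges against the worst distribution in the ambiguity set $\mathcal{P}$:

$$
\min_{x \in X} \; \sup_{\mathbb{P} \in \mathcal{P}} \; \mathbb{E}_{\mathbb{P}} \big[ h(x, \xi) \big],
$$

where $\xi$ is the uncertain problem data, $h(x, \xi)$ is the cost of decision $x$ under realization $\xi$, and $\mathcal{P}$ collects all distributions compatible with the decision maker's prior information (e.g., known support, moment bounds, or structural properties).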

Sparse Recovery on Euclidean Jordan Algebras

We consider the sparse recovery problem on Euclidean Jordan algebras (SREJA), which includes sparse signal recovery and low-rank symmetric matrix recovery as special cases. We introduce the restricted isometry property (RIP), the null space property (NSP), and $s$-goodness for linear transformations in $s$-sparse element recovery on Euclidean Jordan algebras, all of which provide sufficient conditions for … Read more
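
As a concrete anchor for one of these conditions, here is the null space property in the classical vector special case (a standard statement, assumed rather than quoted from the paper): a matrix $A$ satisfies the NSP of order $s$ if

$$
\|h_S\|_1 < \|h_{S^c}\|_1 \quad \text{for all } h \in \ker(A) \setminus \{0\} \text{ and all } S \text{ with } |S| \leq s,
$$

where $h_S$ keeps the entries of $h$ indexed by $S$ and zeroes the rest. This guarantees that every $s$-sparse vector is the unique $\ell_1$-minimizer consistent with its measurements; the SREJA setting replaces coordinate sparsity with its spectral analogue on the algebra.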

Effectiveness-Equity Models for Facility Location Problems on Tree Networks

We propose models to investigate effectiveness-equity tradeoffs in tree network facility location problems. We take the widely used median objective as a measure of effectiveness and the Gini index as a measure of (in)equity, and formulate bicriteria problems involving these objectives. We develop procedures to identify an efficient set of solutions to these problems, analyze … Read more
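
A small illustration of the (in)equity measure: the sketch below computes the Gini index of a vector of client-to-facility distances (the function name and interface are illustrative choices, not the paper's).

```python
import numpy as np

def gini_index(distances):
    """Gini index of a vector of client-to-facility distances.

    0 means perfect equity (all clients equally far); values near 1
    indicate highly unequal service.  Uses the mean-absolute-difference
    form G = sum_{i,j} |d_i - d_j| / (2 n^2 mean(d)).
    """
    d = np.asarray(distances, dtype=float)
    n = d.size
    mad = np.abs(d[:, None] - d[None, :]).sum()  # sum of all pairwise gaps
    return mad / (2.0 * n * n * d.mean())

# A placement can be effective (low total distance) yet inequitable:
# three nearby clients and one far-away client give a Gini index of 0.5.
print(gini_index([1.0, 1.0, 1.0, 9.0]))
```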

S-semigoodness for Low-Rank Semidefinite Matrix Recovery

We extend and characterize the concept of $s$-semigoodness for a sensing matrix in sparse nonnegative recovery (proposed by Juditsky, Karzan, and Nemirovski [Math. Program., 2011]) to linear transformations in low-rank semidefinite matrix recovery. We show that $s$-semigoodness is not only a necessary and sufficient condition for exact $s$-rank semidefinite matrix recovery by a … Read more

Augmented Lagrangian and Alternating Direction Methods for Convex Optimization: A Tutorial and Some Illustrative Computational Results

The alternating direction method of multipliers (ADMM) is a form of augmented Lagrangian algorithm that has experienced a renaissance in recent years, owing to its applicability to optimization problems arising from “big data” and image processing applications, and to the relative ease with which it may be implemented in parallel and distributed computational environments. This paper aims … Read more
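
As a concrete instance of the method being surveyed, here is a minimal scaled-form ADMM for the lasso problem $\min_x \tfrac{1}{2}\|Ax - b\|^2 + \lambda \|x\|_1$ (the splitting, parameter defaults, and iteration count are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Scaled-form ADMM for min 0.5*||Ax - b||^2 + lam*||x||_1.

    Splits the objective as f(x) + g(z) subject to x = z; each pass
    does an x-minimization (a linear solve), a z-minimization (soft
    thresholding), and a dual ascent step on the scaled multiplier u.
    """
    n = A.shape[1]
    x = z = u = np.zeros(n)
    # The x-update solves the same linear system every pass, so factor once.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z
    return z
```

Amortizing a single Cholesky factorization across all iterations is one reason the x-update stays cheap, which in turn is part of why ADMM scales to the large problems the tutorial targets.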

A Douglas-Rachford type primal-dual method for solving inclusions with mixtures of composite and parallel-sum type monotone operators

In this paper we propose two different primal-dual splitting algorithms for solving inclusions involving mixtures of composite and parallel-sum type monotone operators. Both rely on an inexact Douglas-Rachford splitting method, applied in different underlying Hilbert spaces. Most importantly, the algorithms allow one to process the bounded linear operators and the set-valued operators occurring in the … Read more
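
For readers unfamiliar with the parallel sum: for set-valued monotone operators $B$ and $D$ it is defined by $B \,\square\, D = (B^{-1} + D^{-1})^{-1}$. A representative primal inclusion of the mixed type treated here (a generic special case, assumed for illustration; the paper's formulation may carry additional shifts and a dual counterpart) reads

$$
0 \in Ax + \sum_{i=1}^{m} L_i^* \big( (B_i \,\square\, D_i)(L_i x) \big),
$$

with $A$, $B_i$, $D_i$ maximally monotone and $L_i$ bounded linear operators between Hilbert spaces.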

Convergence analysis of the Peaceman-Rachford splitting method for nonsmooth convex optimization

In this paper, we focus on the convergence analysis of the Peaceman-Rachford splitting method applied to a convex minimization model whose objective function is the sum of a smooth and a nonsmooth convex function. A sublinear convergence rate, in terms of the worst-case $O(1/t)$ iteration complexity, is established if the gradient of the smooth … Read more
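
In operator form (standard notation, assumed rather than quoted), the Peaceman-Rachford splitting for $\min_x f(x) + g(x)$ iterates the composition of two reflected proximal maps:

$$
z^{k+1} = \big( 2\,\mathrm{prox}_{\gamma g} - I \big)\big( 2\,\mathrm{prox}_{\gamma f} - I \big)\, z^{k},
\qquad x^{k} = \mathrm{prox}_{\gamma f}\, z^{k}.
$$

Unlike its Douglas-Rachford counterpart, this composition is merely nonexpansive rather than averaged, which is why convergence requires extra assumptions such as the gradient condition the abstract alludes to.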

Parallel Coordinate Descent Methods for Big Data Optimization

In this work we show that randomized (block) coordinate descent methods can be accelerated by parallelization when applied to the problem of minimizing the sum of a partially separable smooth convex function and a simple separable convex function. The theoretical speedup over the serial method, in terms of the number of iterations needed … Read more
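
A toy sketch of the scheme's shape on a smooth quadratic (the damping constant below is a crude conservative stand-in for the partial-separability-based step sizes analyzed in the paper, and every name is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

def parallel_coordinate_descent(A, b, tau=4, iters=500):
    """Randomized parallel coordinate descent for 0.5 * ||Ax - b||^2.

    Each iteration samples tau coordinates uniformly at random and
    updates them simultaneously with damped exact coordinate steps.
    The damping beta = tau is a conservative assumption standing in
    for the separability-based constants of the analysis.
    """
    n = A.shape[1]
    col_norms = (A ** 2).sum(axis=0)   # per-coordinate Lipschitz constants
    beta = float(tau)
    x = np.zeros(n)
    r = A @ x - b                      # residual Ax - b, kept in sync below
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)
        g = A[:, S].T @ r              # partial gradient on the sampled block
        step = -g / (beta * col_norms[S])
        x[S] += step
        r += A[:, S] @ step            # update residual without recomputing Ax
    return x

# Example: recover a dense least-squares solution on random data.
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
print(np.linalg.norm(A @ parallel_coordinate_descent(A, b) - b))
```

Keeping the residual in sync incrementally is what makes each parallel pass touch only the sampled columns, mirroring the per-iteration cost model the speedup analysis relies on.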