Improving complexity of structured convex optimization problems using self-concordant barriers

The purpose of this paper is to provide improved complexity results for several classes of structured convex optimization problems using the theory of self-concordant functions developed in [2]. We describe the classical short-step interior-point method and optimize its parameters in order to provide the best possible iteration bound. We also discuss the necessity of …
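
For background, and not as a claim about the paper's sharpened constants, the standard definitions this line of work rests on can be sketched as follows (well-known material in our notation):

```latex
% A convex function F is a \nu-self-concordant barrier if, for all x in
% its domain and all directions h,
\[
  \bigl| D^3 F(x)[h,h,h] \bigr| \;\le\; 2 \bigl( D^2 F(x)[h,h] \bigr)^{3/2},
  \qquad
  \bigl| D F(x)[h] \bigr| \;\le\; \sqrt{\nu}\, \bigl( D^2 F(x)[h,h] \bigr)^{1/2}.
\]
% The classical short-step path-following method equipped with such a
% barrier reaches accuracy \epsilon from an initial centering parameter
% \mu_0 in
\[
  O\!\left( \sqrt{\nu}\, \log \frac{\nu \mu_0}{\epsilon} \right)
\]
% iterations; tuning the step-size and proximity parameters affects the
% constant hidden in the O(.), which is what the paper optimizes.
```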

Exploiting Sparsity in Semidefinite Programming via Matrix Completion II: Implementation and Numerical Results

In Part I of this series of articles, we introduced a general framework for exploiting the aggregate sparsity pattern over all data matrices of large-scale, sparse semidefinite programs (SDPs) when solving them by primal-dual interior-point methods. This framework is based on results about positive semidefinite matrix completion, and it can be embodied …
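
As a rough illustration of the first step of such a framework (a sketch of ours, not the authors' implementation; the function name is made up), the aggregate sparsity pattern is simply the union of the nonzero patterns of all data matrices:

```python
import numpy as np
import scipy.sparse as sp

def aggregate_sparsity(C, A_list, tol=0.0):
    """Return the union of the nonzero patterns of the objective matrix C
    and the constraint matrices A_1, ..., A_m as a boolean sparse matrix.
    The completion-based framework only needs the primal matrix variable
    on this pattern (plus fill-in from a chordal extension of it)."""
    pattern = sp.csr_matrix((np.abs(C) > tol).astype(float))
    for A in A_list:
        pattern = pattern + sp.csr_matrix((np.abs(A) > tol).astype(float))
    return pattern.astype(bool)

# Tiny example: 4x4 data matrices that only couple entries (0,1) and (2,3).
C = np.eye(4)
A1 = np.zeros((4, 4)); A1[0, 1] = A1[1, 0] = 1.0
A2 = np.zeros((4, 4)); A2[2, 3] = A2[3, 2] = 1.0
print(aggregate_sparsity(C, [A1, A2]).toarray().astype(int))
```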

Proving strong duality for geometric optimization using a conic formulation

Geometric optimization is an important class of problems that has many applications, especially in engineering design. In this article, we provide new simplified proofs for the well-known associated duality theory, using conic optimization. After introducing suitable convex cones and studying their properties, we model geometric optimization problems with a conic formulation, which allows us to …
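
For readers outside the area, the classical route from geometric to convex optimization is the logarithmic change of variables below; this is standard background, and the specific convex cones introduced in the article are not reproduced here.

```latex
% A posynomial constraint in positive variables x_1, ..., x_n,
\[
  \sum_{k=1}^{K} c_k\, x_1^{a_{1k}} \cdots x_n^{a_{nk}} \;\le\; 1,
  \qquad c_k > 0,
\]
% becomes convex under the substitution y_i = \log x_i, b_k = \log c_k:
\[
  \log \sum_{k=1}^{K} \exp\bigl( a_k^{\top} y + b_k \bigr) \;\le\; 0 .
\]
% The left-hand side is a log-sum-exp function, hence convex, and it is
% this form that admits a conic representation and a clean duality theory.
```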

On Robust Optimization of Two-Stage Systems

Robust optimization extends stochastic programming models by incorporating measures of variability into the objective function. This paper explores robust optimization in the context of two-stage planning systems. First, we propose the use of a generalized Benders decomposition algorithm for solving robust models. Next, we argue that using an arbitrary measure for variability can lead to …
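
One common way to write such a model, in illustrative notation of ours rather than necessarily the paper's, is the scenario formulation below.

```latex
% Scenarios s = 1, ..., S with probabilities p_s, first-stage decision x,
% recourse decisions y_s, and a variability penalty weighted by \lambda:
\[
  \min_{x,\, y_1, \dots, y_S} \;
  c^{\top} x + \sum_{s=1}^{S} p_s\, q_s^{\top} y_s
  + \lambda\, \rho\bigl( q_1^{\top} y_1, \dots, q_S^{\top} y_S \bigr)
\]
\[
  \text{s.t.} \quad A x = b, \qquad T_s x + W y_s = h_s, \qquad
  x \ge 0, \; y_s \ge 0, \quad s = 1, \dots, S,
\]
% where \rho measures the variability of the recourse costs (for example
% a variance-like term). An ill-chosen \rho can destroy convexity or the
% decomposable structure that Benders decomposition relies on.
```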

A practical general approximation criterion for methods of multipliers based on Bregman distances

This paper demonstrates that for generalized methods of multipliers for convex programming based on Bregman distance kernels — including the classical quadratic method of multipliers — the minimization of the augmented Lagrangian can be truncated using a simple, generally implementable stopping criterion based only on the norms of the primal iterate and the gradient (or …
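
The following is a generic Python sketch, in our own notation, of what truncating the inner minimization by a gradient-norm test looks like for the classical quadratic method of multipliers; it illustrates the idea of an implementable stopping rule but does not reproduce the paper's specific criterion.

```python
import numpy as np

# Sketch of an inexact quadratic method of multipliers for
#   min f(x)  s.t.  g(x) <= 0,
# where the inner minimization of the augmented Lagrangian is truncated
# by a gradient-norm test. Illustrative only, not the paper's criterion.

def almin(f_grad, g, g_jac, x, lam, c, inner_tol, max_inner=500, step=1e-2):
    """Approximately minimize the augmented Lagrangian
    L_c(x, lam) = f(x) + (1/(2c)) * sum(max(0, lam + c g(x))^2 - lam^2)
    by gradient descent, stopping once the gradient norm <= inner_tol."""
    for _ in range(max_inner):
        mult = np.maximum(0.0, lam + c * g(x))    # multiplier estimate
        grad = f_grad(x) + g_jac(x).T @ mult      # grad of aug. Lagrangian
        if np.linalg.norm(grad) <= inner_tol:     # truncation test
            break
        x = x - step * grad
    return x, mult

# Toy problem: min (x1-2)^2 + (x2-2)^2  s.t.  x1 + x2 <= 1.
f_grad = lambda x: 2.0 * (x - 2.0)
g = lambda x: np.array([x[0] + x[1] - 1.0])
g_jac = lambda x: np.array([[1.0, 1.0]])

x, lam = np.zeros(2), np.zeros(1)
for k in range(50):
    # Tighten the inner tolerance along the outer iterations.
    x, mult = almin(f_grad, g, g_jac, x, lam, c=10.0,
                    inner_tol=1.0 / (k + 1) ** 2)
    lam = mult                                    # multiplier update
print(x)   # expect roughly (0.5, 0.5)
```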

Convex optimization problems involving finite autocorrelation sequences

We discuss convex optimization problems where some of the variables are constrained to be finite autocorrelation sequences. Problems of this form arise in signal processing and communications, and we describe applications in filter design and system identification. Autocorrelation constraints in optimization problems are often approximated by sampling the corresponding power spectral density, which results in …
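
To make the sampling approximation concrete (in our notation): the power spectral density of a finite sequence $r_0,\dots,r_n$ is $R(\omega) = r_0 + 2\sum_{k=1}^{n} r_k \cos k\omega$, and imposing $R(\omega_j) \ge 0$ on a frequency grid yields linear inequalities in $r$:

```python
import numpy as np

def psd_sampling_constraints(n, m):
    """Matrix G with rows g_j so that G @ r >= 0 encodes
    R(w_j) = r_0 + 2 * sum_k r_k cos(k w_j) >= 0 at m grid frequencies
    w_j in [0, pi], where r = (r_0, ..., r_n) is the candidate sequence."""
    w = np.linspace(0.0, np.pi, m)
    k = np.arange(n + 1)
    G = np.cos(np.outer(w, k))        # G[j, k] = cos(k * w_j)
    G[:, 1:] *= 2.0                   # factor 2 for k >= 1
    return G

# Check on a true autocorrelation sequence: r_k = sum_t h_t h_{t+k}.
h = np.array([1.0, -0.5, 0.25])
r = np.array([np.dot(h[:len(h) - k], h[k:]) for k in range(len(h))])
G = psd_sampling_constraints(n=len(h) - 1, m=200)
print((G @ r >= -1e-12).all())        # True: sampled PSD is nonnegative
```

Nonnegativity on the grid does not guarantee nonnegativity between grid points, which is why this sampled formulation is only an approximation.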

Handling Nonnegative Constraints in Spectral Estimation

We consider convex optimization problems with the constraint that the variables form a finite autocorrelation sequence, or equivalently, that the corresponding power spectral density is nonnegative. This constraint is often approximated by sampling the power spectral density, which results in a set of linear inequalities. It can also be cast as a linear matrix inequality …
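
The exact characterization underlying the LMI route is the standard diagonal-sum (trace) parameterization, stated here in our notation; the paper's precise form may differ.

```latex
% r = (r_0, ..., r_n) is a finite autocorrelation sequence if and only
% if there exists a positive semidefinite X of order n+1 with
\[
  r_k \;=\; \sum_{i=1}^{n+1-k} X_{i,\,i+k}, \qquad k = 0, 1, \dots, n,
\]
% i.e. r_k equals the sum of the k-th superdiagonal of X \succeq 0.
% (If X = h h^T, this sum is exactly \sum_i h_i h_{i+k}.) Unlike
% sampling, this formulation is exact, at the price of the extra
% matrix variable X.
```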

Non-Convergence Result for Conformal Approximation of Variational Problems Subject to a Convexity Constraint

In this article, we are interested in the minimization of functionals over the set of convex functions. We investigate the discretization of the convexity constraint through various numerical methods and identify a geometrical obstruction, confirmed by numerical simulations. We prove that there exist convex functions that cannot be the limit of any conformal $P_1$ Finite …

Generalized Goal Programming: Polynomial Methods and Applications

In this paper we address a general Goal Programming problem with linear objectives, convex constraints, and an arbitrary componentwise nondecreasing norm to aggregate deviations with respect to targets. In particular, classical Linear Goal Programming problems, as well as several models in Location and Regression Analysis are modeled within this framework. In spite of its generality, …
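
One concrete way to write the problem the abstract describes, in illustrative notation of ours, is shown below.

```latex
% Linear objectives a_i^T x with targets t_i, a convex feasible set S,
% and a componentwise nondecreasing norm \gamma aggregating the
% (one-sided) deviations:
\[
  \min_{x \in S} \;
  \gamma\Bigl( \bigl( a_1^{\top} x - t_1 \bigr)_{+}, \dots,
               \bigl( a_p^{\top} x - t_p \bigr)_{+} \Bigr),
  \qquad (u)_{+} = \max(u, 0).
\]
% \gamma = \|.\|_1 recovers classical linear goal programming,
% \gamma = \|.\|_\infty penalizes the worst deviation, and weighted
% norms cover models from location and regression analysis.
```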

Newton Algorithms for Large-Scale Strictly Convex Separable Network Optimization

In this work we summarize the basic elements of primal and dual Newton algorithms for network optimization with continuously differentiable (strictly) convex arc cost functions. Both the basic mathematics and the implementation are discussed, with pointers to important tuning details. The exposition assumes that the reader possesses a significant level of prior knowledge in …
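
As standard background in our notation (not the paper's specific algorithms), the dual Newton step for this problem class takes the following form.

```latex
% With node-arc incidence matrix A, supplies b, and strictly convex,
% differentiable arc costs f_{ij}, the primal problem is
\[
  \min_{x} \; \sum_{(i,j)} f_{ij}(x_{ij})
  \qquad \text{s.t.} \quad A x = b .
\]
% For node potentials \pi, each arc flow is recovered independently as
% x_{ij}(\pi) = (f'_{ij})^{-1}(\pi_i - \pi_j), and a Newton step on the
% dual solves the weighted-Laplacian system
\[
  A\, D(\pi)\, A^{\top} \Delta\pi \;=\; b - A\, x(\pi),
  \qquad
  D(\pi) = \operatorname{diag}\!\Bigl( 1 / f''_{ij}\bigl(x_{ij}(\pi)\bigr) \Bigr),
\]
% which is where the network structure (the sparsity of A) pays off.
```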