A New Self-Dual Embedding Method for Convex Programming

In this paper we introduce a conic optimization formulation for inequality-constrained convex programming, and propose a self-dual embedding model for solving the resulting conic optimization problem. The primal and dual cones in this formulation are characterized by the original constraint functions and their corresponding conjugate functions respectively. Hence they are completely symmetric. This allows for …
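As a rough illustration of how a convex inequality can be turned into a conic constraint (the construction used in the paper may differ in its details), one can attach to each constraint function $f_i$ the closed cone generated by the epigraph of its perspective; the dual cone then takes the same form with $f_i$ replaced by its conjugate $f_i^*$, which makes the announced primal-dual symmetry visible:

\[
K_i = \operatorname{cl}\bigl\{(x,t,s) : t > 0,\ t\,f_i(x/t) \le s \bigr\},
\qquad
K_i^{*} = \operatorname{cl}\bigl\{(y,u,v) : v > 0,\ v\,f_i^{*}(-y/v) \le u \bigr\}.
\]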

Complexity of Convex Optimization using Geometry-based Measures and a Reference Point

Our concern lies in solving the following convex optimization problem: minimize $cx$ subject to $Ax=b$, $x \in P$, where $P$ is a closed convex set, not necessarily a cone. We bound the complexity of computing an almost-optimal solution of this problem in terms of natural geometry-based measures of the feasible region and the level-set of …
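One concrete example of a geometry-based measure involving a reference point $x^0$ (recalled here as background; the measures actually used in the paper may differ) is the symmetry of $x^0$ in the feasible region $S = \{x \in P : Ax = b\}$:

\[
\operatorname{sym}(x^0, S) := \max\bigl\{\alpha \ge 0 : x^0 + \alpha\,(x^0 - y) \in S \ \text{for all } y \in S \bigr\},
\]

which equals $1$ when $x^0$ is a centre of symmetry of $S$ and tends to $0$ as $x^0$ approaches the boundary.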

On the Primal-Dual Geometry of Level Sets in Linear and Conic Optimization

For a conic optimization problem: minimize $cx$ subject to $Ax=b$, $x \in C$, we present a geometric relationship between the maximum norms of the level sets of the primal and the inscribed sizes of the level sets of the dual (or the other way around).

Citation: MIT Operations Research Center Working Paper
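With $z^*$ denoting the optimal value, the objects involved can be written as follows (the precise definitions are assumed here for concreteness, following the description above):

\[
P_\varepsilon = \{x : Ax=b,\ x \in C,\ cx \le z^* + \varepsilon\},
\qquad
R_\varepsilon = \max\{\|x\| : x \in P_\varepsilon\},
\]

with the dual $\delta$-level set $D_\delta$ measured instead by the radius $r_\delta$ of the largest ball it contains; the relationship ties $R_\varepsilon$ for one problem to $r_\delta$ for its dual, and vice versa.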

An Attractor-Repeller Approach to Floorplanning

The floorplanning (or facility layout) problem consists in finding the optimal positions for a given set of modules of fixed area (but perhaps varying dimensions) within a facility such that the distances between pairs of modules that have a positive connection cost are minimized. This is a hard discrete optimization problem; even the restricted version …
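The attraction/repulsion idea behind such continuous relaxations can be sketched in a few lines of Python. The functional forms below (quadratic attraction weighted by connection cost, inverse-square repulsion to discourage overlap) and the names attractor_repeller_objective, C and w are illustrative assumptions, not the model analysed in the paper.

import numpy as np

def attractor_repeller_objective(X, C, w=1.0):
    # X: (n, 2) array of module centre coordinates.
    # C: (n, n) symmetric matrix of nonnegative connection costs.
    # w: weight of the repeller term (assumed form, for illustration only).
    n = X.shape[0]
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d2 = float(np.sum((X[i] - X[j]) ** 2)) + 1e-12  # guard against zero distance
            total += C[i, j] * d2   # attractor: pull connected modules together
            total += w / d2         # repeller: keep modules from collapsing onto one point
    return total

# Tiny usage example: three modules, one strong and one weak connection.
rng = np.random.default_rng(0)
X0 = rng.uniform(0.0, 10.0, size=(3, 2))
C = np.array([[0.0, 5.0, 0.0],
              [5.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
print(attractor_repeller_objective(X0, C))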

A conic formulation for $l_p$-norm optimization

In this paper, we formulate the $l_p$-norm optimization problem as a conic optimization problem, derive its standard duality properties and show it can be solved in polynomial time. We first define an ad hoc closed convex cone, study its properties and derive its dual. This allows us to express the standard $l_p$-norm optimization primal problem …
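The cone introduced in the paper is tailored to $l_p$-norm optimization with possibly different exponents in different terms; a simplified illustration of the same principle is the $p$-norm cone, whose dual is the norm cone of the conjugate exponent:

\[
K_p = \bigl\{(x,t) \in \mathbb{R}^n \times \mathbb{R} : \|x\|_p \le t \bigr\},
\qquad
K_p^{*} = K_q, \qquad \frac{1}{p} + \frac{1}{q} = 1, \qquad 1 \le p \le \infty.
\]

Minimizing $\|Ax-b\|_p$, for instance, then amounts to minimizing $t$ subject to $(Ax-b,\,t) \in K_p$ together with any remaining linear constraints.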

Improving complexity of structured convex optimization problems using self-concordant barriers

The purpose of this paper is to provide improved complexity results for several classes of structured convex optimization problems using the theory of self-concordant functions developed in [2]. We describe the classical short-step interior-point method and optimize its parameters in order to provide the best possible iteration bound. We also discuss the necessity of …
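For reference, the two standard inequalities behind this theory, and the resulting iteration bound for the short-step method, are as follows (textbook background rather than the paper's new material). A $\nu$-self-concordant barrier $F$ satisfies

\[
\bigl|D^3 F(x)[h,h,h]\bigr| \le 2\,\bigl(D^2 F(x)[h,h]\bigr)^{3/2},
\qquad
\bigl|D F(x)[h]\bigr| \le \sqrt{\nu}\,\bigl(D^2 F(x)[h,h]\bigr)^{1/2},
\]

and a short-step path-following method then needs $O\bigl(\sqrt{\nu}\,\log(\nu\mu^0/\varepsilon)\bigr)$ Newton iterations to reach an $\varepsilon$-accurate solution, so lowering the barrier parameter $\nu$ available for a structured problem class directly lowers its complexity bound.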

Proving strong duality for geometric optimization using a conic formulation

Geometric optimization is an important class of problems that has many applications, especially in engineering design. In this article, we provide new simplified proofs for the well-known associated duality theory, using conic optimization. After introducing suitable convex cones and studying their properties, we model geometric optimization problems with a conic formulation, which allows us to …
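As standard background (with notation assumed here rather than taken from the article), a posynomial geometric program minimizes $g_0(x)$ subject to $g_j(x) \le 1$ and $x > 0$, where each $g_j(x) = \sum_k c_{jk} \prod_i x_i^{a_{jki}}$ with $c_{jk} > 0$; the change of variables $x_i = e^{y_i}$ turns it into the convex problem

\[
\min_y \ \log \sum_k e^{a_{0k}^{T} y + b_{0k}}
\quad \text{s.t.} \quad
\log \sum_k e^{a_{jk}^{T} y + b_{jk}} \le 0, \qquad b_{jk} = \log c_{jk},
\]

and it is this log-sum-exp structure that lends itself to a formulation over suitable convex cones and hence to conic duality arguments.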

A practical general approximation criterion for methods of multipliers based on Bregman distances

This paper demonstrates that for generalized methods of multipliers for convex programming based on Bregman distance kernels (including the classical quadratic method of multipliers), the minimization of the augmented Lagrangian can be truncated using a simple, generally implementable stopping criterion based only on the norms of the primal iterate and the gradient (or …
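As a minimal sketch of the setting (the paper's criterion covers general Bregman kernels; the toy problem, step size and tolerance schedule below are assumptions for illustration only), here is the classical quadratic method of multipliers with the inner minimization truncated by a simple gradient-norm test:

import numpy as np

# Toy problem: minimize f(x) = ||x - (2, 2)||^2  subject to  g(x) = x1 + x2 - 1 <= 0.
# The exact solution is x* = (0.5, 0.5) with multiplier lambda* = 3.
def f_grad(x):
    return 2.0 * (x - np.array([2.0, 2.0]))

def g(x):
    return x[0] + x[1] - 1.0

g_grad = np.array([1.0, 1.0])

def aug_lag_grad(x, lam, c):
    # Gradient of the classical quadratic augmented Lagrangian for one inequality constraint.
    return f_grad(x) + max(0.0, lam + c * g(x)) * g_grad

x, lam, c = np.zeros(2), 0.0, 10.0
for k in range(30):
    inner_tol = 1.0 / (k + 1) ** 2   # progressively tighter inexactness
    # Inexact inner minimization: plain gradient descent, truncated as soon as the
    # augmented-Lagrangian gradient is small (a simplified stand-in for the paper's
    # norm-based stopping criterion).
    for _ in range(10000):
        grad = aug_lag_grad(x, lam, c)
        if np.linalg.norm(grad) <= inner_tol:
            break
        x = x - 0.01 * grad
    lam = max(0.0, lam + c * g(x))   # multiplier update

print(x, lam)   # approaches (0.5, 0.5) and 3.0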

A BFGS-IP algorithm for solving strongly convex optimization problems with feasibility enforced by an exact penalty approach

This paper introduces and analyses a new algorithm for minimizing a convex function subject to a finite number of convex inequality constraints. It is assumed that the Lagrangian of the problem is strongly convex. The algorithm combines interior point methods for dealing with the inequality constraints and quasi-Newton techniques for accelerating the convergence. Feasibility of …
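For context, the two standard building blocks combined here can be written compactly (background only, not the paper's precise iteration): the inequality constraints $c_i(x) \le 0$ are handled through the log-barrier subproblem

\[
\min_x \; f(x) - \mu \sum_i \log\bigl(-c_i(x)\bigr), \qquad \mu > 0,
\]

while the quasi-Newton acceleration maintains a BFGS approximation $B_k$ of the Hessian of the Lagrangian $\ell(\cdot,\lambda)$, updated with $s_k = x_{k+1} - x_k$ and $y_k = \nabla_x \ell(x_{k+1},\lambda) - \nabla_x \ell(x_k,\lambda)$ via

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k} + \frac{y_k y_k^{T}}{y_k^{T} s_k}.
\]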

Newton Algorithms for Large-Scale Strictly Convex Separable Network Optimization

In this work we summarize the basic elements of primal and dual Newton algorithms for network optimization with continuously differentiable (strictly) convex arc cost functions. Both the basic mathematics and the implementation are discussed, and hints at important tuning details are given. The exposition assumes that the reader possesses a significant level of prior knowledge in …
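The generic problem class and its dual can be stated as follows (notation assumed here; the paper may use different conventions). Over a network with nodes $\mathcal{N}$, arcs $\mathcal{A}$ and supplies $b_i$, the primal problem is

\[
\min \; \sum_{(i,j)\in\mathcal{A}} f_{ij}(x_{ij})
\quad \text{s.t.} \quad
\sum_{j:(i,j)\in\mathcal{A}} x_{ij} \;-\; \sum_{j:(j,i)\in\mathcal{A}} x_{ji} \;=\; b_i, \qquad i \in \mathcal{N},
\]

and its Lagrangian dual in the node prices $\pi$ is

\[
\max_{\pi} \; \sum_{i\in\mathcal{N}} b_i \pi_i \;-\; \sum_{(i,j)\in\mathcal{A}} f_{ij}^{*}(\pi_i - \pi_j),
\]

which is differentiable under the strict convexity assumed here, so Newton steps can be applied either to the primal problem with its flow-conservation constraints or to this dual.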