Generalized Ellipsoids

We introduce a family of symmetric convex bodies called generalized ellipsoids of degree \(d\) (GE-\(d\)s), with ellipsoids corresponding to the case of \(d=0\). Generalized ellipsoids (GEs) retain many geometric, algebraic, and algorithmic properties of ellipsoids. We show that the conditions that the parameters of a GE must satisfy can be checked in strongly polynomial time, …

The Role of Level-Set Geometry on the Performance of PDHG for Conic Linear Optimization

We consider solving huge-scale instances of (convex) conic linear optimization problems, at the scale where matrix-factorization-free methods are attractive or necessary. The restarted primal-dual hybrid gradient method (rPDHG), with heuristic enhancements and GPU implementation, has been very successful in solving huge-scale linear programming (LP) problems; however, its application to more general conic convex …

A Clustering-based uncertainty set for Robust Optimization

Robust Optimization (RO) is an approach to tackle uncertainties in the parameters of an optimization problem. Constructing an uncertainty set is crucial for RO, as it determines the quality and the conservativeness of the solutions. In this paper, we introduce an approach for constructing a data-driven uncertainty set through volume-based clustering, which we call Minimum-Volume …

Safely Learning Dynamical Systems

A fundamental challenge in learning an unknown dynamical system is to reduce model uncertainty by making measurements while maintaining safety. In this work, we formulate a mathematical definition of what it means to safely learn a dynamical system by sequentially deciding where to initialize the next trajectory. In our framework, the state of the system …

Solving Two-Trust-Region Subproblems using Semidefinite Optimization with Eigenvector Branching

Semidefinite programming (SDP) relaxations typically replace the nonconvex condition X = xx', where x is an n-vector, with the convex constraint that X - xx' is positive semidefinite. In this paper we consider a new hyperplane branching method for SDP based on using an eigenvector of X - xx'. This branching technique is related to previous work of Saxena, …
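The gap between X and xx' drives the branching idea above. A minimal sketch (not the paper's implementation): given a candidate pair (X, x) from an SDP relaxation, verify that X - xx' is PSD and extract its top eigenvector, the direction in which X deviates most from the rank-one matrix xx' and a natural candidate for a branching hyperplane. The function name is illustrative.

```python
import numpy as np

def branching_eigenvector(X, x, tol=1e-9):
    """Return (largest eigenvalue, eigenvector) of M = X - x x'.

    A positive largest eigenvalue means the relaxation is inexact
    (X != x x'); its eigenvector is a candidate branching direction.
    """
    M = X - np.outer(x, x)
    eigvals, eigvecs = np.linalg.eigh(M)  # eigh: M is symmetric
    # Validity of the relaxation requires M to be PSD (up to tolerance).
    assert eigvals[0] >= -tol, "X - xx' must be PSD"
    return eigvals[-1], eigvecs[:, -1]

# Toy instance: relaxation gap of 0.5 concentrated in the first coordinate.
x = np.array([1.0, 2.0])
X = np.outer(x, x) + np.diag([0.5, 0.0])
gap, v = branching_eigenvector(X, x)
```

Here `gap` is 0.5 and `v` is (up to sign) the first coordinate axis, so branching would split along that direction.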

An extension of the Reformulation-Linearization Technique to nonlinear optimization

We introduce a novel Reformulation-Perspectification Technique (RPT) to obtain convex approximations of nonconvex continuous optimization problems. RPT consists of two steps: a reformulation step and a perspectification step. The reformulation step generates redundant nonconvex constraints from pairwise multiplication of the existing constraints. The perspectification step then convexifies the nonconvex components by using perspective …
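The reformulation step can be sketched concretely for two linear constraints. This is an illustration of pairwise constraint multiplication in the classical RLT style, not the paper's code: from b - a'x >= 0 and d - c'x >= 0, their product (b - a'x)(d - c'x) >= 0 is a redundant quadratic constraint whose coefficients can be read off directly.

```python
import numpy as np

def pairwise_product(a, b, c, d):
    """Coefficients of (b - a'x)(d - c'x) = const + lin'x + x'Qx.

    Returns (const, lin, Q) with Q symmetrized, since x'Qx only
    depends on the symmetric part of Q.
    """
    Q = np.outer(a, c)
    Q = 0.5 * (Q + Q.T)          # symmetrize without changing x'Qx
    lin = -(b * c + d * a)       # cross terms from expanding the product
    return b * d, lin, Q

# Example: constraints x1 <= 1 and x2 <= 2, i.e. 1 - x1 >= 0, 2 - x2 >= 0.
const, lin, Q = pairwise_product(np.array([1.0, 0.0]), 1.0,
                                 np.array([0.0, 1.0]), 2.0)
```

Linearizing the quadratic term x x' into a matrix variable X then makes the product constraint linear in (x, X), which is where the perspectification step takes over.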

Safely Learning Dynamical Systems from Short Trajectories

A fundamental challenge in learning to control an unknown dynamical system is to reduce model uncertainty by making measurements while maintaining safety. In this work, we formulate a mathematical definition of what it means to safely learn a dynamical system by sequentially deciding where to initialize the next trajectory. In our framework, the state of …

Towards practical generic conic optimization

Many convex optimization problems can be represented through conic extended formulations with auxiliary variables and constraints using only the small number of standard cones recognized by advanced conic solvers such as MOSEK 9. Such extended formulations are often significantly larger and more complex than equivalent conic natural formulations, which can use a much broader class …

Data-Driven Two-Stage Conic Optimization with Zero-One Uncertainties

We address high-dimensional zero-one random parameters in two-stage convex conic optimization problems. Such parameters typically represent failures of network elements and constitute rare, high-impact random events in several applications. Given a sparse training dataset of the parameters, we motivate and study a distributionally robust formulation of the problem using a Wasserstein ambiguity set centered at …

Gaddum’s test for symmetric cones

A real symmetric matrix A is copositive if the inner product of Ax and x is nonnegative for all x in the nonnegative orthant. Copositive programming has attracted a lot of attention since Burer showed that hard nonconvex problems can be formulated as completely-positive programs. Alas, the power of copositive programming is offset by its …
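The copositivity definition above is easy to probe numerically, even though deciding it exactly is hard in general. A minimal sketch (illustrative only, not a test from the paper): sample points in the nonnegative orthant and evaluate x'Ax. Passing all samples proves nothing, but any negative value certifies that A is not copositive.

```python
import numpy as np

def maybe_copositive(A, n_samples=10_000, seed=0):
    """Heuristic check: x'Ax >= 0 for sampled x in the nonnegative orthant.

    Returns False with a certificate-by-sample if a violating x is found;
    True only means no violation was observed, not a proof of copositivity.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = rng.random((n_samples, n))             # nonnegative sample points
    vals = np.einsum("si,ij,sj->s", X, A, X)   # all x'Ax at once
    return bool(vals.min() >= 0)

A_pos = np.array([[1.0, 2.0], [2.0, 1.0]])    # copositive: all entries >= 0
A_neg = np.array([[1.0, -3.0], [-3.0, 1.0]])  # not copositive: x=(1,1) gives -4
```

Note that A_neg is indefinite yet would be accepted by a naive "entrywise nonnegative or PSD" test only if one forgets the mixed term; copositivity sits strictly between nonnegativity of entries and positive semidefiniteness, which is what makes checking it interesting.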