Distributionally Robust Optimization with Integer Recourse: Convex Reformulations and Critical Recourse Decisions

The paper studies distributionally robust optimization models with integer recourse. We develop a unified framework that provides finite, tight convex relaxations under conic moment-based ambiguity sets and Wasserstein ambiguity sets. These relaxations admit tractable primal representations without relying on sampling or semi-infinite optimization. Beyond tractability, the relaxations offer interpretability that captures the criticality of recourse decisions. … Read more

Generalized Ellipsoids

We introduce a family of symmetric convex bodies called generalized ellipsoids of degree \(d\) (GE-\(d\)s), with ellipsoids corresponding to the case of \(d=0\). Generalized ellipsoids (GEs) retain many geometric, algebraic, and algorithmic properties of ellipsoids. We show that the conditions that the parameters of a GE must satisfy can be checked in strongly polynomial time, … Read more

The Role of Level-Set Geometry on the Performance of PDHG for Conic Linear Optimization

We consider solving huge-scale instances of (convex) conic linear optimization problems, at the scale where matrix-factorization-free methods are attractive or necessary. The restarted primal-dual hybrid gradient method (rPDHG) — with heuristic enhancements and GPU implementation — has been very successful in solving huge-scale linear programming (LP) problems; however, its application to more general conic convex … Read more
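To fix ideas, here is a minimal sketch of the basic PDHG iteration on the saddle-point form of an equality-constrained LP, min_{x ≥ 0} c'x s.t. Ax = b, i.e. min_{x ≥ 0} max_y c'x + y'(b − Ax). The 1-D instance and step sizes below are illustrative assumptions; this is the plain method, not the restarted/GPU-enhanced rPDHG the abstract discusses.

```python
def pdhg_1d(c, a, b, tau=0.5, sigma=0.5, iters=5000):
    """Scalar PDHG for min_{x>=0} c*x s.t. a*x = b.

    Primal step: projected gradient step on x (projection onto x >= 0).
    Dual step:   gradient ascent on y using the extrapolated point 2*x_new - x.
    Step sizes satisfy tau * sigma * a**2 < 1, the standard PDHG condition.
    """
    x, y = 0.0, 0.0
    for _ in range(iters):
        x_new = max(0.0, x - tau * (c - a * y))    # projection onto x >= 0
        y = y + sigma * (b - a * (2 * x_new - x))  # dual ascent, extrapolated
        x = x_new
    return x, y

# Tiny instance: min x s.t. x = 1, x >= 0; optimum x* = 1 with dual y* = 1.
x_star, y_star = pdhg_1d(c=1.0, a=1.0, b=1.0)
print(round(x_star, 4), round(y_star, 4))  # both approach 1.0
```

The extrapolated primal point in the dual update is what distinguishes PDHG from plain gradient descent-ascent and yields convergence without matrix factorizations.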

A Clustering-Based Uncertainty Set for Robust Optimization

Robust optimization is an approach for handling uncertainty in optimization problems, in which the uncertainty set determines the conservativeness of the solutions. In this paper, we propose a data-driven uncertainty set using a type of volume-based clustering, which we call Minimum-Volume Norm-Based Clustering (MVNBC). MVNBC extends the concept of minimum-volume ellipsoid clustering by allowing clusters … Read more
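A highly simplified stand-in for the idea: cluster the data, then cover each cluster with a small set. The sketch below uses nearest-centroid clustering and centroid-centered Euclidean balls purely for illustration; the paper's MVNBC instead fits minimum-volume norm-based clusters, which this code does not implement.

```python
import math

def cluster_balls(points, seeds, rounds=10):
    """Nearest-centroid clustering, then the smallest centroid-centered
    Euclidean ball covering each cluster (illustrative stand-in only)."""
    centroids = [list(s) for s in seeds]
    for _ in range(rounds):
        groups = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)),
                    key=lambda j: math.dist(p, centroids[j]))
            groups[j].append(p)
        for j, g in enumerate(groups):
            if g:
                centroids[j] = [sum(c) / len(g) for c in zip(*g)]
    radii = [max(math.dist(p, centroids[j]) for p in g) if g else 0.0
             for j, g in enumerate(groups)]
    return centroids, radii

def in_uncertainty_set(u, centroids, radii):
    """Membership in the union-of-balls uncertainty set."""
    return any(math.dist(u, c) <= r for c, r in zip(centroids, radii))

data = [(0.0, 0.0), (0.2, 0.1), (0.1, -0.1), (5.0, 5.0), (5.2, 4.9)]
centers, radii = cluster_balls(data, seeds=[(0.0, 0.0), (5.0, 5.0)])
# A point between the two clusters is excluded, unlike with one big ball:
print(in_uncertainty_set((2.5, 2.5), centers, radii))  # False
```

The point of such a set is conservativeness: a union of small clusters excludes the empty region between data groups that a single enclosing set would have to cover.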

Safely Learning Dynamical Systems

A fundamental challenge in learning an unknown dynamical system is to reduce model uncertainty by making measurements while maintaining safety. In this work, we formulate a mathematical definition of what it means to safely learn a dynamical system by sequentially deciding where to initialize the next trajectory. In our framework, the state of the system … Read more

Solving Two-Trust-Region Subproblems using Semidefinite Optimization with Eigenvector Branching

Semidefinite programming (SDP) relaxations typically utilize the constraint that X-xx' is positive semidefinite (PSD) to obtain a convex relaxation of the condition X=xx', where x is an n-vector. In this paper we consider a new hyperplane branching method for SDP based on using an eigenvector of X-xx'. This branching technique is related to previous work of Saxena, … Read more
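A minimal 2x2 illustration of the slack matrix X - xx' whose PSD-ness defines the relaxation: for a symmetric 2x2 matrix, PSD-ness is equivalent to nonnegative trace and determinant. How the branching method actually uses an eigenvector of this slack is the paper's contribution and is not reproduced here.

```python
def slack_matrix(X, x):
    """Return M = X - x x' for a 2x2 symmetric X and a 2-vector x."""
    return [[X[0][0] - x[0] * x[0], X[0][1] - x[0] * x[1]],
            [X[1][0] - x[1] * x[0], X[1][1] - x[1] * x[1]]]

def is_psd_2x2(M, tol=1e-12):
    """A symmetric 2x2 matrix is PSD iff its trace and determinant are >= 0."""
    trace = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return trace >= -tol and det >= -tol

x = [1.0, 2.0]
exact = [[1.0, 2.0], [2.0, 4.0]]    # X = xx': the slack is the zero matrix
relaxed = [[2.0, 2.0], [2.0, 5.0]]  # feasible for the relaxation, X != xx'
print(is_psd_2x2(slack_matrix(exact, x)))    # True (slack = 0)
print(is_psd_2x2(slack_matrix(relaxed, x)))  # True (slack = diag(1, 1))
```

The relaxation gap is exactly the nonzero slack in the second case; eigenvectors of that slack identify directions along which the relaxation can be tightened by branching.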

An extension of the Reformulation-Linearization Technique to nonlinear optimization

We introduce a novel Reformulation-Perspectification Technique (RPT) to obtain convex approximations of nonconvex continuous optimization problems. RPT consists of two steps: a reformulation step and a perspectification step. The reformulation step generates redundant nonconvex constraints from pairwise multiplication of the existing constraints. The perspectification step then convexifies the nonconvex components by using perspective … Read more

Safely Learning Dynamical Systems from Short Trajectories

A fundamental challenge in learning to control an unknown dynamical system is to reduce model uncertainty by making measurements while maintaining safety. In this work, we formulate a mathematical definition of what it means to safely learn a dynamical system by sequentially deciding where to initialize the next trajectory. In our framework, the state of … Read more

Towards practical generic conic optimization

Many convex optimization problems can be represented through conic extended formulations with auxiliary variables and constraints using only the small number of standard cones recognized by advanced conic solvers such as MOSEK 9. Such extended formulations are often significantly larger and more complex than equivalent conic natural formulations, which can use a much broader class … Read more

Data-Driven Two-Stage Conic Optimization with Zero-One Uncertainties

We address high-dimensional zero-one random parameters in two-stage convex conic optimization problems. Such parameters typically represent failures of network elements and constitute rare, high-impact random events in several applications. Given a sparse training dataset of the parameters, we motivate and study a distributionally robust formulation of the problem using a Wasserstein ambiguity set centered at … Read more