A Data-Driven Distributionally Robust Bound on the Expected Optimal Value of Uncertain Mixed 0-1 Linear Programming

This paper studies the expected optimal value of a mixed 0-1 programming problem with uncertain objective coefficients following a joint distribution. We assume that the true distribution is not known exactly, but a set of independent samples can be observed. Using the Wasserstein metric, we construct an ambiguity set centered at the empirical distribution from …
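
As a rough illustration (the notation $\hat{\xi}_i$, $N$, $\epsilon$, $W$, $z^*$ is introduced here and is not necessarily the paper's): given samples $\hat{\xi}_1, \dots, \hat{\xi}_N$ of the objective coefficients, a Wasserstein ambiguity set of radius $\epsilon$ around the empirical distribution is

$$ \mathcal{B}_\epsilon(\hat{P}_N) = \bigl\{ P : W(P, \hat{P}_N) \le \epsilon \bigr\}, \qquad \hat{P}_N = \frac{1}{N} \sum_{i=1}^{N} \delta_{\hat{\xi}_i}, $$

and a distributionally robust bound on the expected optimal value $z^*(\xi)$ of the mixed 0-1 program then takes the form $\sup_{P \in \mathcal{B}_\epsilon(\hat{P}_N)} \mathbb{E}_P[z^*(\xi)]$.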

On Solving the Quadratic Shortest Path Problem

The quadratic shortest path problem is the problem of finding a path in a directed graph such that the sum of interaction costs over all pairs of arcs on the path is minimized. We derive several semidefinite programming relaxations for the quadratic shortest path problem with a matrix variable of order $m+1$, where $m$ is …
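
In one standard formulation (notation introduced here for illustration, not necessarily the paper's): with $x \in \{0,1\}^m$ the incidence vector of an $s$-$t$ path, linear arc costs $c$, and pairwise interaction costs $Q = (q_{ef})$, the problem reads

$$ \min_{x} \; c^\top x + x^\top Q x \quad \text{s.t.} \quad x \text{ encodes an } s\text{-}t \text{ path}, \;\; x \in \{0,1\}^m, $$

and lifting to $\binom{1}{x}\binom{1}{x}^{\!\top}$ gives a matrix variable of order $m+1$ of the kind used in the relaxations.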

A rounding procedure for semidefinite optimization

Recently, Mohammad-Nezhad and Terlaky studied the identification of the optimal partition for semidefinite optimization. An approximation of the optimal partition was obtained from a bounded sequence of solutions on, or in a neighborhood of, the central path. Here, we use the approximation of the optimal partition in a rounding procedure to generate an approximate maximally …
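
As standard background (not specific to this paper), the central path of a semidefinite optimization problem $\min\{\langle C, X \rangle : \langle A_i, X \rangle = b_i,\ X \succeq 0\}$ consists of the solutions $(X_\mu, y_\mu, S_\mu)$, for $\mu > 0$, of

$$ \langle A_i, X \rangle = b_i \;\; (i = 1, \dots, k), \qquad \sum_{i=1}^{k} y_i A_i + S = C, \qquad X S = \mu I, \qquad X, S \succ 0, $$

and the optimal partition is read off from the ranges of the limits of $X_\mu$ and $S_\mu$ as $\mu \to 0$.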

Semidefinite Programming and Nash Equilibria in Bimatrix Games

We explore the power of semidefinite programming (SDP) for finding additive $\epsilon$-approximate Nash equilibria in bimatrix games. We introduce an SDP relaxation for a quadratic programming formulation of the Nash equilibrium (NE) problem and provide a number of valid inequalities to improve the quality of the relaxation. If a rank-1 solution to this SDP is …
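
For background, the classical Mangasarian–Stone quadratic program (the paper's formulation may differ in details): for an $m \times n$ bimatrix game with payoff matrices $A$ and $B$, the Nash equilibria are exactly the optimal solutions of

$$ \max_{x, y, \alpha, \beta} \; x^\top (A + B) y - \alpha - \beta \quad \text{s.t.} \quad A y \le \alpha \mathbf{1}_m, \quad B^\top x \le \beta \mathbf{1}_n, \quad x \in \Delta_m, \; y \in \Delta_n, $$

whose optimal value is $0$; relaxing the products of decision variables is what leads to an SDP.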

SDP-based Branch-and-Bound for Non-convex Quadratic Integer Optimization

Semidefinite programming (SDP) relaxations have been intensively used for solving discrete quadratic optimization problems, in particular in the binary case. For the general non-convex integer case with box constraints, the branch-and-bound algorithm Q-MIST has been proposed [11], which is based on an extension of the well-known SDP relaxation for max-cut. For solving the resulting SDPs, Q-MIST …
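
For context, the well-known SDP relaxation for max-cut referred to here can be written (in one standard form, with graph Laplacian $L$) as

$$ \max \; \tfrac{1}{4} \langle L, X \rangle \quad \text{s.t.} \quad \operatorname{diag}(X) = \mathbf{1}, \quad X \succeq 0, $$

obtained by relaxing $X = x x^\top$ with $x \in \{-1, 1\}^n$; Q-MIST builds on an extension of this relaxation to integer variables with box constraints.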

DSOS and SDSOS Optimization: More Tractable Alternatives to Sum of Squares and Semidefinite Optimization

In recent years, optimization theory has been greatly impacted by the advent of sum of squares (SOS) optimization. The reliance of this technique on large-scale semidefinite programs, however, has limited the scale of problems to which it can be applied. In this paper, we introduce DSOS and SDSOS optimization as linear programming and second-order cone …
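
In brief, using the standard definitions from this line of work (notation introduced here, and details may differ slightly from the paper): fix a vector of monomials $z(x)$ and write

$$ p(x) = z(x)^\top Q \, z(x). $$

Then $p$ is SOS if some such $Q$ is positive semidefinite (an SDP condition), SDSOS if some such $Q$ is scaled diagonally dominant (a second-order cone condition), and DSOS if some such $Q$ is diagonally dominant, i.e. $Q_{ii} \ge \sum_{j \ne i} |Q_{ij}|$ (a linear programming condition); hence DSOS $\subseteq$ SDSOS $\subseteq$ SOS.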

Bad semidefinite programs with short proofs, and the closedness of the linear image of the semidefinite cone

Semidefinite programs (SDPs) — some of the most useful and pervasive optimization problems of the last few decades — often behave pathologically: the optimal values of the primal and dual problems may differ and may not be attained. Such SDPs are theoretically interesting and often impossible to solve. Yet, the pathological SDPs in the literature …
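
A textbook illustration of non-attainment (not taken from the paper) is

$$ \inf \; X_{11} \quad \text{s.t.} \quad X_{12} = 1, \quad X \in \mathcal{S}^2_+, $$

whose optimal value is $0$ (let $X_{22} \to \infty$) but is never attained, since $X \succeq 0$ with $X_{12} = 1$ forces $X_{11} X_{22} \ge 1$ and hence $X_{11} > 0$.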

Exact augmented Lagrangian functions for nonlinear semidefinite programming

In this paper, we study augmented Lagrangian functions with exactness properties for nonlinear semidefinite programming (NSDP) problems. The term exact is used in the sense that the penalty parameter can be chosen so that a single minimization of the augmented Lagrangian recovers a solution of the original problem. This leads to reformulations of NSDP problems …
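
One common quadratic-penalty form for a constraint $G(x) \preceq 0$ with multiplier $Y \succeq 0$ (a generic sketch; the families studied in the paper may be more general) is

$$ L_\rho(x, Y) = f(x) + \frac{1}{2\rho} \Bigl( \bigl\| \Pi_{\mathcal{S}^m_+}\bigl( Y + \rho \, G(x) \bigr) \bigr\|_F^2 - \| Y \|_F^2 \Bigr), $$

where $\Pi_{\mathcal{S}^m_+}$ denotes projection onto the positive semidefinite cone; exactness means that for some finite penalty parameter $\rho$, a single minimization of $L_\rho$ over $x$ already yields a solution of the NSDP.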

Polynomial Norms

In this paper, we study polynomial norms, i.e. norms that are the $d$th root of a degree-$d$ homogeneous polynomial $f$. We first show that a necessary and sufficient condition for $f^{1/d}$ to be a norm is for $f$ to be strictly convex, or equivalently, convex and positive definite. Though not all norms come from $d$th …
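
A quick worked example of the criterion (not from the paper): $f(x) = x_1^4 + x_2^4$ is convex and positive definite, so

$$ f^{1/4}(x) = \bigl( x_1^4 + x_2^4 \bigr)^{1/4} = \| x \|_4 $$

is a (polynomial) norm, whereas $g(x) = x_1^2 x_2^2$ is not positive definite ($g(1,0) = 0$), so $g^{1/4}(x) = \sqrt{|x_1 x_2|}$ vanishes at nonzero points and fails to be a norm.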

The Many Faces of Degeneracy in Conic Optimization

Slater’s condition — existence of a “strictly feasible solution” — is a common assumption in conic optimization. Without strict feasibility, first-order optimality conditions may be meaningless, the dual problem may yield little information about the primal, and small changes in the data may render the problem infeasible. Hence, failure of strict feasibility can negatively impact …
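
For reference, for a conic program $\min\{\langle c, x \rangle : Ax = b,\ x \in K\}$ over a closed convex cone $K$, Slater's condition asks for a point

$$ \bar{x} \;\text{ with }\; A\bar{x} = b \;\text{ and }\; \bar{x} \in \operatorname{int} K, $$

i.e. $\bar{X} \succ 0$ in the semidefinite case; the discussion here concerns what happens, and what can be done, when no such point exists.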