OPM, a collection of Optimization Problems in Matlab

OPM is a small collection of CUTEst unconstrained and bound-constrained nonlinear optimization problems, which can be used in Matlab for testing optimization algorithms directly (i.e. without installing additional software).
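
As an illustrative sketch (generic notation, not taken from the collection's documentation), the problems in such a set have the bound-constrained form
\[ \min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad \ell \le x \le u, \]
with the unconstrained case recovered when $\ell = -\infty$ and $u = +\infty$ componentwise.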

The Value of Robust Assortment Optimization Under Ranking-based Choice Models

We study a class of robust assortment optimization problems that was proposed by Farias, Jagabathula, and Shah (2013). The goal in these problems is to find an assortment that maximizes a firm’s worst-case expected revenue under all ranking-based choice models that are consistent with the historical sales data generated by the firm’s past assortments. We …
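
As a rough sketch of this type of formulation (all notation assumed here, not taken from the paper): with $\mathcal{S}$ the set of feasible assortments, $r_j$ the revenue of product $j$, and $\mathcal{U}$ the set of distributions $\lambda$ over preference rankings consistent with the historical sales data, the robust problem reads
\[ \max_{S \in \mathcal{S}} \; \min_{\lambda \in \mathcal{U}} \; \sum_{\sigma} \lambda_\sigma \, r_{j(\sigma, S)}, \]
where $j(\sigma, S)$ denotes the product that a customer with ranking $\sigma$ purchases from assortment $S$ (with revenue $0$ if the no-purchase option is preferred).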

A Finitely Convergent Cutting Plane, and a Bender’s Decomposition Algorithm for Mixed-Integer Convex and Two-Stage Convex Programs using Cutting Planes

We consider a general mixed-integer convex program. We first develop an algorithm for solving this problem, and show its finite convergence. We then develop a finitely convergent decomposition algorithm that separates binary variables from integer and continuous variables. The integer and continuous variables are treated as second-stage variables. An oracle for generating a parametric …
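
A generic two-stage sketch of such a decomposition (notation assumed here for illustration, not the paper's): binary variables $y$ form the first stage and the remaining integer and continuous variables $x$ the second stage,
\[ \min_{y \in \{0,1\}^p} \; c^\top y + Q(y), \qquad Q(y) \;=\; \min_{x \in X(y)} \; f(x, y), \]
where the value function $Q$ is approximated from below by cutting planes supplied by an oracle for the second-stage problem.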

Robust Concave Utility Maximization over Chance Constraints

This paper first studies an expected utility problem with chance constraints and incomplete information on a decision maker’s utility function. The model maximizes the worst-case expected utility of a random outcome over a set of concave functions within a novel ambiguity set, while the underlying probability distribution is known. To obtain computationally tractable formulations, we employ …
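
In generic form (symbols assumed here for illustration), such a model can be written with a known distribution for the randomness $\xi$, an ambiguity set $\mathcal{U}$ of concave utility functions, and a chance constraint at level $1-\epsilon$:
\[ \max_{x \in X} \; \min_{u \in \mathcal{U}} \; \mathbb{E}\big[ u(h(x, \xi)) \big] \quad \text{subject to} \quad \mathbb{P}\big( g(x, \xi) \le 0 \big) \ge 1 - \epsilon. \]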

Tractable Robust Supervised Learning Models

At the heart of supervised learning is a minimization problem whose objective combines a loss function, which penalizes poor fitting of the training data, with a regularization function, which penalizes over-fitting to the training data. More recently, data-driven robust-optimization-based learning models have provided an intuitive robustness perspective on regularization. …
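
Schematically (notation assumed for illustration), with training data $(x_i, y_i)_{i=1}^N$, a loss $\ell$, a regularizer $R$, and a trade-off weight $\lambda \ge 0$, this minimization problem takes the form
\[ \min_{w} \; \frac{1}{N} \sum_{i=1}^{N} \ell\big( f_w(x_i), y_i \big) \;+\; \lambda \, R(w). \]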

Lower bound on size of branch-and-bound trees for solving lot-sizing problem

We show that there exists a family of instances of the lot-sizing problem such that any branch-and-bound tree that solves them requires an exponential number of nodes, even when the branchings are performed on general split disjunctions.
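
For reference, a standard uncapacitated lot-sizing formulation over $T$ periods (not necessarily the specific family of instances constructed in the paper) is
\[ \min \sum_{t=1}^{T} \big( p_t x_t + h_t s_t + f_t y_t \big) \quad \text{s.t.} \quad s_{t-1} + x_t = d_t + s_t, \;\; x_t \le M y_t, \;\; x_t, s_t \ge 0, \;\; y_t \in \{0,1\}, \]
where $x_t$ is the production quantity, $s_t$ the inventory, $y_t$ the setup decision, and $d_t$ the demand in period $t$; branching may be performed on the binary setup variables or, more generally, on split disjunctions.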

Some Strongly Polynomially Solvable Convex Quadratic Programs with Bounded Variables

This paper begins with a class of convex quadratic programs (QPs) with bounded variables that is solvable by the parametric principal pivoting algorithm with $\mbox{O}(n^3)$ strongly polynomial complexity, where $n$ is the number of variables of the problem. An extension of the Hessian class is also discussed. Our research is motivated by a recent reference [7] wherein the …
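
A generic convex QP with bounded variables (the paper restricts attention to a particular class of Hessians) reads
\[ \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\, x^\top Q x + q^\top x \quad \text{subject to} \quad \ell \le x \le u, \]
with $Q$ symmetric positive semidefinite; the notation $q$, $\ell$, $u$ is assumed here for illustration.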

Facets of the Total Matching Polytope for bipartite graphs

The Total Matching Polytope generalizes the Stable Set Polytope and the Matching Polytope. In this paper, we give the perfect formulation for trees and derive two new families of valid inequalities: the balanced biclique inequalities, which are always facet-defining, and the non-balanced lifted biclique inequalities, obtained by a lifting procedure, which are facet-defining for …
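
For background (a standard definition, stated here for context rather than quoted from the paper): a total matching of a graph $G=(V,E)$ is a set $T \subseteq V \cup E$ whose elements are pairwise non-adjacent and non-incident, and the Total Matching Polytope is the convex hull of the corresponding incidence vectors,
\[ P_{TM}(G) = \mathrm{conv}\big\{ (x, y) \in \{0,1\}^{V} \times \{0,1\}^{E} : (x, y) \text{ is the incidence vector of a total matching of } G \big\}; \]
fixing $y = 0$ recovers stable sets and fixing $x = 0$ recovers matchings, which is the sense in which it generalizes both polytopes.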

Convergence Analysis of Block Majorize-Minimize Subspace Approaches

Majorization-Minimization (MM) is a class of efficient and effective optimization algorithms that benefit from solid theoretical foundations. MM methods have shown a great ability to efficiently tackle challenging optimization problems from signal processing, image processing, inverse problems, and machine learning. When processing large amounts of data or variables, as may happen in 3D image processing, …
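
As a brief reminder of the underlying principle (generic notation, not the paper's): at iterate $x^k$, MM builds a surrogate $h(\cdot \mid x^k)$ that majorizes the objective $f$ and is tight at $x^k$, and the next iterate minimizes the surrogate; in subspace variants, the update is restricted to a subspace spanned by a matrix of directions $D_k$,
\[ h(x \mid x^k) \ge f(x) \;\; \forall x, \qquad h(x^k \mid x^k) = f(x^k), \qquad x^{k+1} = x^k + D_k u^k, \;\; u^k \in \arg\min_{u} h(x^k + D_k u \mid x^k). \]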

Quantitative Statistical Robustness in Distributionally Robust Optimization Models

In distributionally robust optimization (DRO) models, sample data of the underlying exogenous uncertainty parameters are often used to construct an ambiguity set of plausible probability distributions. It is common to assume that the sample data do not contain noise. This assumption may not hold in some data-driven problems where the perceived data are potentially …
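
In generic form (notation assumed here), a DRO model built from samples $\xi_1, \dots, \xi_N$ is
\[ \min_{x \in X} \; \sup_{P \in \mathcal{P}_N} \; \mathbb{E}_{P}\big[ f(x, \xi) \big], \]
where the ambiguity set $\mathcal{P}_N$ (e.g. moment-based or a Wasserstein ball) is constructed from the sample data; quantitative statistical robustness then asks how the optimal value behaves when those samples are perturbed by noise.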