Inverse Optimization with Discrete Decisions

Inverse optimization (IO) has emerged as a powerful framework for inferring the parameters of prescriptive models that rationalize observed or prescribed decisions. Despite the prevalence of discrete decision-making models, existing work has primarily focused on continuous, convex models, for which the corresponding IO problems are far easier to solve. This paper makes three contributions that broaden …
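
In the discrete setting, the core question can be sketched directly: with a finite feasible set $X$ and an observed decision $x^{obs}$, a cost vector $c$ rationalizes $x^{obs}$ (for minimization) iff $c^\top x^{obs} \le c^\top x$ for all $x \in X$. A toy membership check, with illustrative data only:

```python
import numpy as np

# Toy discrete inverse optimization: the feasible set X is finite, and the
# observed decision x_obs is claimed optimal for min c^T x over X.
# A cost vector c "rationalizes" x_obs iff c^T x_obs <= c^T x for all x in X.
X = [np.array(v) for v in ([0, 0], [1, 0], [0, 1], [1, 1])]
x_obs = np.array([1, 0])

def rationalizes(c):
    return all(c @ x_obs <= c @ x + 1e-9 for x in X)

print(rationalizes(np.array([-1.0, 2.0])))  # True: x_obs minimizes c^T x
print(rationalizes(np.array([1.0, 1.0])))   # False: (0, 0) is cheaper
```

A full IO method would go further and pick, among all rationalizing vectors, one closest to a prior estimate; the check above is only the feasibility test at the heart of that problem.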

Sensitivity analysis for parametric nonlinear programming: A tutorial

This tutorial provides an overview of the current state of the art in sensitivity analysis for nonlinear programming. Building on the fundamental work of Fiacco, it derives the sensitivity of primal-dual solutions for regular nonlinear programs and explores the extent to which Fiacco’s framework can be extended to degenerate nonlinear programs with non-unique dual solutions. The survey …
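
Fiacco's classical result can be illustrated on a tiny equality-constrained program: under suitable regularity, the primal-dual solution $s(p)$ is differentiable in the parameter $p$, and its derivative solves a linear system built from the Jacobian of the KKT conditions, $\mathrm{d}s/\mathrm{d}p = -J^{-1}\,\partial F/\partial p$. A minimal numerical sketch on a toy problem, not taken from the tutorial:

```python
import numpy as np

# Parametric QP:  min 0.5*(x1^2 + x2^2)   s.t.  x1 + x2 = p
# KKT conditions F(x, lam; p) = 0:
#   x1 + lam = 0,   x2 + lam = 0,   x1 + x2 - p = 0
# Fiacco: ds/dp = -J^{-1} dF/dp, with J the KKT Jacobian in s = (x1, x2, lam).

def kkt_sensitivity(p):
    J = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 1.0, 0.0]])      # d F / d (x1, x2, lam)
    s = np.linalg.solve(J, np.array([0.0, 0.0, p]))  # primal-dual solution
    dF_dp = np.array([0.0, 0.0, -1.0])   # d F / d p
    ds_dp = np.linalg.solve(J, -dF_dp)   # sensitivity of (x1, x2, lam)
    return s, ds_dp

s, ds = kkt_sensitivity(p=2.0)
print(s)   # x1 = x2 = 1, lam = -1
print(ds)  # dx1/dp = dx2/dp = 0.5, dlam/dp = -0.5
```

The analytic solution $x_1 = x_2 = p/2$, $\lambda = -p/2$ confirms the computed sensitivities.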

Coherent Local Explanations for Mathematical Optimization

The surge of explainable artificial intelligence methods seeks to enhance transparency and explainability in machine learning models. At the same time, there is a growing demand for explaining decisions made by the complex algorithms used in mathematical optimization. However, current explanation methods do not take the structure of the underlying optimization problem into account, leading to …

The Prime Programming Problem: Formulations and Solution Methods

We introduce the prime programming problem as a subclass of integer programming. These optimization models restrict feasible solutions to prime numbers. We then demonstrate how several classical problems in number theory can be formulated as prime programs. To solve such problems with a commercial optimization solver, we extend the branch-and-bound procedure of …
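
The primality restriction can be sketched on the simplest conceivable prime program, $\min\{x : x \ge L,\ x \text{ prime}\}$: start from the relaxation bound $\lceil L \rceil$ and cut off non-prime candidates until a prime is reached. This toy walk-up loosely mimics how a branch-and-bound would fathom non-prime solutions; it is an illustration, not the paper's procedure:

```python
import math

# Toy "prime program": minimize x subject to x >= L and x prime.

def is_prime(n):
    """Trial-division primality test (fine for small illustrative instances)."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

def min_prime_at_least(L):
    x = math.ceil(L)          # bound from the continuous relaxation
    while not is_prime(x):    # cut off infeasible (non-prime) integers
        x += 1
    return x

print(min_prime_at_least(90.5))  # 97 (91 = 7*13, 93 = 3*31, 95 = 5*19, ...)
```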

Sensitivity Analysis in Dantzig-Wolfe Decomposition

Dantzig-Wolfe decomposition is a well-known classical method for solving huge linear optimization problems with a block-angular structure. The most computationally expensive step in the method is pricing: solving the block subproblems for a given dual vector to produce new columns. Therefore, when we want to solve a slightly perturbed problem in which the block-angular structure is preserved …
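
The pricing step can be sketched on a toy block: given duals $\pi$ for the coupling constraints and $\sigma$ for the block's convexity row, a column $x$ prices out when its reduced cost $(c - \pi^\top A)x - \sigma$ is negative. In this sketch the block subproblem is solved by enumerating the block's extreme points; a real implementation would call an LP/MIP solver, and all data here are made up:

```python
import numpy as np

# Dantzig-Wolfe pricing sketch: find the most negative reduced-cost column
# of one block, or report that no column prices out.

def price_block(c, A, vertices, pi, sigma):
    best_x, best_rc = None, 0.0
    for x in vertices:
        rc = (c - A.T @ pi) @ x - sigma   # reduced cost of column x
        if rc < best_rc - 1e-9:
            best_x, best_rc = x, rc
    return best_x, best_rc

c = np.array([1.0, 2.0])                  # block objective coefficients
A = np.array([[1.0, 1.0]])                # block's rows in the coupling constraints
vertices = [np.array(v) for v in ([0, 0], [1, 0], [0, 1])]
pi = np.array([3.0])                      # duals of coupling constraints
sigma = 0.5                               # dual of the block's convexity row
col, rc = price_block(c, A, vertices, pi, sigma)
print(col, rc)                            # vertex (1, 0) with reduced cost -2.5
```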

Sensitivity-based decision support for critical measures using the example of COVID-19 dynamics

We parametrize public policies in the context of the COVID-19 pandemic and evaluate their effectiveness through sensitivity-based methods, offering retrospective insight into how individual measures contribute to critical outcomes. The study utilizes a group-specific SEIR model with a tracing-and-isolation strategy and vaccination programs. Public policies are applied to …
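
The flavor of such an analysis can be sketched with a minimal single-group SEIR model integrated by forward Euler, where a hypothetical policy parameter u scales the contact rate; the paper's group-specific model with tracing, isolation, and vaccination is far richer. All rates below are illustrative:

```python
# Minimal single-group SEIR sketch; the policy parameter u in [0, 1]
# scales down the contact rate beta.
def seir(u, beta=0.3, sigma=0.2, gamma=0.1, days=200, dt=0.5, e0=1e-3):
    S, E, I, R = 1.0 - e0, e0, 0.0, 0.0
    peak_I = 0.0
    for _ in range(int(days / dt)):
        new_inf = (1.0 - u) * beta * S * I    # policy u cuts new infections
        dS = -new_inf
        dE = new_inf - sigma * E
        dI = sigma * E - gamma * I
        dR = gamma * I
        S, E, I, R = S + dt*dS, E + dt*dE, I + dt*dI, R + dt*dR
        peak_I = max(peak_I, I)
    return peak_I

# Sensitivity of the infection peak to policy strength (finite difference)
h = 1e-4
dpeak_du = (seir(0.3 + h) - seir(0.3 - h)) / (2 * h)
print(seir(0.0), seir(0.3), dpeak_du)   # peak shrinks as u grows, so dpeak_du < 0
```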

Bounding-Focused Discretization Methods for the Global Optimization of Nonconvex Semi-Infinite Programs

We use sensitivity analysis to design bounding-focused discretization (cutting-surface) methods for the global optimization of nonconvex semi-infinite programs (SIPs). We begin by formulating the optimal bounding-focused discretization of SIPs as a max-min problem and propose variants that are more computationally tractable. We then use parametric sensitivity theory to design an effective heuristic approach for solving …
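
The basic discretization idea can be sketched on a toy one-dimensional SIP: a finite subset of the index set yields a lower-bounding relaxation, and each iteration adds a most-violated index point. Here the separation step is a grid search and the instance is made up; the paper's methods choose discretization points far more carefully:

```python
# Discretization (cutting-surface) sketch for a toy SIP:
#   min x   s.t.   x >= t - t^2   for all t in [0, 1].
# The true optimum is max_t (t - t^2) = 0.25, attained at t = 0.5.

def solve_sip(tol=1e-6, max_iter=50):
    g = lambda t: t - t * t
    T = [0.0, 1.0]                       # initial discretization
    grid = [i / 10000 for i in range(10001)]
    for _ in range(max_iter):
        x = max(g(t) for t in T)         # lower-bounding discretized problem
        t_worst = max(grid, key=lambda t: g(t) - x)
        if g(t_worst) - x <= tol:        # no significantly violated index left
            return x, T
        T.append(t_worst)                # add the most violated scenario
    return x, T

x, T = solve_sip()
print(x)   # 0.25
```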

Strong Partitioning and a Machine Learning Approximation for Accelerating the Global Optimization of Nonconvex QCQPs

We learn optimal instance-specific heuristics for the global minimization of nonconvex quadratically-constrained quadratic programs (QCQPs). Specifically, we consider partitioning-based convex mixed-integer programming relaxations for nonconvex QCQPs and propose the novel problem of strong partitioning to optimally partition variable domains without sacrificing global optimality. Since solving this max-min strong partitioning problem exactly can be very challenging, …

Fixed-Point Automatic Differentiation of Forward–Backward Splitting Algorithms for Partly Smooth Functions

A large class of non-smooth practical optimization problems can be written as the minimization of a sum of smooth and partly smooth functions. We consider such structured problems that also depend on a parameter vector and study the problem of differentiating the solution mapping with respect to that parameter, which has far-reaching applications in sensitivity …
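
The underlying idea of differentiating a solution mapping defined implicitly by a fixed-point equation can be sketched in the scalar case via the implicit function theorem: if $x^\star = T(x^\star, p)$, then $\mathrm{d}x^\star/\mathrm{d}p = (\partial T/\partial p)\,/\,(1 - \partial T/\partial x)$ at the fixed point. A toy example using the Babylonian square-root iteration, not the paper's forward-backward setting:

```python
# Fixed-point map T(x, p) = 0.5*(x + p/x), whose fixed point is sqrt(p).

def fixed_point(p, x0=1.0, iters=50):
    x = x0
    for _ in range(iters):
        x = 0.5 * (x + p / x)
    return x

def sensitivity(p):
    """d sqrt(p)/dp via the implicit function theorem at the fixed point."""
    x = fixed_point(p)
    dT_dx = 0.5 * (1.0 - p / x**2)   # partial of T in x (zero at the fixed point)
    dT_dp = 0.5 / x                  # partial of T in p
    return dT_dp / (1.0 - dT_dx)

p = 4.0
print(fixed_point(p), sensitivity(p))   # 2.0 and 0.25, matching 1/(2*sqrt(p))
```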

Exponential Decay of Sensitivity in Graph-Structured Nonlinear Programs

We study solution sensitivity for nonlinear programs (NLPs) whose structure is induced by a graph $\mathcal{G}=(\mathcal{V},\mathcal{E})$. These graph-structured NLPs arise in many applications such as dynamic optimization, stochastic optimization, optimization with partial differential equations, and network optimization. We show that the sensitivity of the primal-dual solution at node $i\in \mathcal{V}$ against a data perturbation at …
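
The decay phenomenon can be illustrated on a chain graph: for the quadratic program $\min_x \tfrac12\|x-d\|^2 + \tfrac{\rho}{2}\sum_i (x_{i+1}-x_i)^2$, the solution satisfies $(I+\rho L)x = d$ with $L$ the chain Laplacian, and the response of $x_i$ to a perturbation of $d_0$ is the $(i,0)$ entry of $(I+\rho L)^{-1}$, which shrinks geometrically with the graph distance $i$. This is a toy illustration, not the paper's general NLP setting:

```python
import numpy as np

# Chain-structured QP sensitivity: dx/dd = (I + rho*L)^{-1}, whose entries
# decay exponentially away from the diagonal (graph distance on the chain).
n, rho = 20, 1.0
L = np.diag([1.0] + [2.0] * (n - 2) + [1.0]) \
  - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
S = np.linalg.inv(np.eye(n) + rho * L)   # sensitivity matrix dx/dd
decay = np.abs(S[:, 0])                  # response along the chain to d[0]
ratios = decay[1:] / decay[:-1]          # per-step decay factors, all < 1
print(decay[:5], ratios.max())
```

For this instance the interior decay factor is the root of $r + 1/r = 1 + 2\rho$ less than one (about 0.38 for $\rho = 1$), so the influence of the perturbation is negligible a few nodes away.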