The Prime Programming Problem: Formulations and Solution Methods

We introduce the prime programming problem as a subclass of integer programming. These optimization models restrict feasible solutions to prime numbers. We then demonstrate how several classical problems in number theory can be formulated as prime programs. To solve such problems with a commercial optimization solver, we extend the branch-and-bound procedure of … Read more
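
As a rough illustration of what the primality restriction means (a toy instance with made-up data, not the paper's formulation or its branch-and-bound extension), the sketch below enumerates prime values of two variables for a small linear objective and constraint:

```python
# Toy "prime program": an integer program whose variables must take prime values:
#     minimize  3*x + 2*y   subject to  x + y >= 20,   x, y prime.
# For a problem this small, plain enumeration over a bounded prime range
# already exposes what the primality restriction means.

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

primes = [p for p in range(2, 50) if is_prime(p)]   # search window is an assumption

best = None
for x in primes:
    for y in primes:
        if x + y >= 20:                              # linear constraint
            value = 3 * x + 2 * y                    # linear objective
            if best is None or value < best[0]:
                best = (value, x, y)

print(best)   # -> (43, 3, 17)
```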

Sensitivity Analysis in Dantzig-Wolfe Decomposition

Dantzig-Wolfe decomposition is a classical method for solving huge linear optimization problems with a block-angular structure. The most computationally expensive step in the method is pricing: solving the block subproblems for given dual values to produce new columns. Therefore, when we want to solve a slightly perturbed problem in which the block-angular structure is preserved … Read more
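
The pricing step itself is easy to sketch. The following toy example (hypothetical single-block data; not the paper's sensitivity analysis) solves one block subproblem for given dual values and checks whether the resulting column has negative reduced cost:

```python
# Illustrative pricing step in Dantzig-Wolfe decomposition. For one block with
# feasible set {x : Dx <= d, x >= 0}, the pricing problem for duals (pi, sigma) is
#     min_x  (c - pi^T A) x  - sigma,
# and a column is added whenever the optimal value (reduced cost) is negative.
import numpy as np
from scipy.optimize import linprog

def price_block(c, A, D, d, pi, sigma):
    """Solve one block subproblem; return (reduced_cost, x) for the candidate column."""
    reduced_obj = c - A.T @ pi            # objective of the pricing LP
    res = linprog(reduced_obj, A_ub=D, b_ub=d,
                  bounds=[(0, None)] * len(c), method="highs")
    return res.fun - sigma, res.x

# Hypothetical data for a single 2-variable block and 1 coupling constraint.
c  = np.array([1.0, 2.0])                 # block objective coefficients
A  = np.array([[1.0, 1.0]])               # block's rows in the coupling constraints
D  = np.array([[2.0, 1.0]])               # block constraints D x <= d
d  = np.array([4.0])
pi = np.array([1.5])                      # duals of the coupling constraints
sigma = 0.5                               # dual of the convexity constraint

rc, x = price_block(c, A, D, d, pi, sigma)
if rc < -1e-9:
    print("add column", x, "with reduced cost", rc)
else:
    print("no improving column; reduced cost", rc)
```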

Sensitivity-based decision support for critical measures using the example of COVID-19 dynamics

We parametrize public policies in the context of the COVID-19 pandemic and evaluate their effectiveness with sensitivity-based methods, offering retrospective insight into their contributions to critical measures. The study utilizes a group-specific SEIR model with a tracing and isolation strategy and vaccination programs. Public policies are applied to … Read more
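
For intuition, a heavily simplified single-group SEIR model (without tracing, isolation, or vaccination, and with made-up parameters) already shows how the sensitivity of a critical measure with respect to a policy parameter can be computed, here by finite differences:

```python
# Minimal sketch: an SEIR system with a contact-reduction policy parameter
# u in [0, 1], and a finite-difference sensitivity of the infection peak
# with respect to u.
import numpy as np

def seir_peak(u, beta=0.4, sigma=0.2, gamma=0.1, days=300, dt=0.1):
    S, E, I, R = 0.99, 0.0, 0.01, 0.0       # hypothetical initial shares
    peak = I
    for _ in range(int(days / dt)):
        new_inf = (1.0 - u) * beta * S * I  # policy scales the contact rate
        dS = -new_inf
        dE = new_inf - sigma * E
        dI = sigma * E - gamma * I
        dR = gamma * I
        S, E, I, R = S + dt * dS, E + dt * dE, I + dt * dI, R + dt * dR
        peak = max(peak, I)
    return peak

u, eps = 0.3, 1e-4
sens = (seir_peak(u + eps) - seir_peak(u - eps)) / (2 * eps)
print("peak infected share:", seir_peak(u))
print("d(peak)/d(policy strength):", sens)   # negative: a stronger policy lowers the peak
```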

Optimality-Based Discretization Methods for the Global Optimization of Nonconvex Semi-Infinite Programs

We use sensitivity analysis to design optimality-based discretization (cutting-plane) methods for the global optimization of nonconvex semi-infinite programs (SIPs). We begin by formulating the optimal discretization of SIPs as a max-min problem and propose variants that are more computationally tractable. We then use parametric sensitivity theory to design an efficient method for solving these max-min … Read more
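
A plain discretization (cutting-plane) loop, without the optimality-based point selection studied in the paper, might look as follows on a toy SIP with hypothetical data; each discretized problem is a small LP, and new index points are added where the semi-infinite constraint is most violated:

```python
# Basic discretization loop for the toy SIP
#     min  -x1 - x2   s.t.   x1*cos(y) + x2*sin(y) <= 1  for all y in [0, pi/2],
#     x1, x2 >= 0.
import numpy as np
from scipy.optimize import linprog

c = np.array([-1.0, -1.0])
Y = [0.0, np.pi / 2]                        # initial discretization
grid = np.linspace(0.0, np.pi / 2, 2001)    # separation by dense enumeration

for it in range(30):
    A = np.array([[np.cos(y), np.sin(y)] for y in Y])
    b = np.ones(len(Y))
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)], method="highs")
    x = res.x
    viol = x[0] * np.cos(grid) + x[1] * np.sin(grid) - 1.0
    k = int(np.argmax(viol))
    if viol[k] <= 1e-5:                     # (approximately) SIP-feasible
        break
    Y.append(float(grid[k]))                # add the most violated index point

print("iterations:", it + 1, "solution:", x, "objective:", c @ x)
# The returned point approaches (1/sqrt(2), 1/sqrt(2)) with objective -sqrt(2).
```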

Strong Partitioning and a Machine Learning Approximation for Accelerating the Global Optimization of Nonconvex QCQPs

We learn optimal instance-specific heuristics for the global minimization of nonconvex quadratically-constrained quadratic programs (QCQPs). Specifically, we consider partitioning-based convex mixed-integer programming relaxations for nonconvex QCQPs and propose the novel problem of strong partitioning to optimally partition variable domains without sacrificing global optimality. Since solving this max-min strong partitioning problem exactly can be very challenging, … Read more
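
To see why partitioning tightens such relaxations, the sketch below (illustrative only; it uses plain piecewise McCormick envelopes on a toy bilinear problem, not strong partitioning or the machine learning approximation) compares relaxation bounds as the number of domain pieces grows:

```python
# Piecewise McCormick relaxation of the bilinear term w = x*y.
# We maximize x*y subject to x + y = 1.5, x, y in [0, 1]; the true optimum is
# 0.5625 at x = y = 0.75. Splitting the domain of x into more pieces tightens
# the relaxation bound.
import numpy as np
from scipy.optimize import linprog

def mccormick_bound(pieces):
    """Upper bound from the piecewise McCormick relaxation with given x-pieces."""
    best = -np.inf
    for xl, xu in pieces:
        yl, yu = 0.0, 1.0
        # Variables (x, y, w); maximize w  <=>  minimize -w.
        c = np.array([0.0, 0.0, -1.0])
        # Over-estimators:  w <= xu*y + yl*x - xu*yl,  w <= xl*y + yu*x - xl*yu
        A_ub = np.array([[-yl, -xu, 1.0],
                         [-yu, -xl, 1.0]])
        b_ub = np.array([-xu * yl, -xl * yu])
        A_eq = np.array([[1.0, 1.0, 0.0]])       # x + y = 1.5
        b_eq = np.array([1.5])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(xl, xu), (yl, yu), (None, None)], method="highs")
        if res.success:                          # infeasible pieces are skipped
            best = max(best, -res.fun)
    return best

for n in (1, 2, 4, 8):
    pieces = [(k / n, (k + 1) / n) for k in range(n)]
    print(f"{n} piece(s): relaxation bound = {mccormick_bound(pieces):.4f}")
print("true optimum = 0.5625")
```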

Fixed-Point Automatic Differentiation of Forward–Backward Splitting Algorithms for Partly Smooth Functions

A large class of non-smooth practical optimization problems can be written as the minimization of a sum of smooth and partly smooth functions. We consider such structured problems that also depend on a parameter vector and study the problem of differentiating their solution mapping with respect to the parameter, which has far-reaching applications in sensitivity … Read more
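
On the lasso, where forward-backward splitting is the classical ISTA iteration, the underlying idea can be sketched by hand: once the active support is identified (partial smoothness), the solution is locally smooth in the regularization parameter and can be differentiated via the implicit function theorem. The example below (random synthetic data; not the paper's fixed-point automatic differentiation) checks this against finite differences:

```python
# Lasso: min 0.5*||A x - b||^2 + lam*||x||_1, solved by ISTA (forward-backward
# splitting). On the support S, the solution satisfies
#     A_S^T (A_S x_S - b) + lam * sign(x_S) = 0,
# so  dx_S/dlam = -(A_S^T A_S)^{-1} sign(x_S).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)

def ista(lam, iters=5000):
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A.T @ A, 2)          # step size 1/L
    for _ in range(iters):
        g = x - t * A.T @ (A @ x - b)             # forward (gradient) step
        x = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0.0)   # backward (prox) step
    return x

lam = 1.0
x = ista(lam)
S = np.abs(x) > 1e-8                              # active support
AS = A[:, S]
dx_S = -np.linalg.solve(AS.T @ AS, np.sign(x[S])) # implicit derivative on the support

eps = 1e-4
fd = (ista(lam + eps)[S] - ista(lam - eps)[S]) / (2 * eps)
print("implicit-function derivative:", dx_S)
print("finite-difference derivative:", fd)
```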

Exponential Decay of Sensitivity in Graph-Structured Nonlinear Programs

We study solution sensitivity for nonlinear programs (NLPs) whose structure is induced by a graph $\mathcal{G}=(\mathcal{V},\mathcal{E})$. These graph-structured NLPs arise in many applications such as dynamic optimization, stochastic optimization, optimization with partial differential equations, and network optimization. We show that the sensitivity of the primal-dual solution at node $i\in \mathcal{V}$ against a data perturbation at … Read more
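
A quick numerical illustration of the decay phenomenon on a path graph (an unconstrained quadratic surrogate with made-up coupling strength, far simpler than the NLP setting of the paper):

```python
# Chain-structured quadratic model:
#     min_x  0.5 * sum_i (x_i - d_i)^2  +  0.5 * rho * sum_i (x_{i+1} - x_i)^2.
# The optimality condition is (I + rho*L) x = d with L the path-graph Laplacian,
# so the sensitivity of x to a perturbation of d at node k is the k-th column of
# (I + rho*L)^{-1}, whose entries decay exponentially with graph distance.
import numpy as np

n, rho, k = 50, 2.0, 25                      # chain length, coupling, perturbed node
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1.0                    # path-graph Laplacian
M = np.eye(n) + rho * L

sens = np.linalg.solve(M, np.eye(n)[:, k])   # dx/dd_k
for dist in (0, 2, 4, 8, 16):
    print(f"graph distance {dist:2d}: |dx_i/dd_k| = {abs(sens[k + dist]):.2e}")
# The magnitudes drop by a roughly constant factor per unit of graph distance.
```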

Parametric analysis of conic linear optimization

This paper focuses on the parametric analysis of a conic linear optimization problem with respect to the perturbation of the objective function along many fixed directions. We introduce the concept of primal and dual conic linear inequality representable sets, which is very helpful for relating the parametric conic linear optimization problems … Read more
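
In the linear programming special case, the effect of perturbing the objective along a single fixed direction is easy to trace numerically (a sketch with hypothetical data; the paper treats general conic problems and many directions):

```python
# Trace how the optimal value and optimal solution of
#     min (c + t*dc)^T x   over a fixed polyhedron
# change with the perturbation parameter t.
import numpy as np
from scipy.optimize import linprog

c  = np.array([1.0, 2.0])
dc = np.array([1.0, -1.0])                  # fixed perturbation direction
A_ub = np.array([[-1.0, -1.0]])             # x1 + x2 >= 1
b_ub = np.array([-1.0])
bounds = [(0.0, 2.0), (0.0, 2.0)]

for t in np.linspace(-2.0, 2.0, 9):
    res = linprog(c + t * dc, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print(f"t = {t:5.2f}   optimal value = {res.fun:7.3f}   x* = {res.x}")
# The optimal value is piecewise linear and concave in t, and x* jumps at the
# breakpoints where the optimal basis changes.
```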

Envelope Theorems for Multi-Stage Linear Stochastic Optimization

We propose a method to compute derivatives of multi-stage linear stochastic optimization problems with respect to parameters that influence the problem’s data. Our results are based on classical envelope theorems, and can be used in problems directly solved via their deterministic equivalents as well as in stochastic dual dynamic programming for which the derivatives of … Read more
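
A minimal sanity check of the envelope idea on a tiny two-stage problem solved via its deterministic equivalent (made-up demands and costs; none of the paper's SDDP machinery): the derivative of the optimal value with respect to the first-stage unit cost should equal the optimal first-stage decision.

```python
# First-stage decision x is bought at unit cost p; scenario s has demand d_s,
# and any shortfall y_s = (d_s - x)^+ is covered at recourse cost q.
# The envelope theorem gives dV/dp = x*(p); we verify by finite differences.
import numpy as np
from scipy.optimize import linprog

probs = np.array([0.5, 0.5])
d = np.array([4.0, 10.0])                   # scenario demands (hypothetical data)
q = 3.0                                     # recourse unit cost

def value_and_x(p):
    # variables z = (x, y_1, y_2); minimize p*x + sum_s probs[s]*q*y_s
    c = np.array([p, probs[0] * q, probs[1] * q])
    # constraints  x + y_s >= d_s   <=>   -x - y_s <= -d_s
    A_ub = np.array([[-1.0, -1.0, 0.0],
                     [-1.0, 0.0, -1.0]])
    res = linprog(c, A_ub=A_ub, b_ub=-d, bounds=[(0, None)] * 3, method="highs")
    return res.fun, res.x[0]

p, eps = 1.0, 1e-5
V, x_opt = value_and_x(p)
dV_fd = (value_and_x(p + eps)[0] - value_and_x(p - eps)[0]) / (2 * eps)
print("optimal first-stage decision x*:", x_opt)
print("envelope derivative dV/dp = x* :", x_opt)
print("finite-difference  dV/dp      :", dV_fd)
```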

Sensitivity Analysis for Nonlinear Programming in CasADi

We present an extension of the CasADi numerical optimization framework that allows arbitrary-order NLP sensitivities to be calculated automatically and efficiently. The approach, which can be used together with any NLP solver available in CasADi, is based on a sparse QR factorization and an implementation of a primal-dual active set method. The whole toolchain … Read more
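
The generic first-order idea behind such NLP sensitivities can be sketched without CasADi: apply the implicit function theorem to the KKT conditions. The example below uses plain NumPy on a tiny equality-constrained problem and is not CasADi's API or the toolchain described in the abstract:

```python
# For the equality-constrained problem
#     min_x (x1 - p)^2 + x2^2   s.t.   x1 + x2 = 1,
# the KKT conditions F(z; p) = 0 in z = (x1, x2, lam) are linear, and the
# implicit function theorem gives dz/dp = -(dF/dz)^{-1} dF/dp.
import numpy as np

K = np.array([[2.0, 0.0, 1.0],              # dF/dz (KKT matrix)
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])

def solve_kkt(p):
    # F(z; p) = [ 2*(x1 - p) + lam,  2*x2 + lam,  x1 + x2 - 1 ] = 0
    rhs = np.array([2.0 * p, 0.0, 1.0])
    return np.linalg.solve(K, rhs)          # primal-dual solution (x1, x2, lam)

p = 0.3
z = solve_kkt(p)
dF_dp = np.array([-2.0, 0.0, 0.0])          # derivative of F with respect to p
dz_dp = -np.linalg.solve(K, dF_dp)

eps = 1e-6
fd = (solve_kkt(p + eps) - solve_kkt(p - eps)) / (2 * eps)
print("primal-dual solution:", z)
print("KKT-based sensitivity dz/dp:", dz_dp)
print("finite-difference check    :", fd)
```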