On parametric formulations for the Asymmetric Traveling Salesman Problem

The traveling salesman problem is a widely studied classical combinatorial optimization problem for which several integer linear programming formulations exist. In this work, we consider the Miller-Tucker-Zemlin (MTZ), Desrochers-Laporte (DL) and Single Commodity Flow (SCF) formulations. We argue that the choice of some parameters of these formulations is arbitrary and, therefore, there are families of formulations … Read more
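
For context (standard background, not the paper's specific parametric family), the MTZ formulation eliminates subtours with ordering variables \(u_i\) alongside the binary arc variables \(x_{ij}\):

\[
u_i - u_j + n\,x_{ij} \;\le\; n-1, \qquad 2 \le i \ne j \le n, \qquad 1 \le u_i \le n-1 .
\]

The coefficient \(n\) and the right-hand side \(n-1\) can be replaced by other values without destroying validity; coefficient choices of this kind are presumably the sort of parameter whose variation generates families of formulations.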

When Does Primal Interior Point Method Beat Primal-dual in Linear Optimization?

The primal-dual interior point method (IPM) is widely regarded as the most efficient IPM variant for linear optimization. In this paper, we demonstrate that the improved stability of the pure primal IPM can allow speedups relative to a primal-dual solver, particularly as the IPM approaches convergence. The stability of the primal scaling matrix makes it … Read more
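
To make the comparison concrete, here is a standard-background sketch (not necessarily the exact systems used in the paper) of the normal equations solved per iteration for the LP \(\min\{c^{\top}x : Ax = b,\ x \ge 0\}\), with \(X=\mathrm{diag}(x)\) and \(S=\mathrm{diag}(s)\):

\[
\text{primal-dual:}\quad A\,(XS^{-1})\,A^{\top}\Delta y = r,
\qquad\qquad
\text{primal log-barrier (parameter }\mu\text{):}\quad A\,(\mu^{-1}X^{2})\,A^{\top}\Delta y = r' .
\]

As the iterates approach complementarity, the entries of \(XS^{-1}\) typically split towards \(0\) and \(\infty\), whereas \(X^{2}\) depends only on the primal iterate; this difference is the kind of scaling-matrix stability the abstract alludes to.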

Sensitivity analysis for linear changes of the constraint matrix of a linear program

Understanding the variation of the optimal value with respect to changes in the data is an old problem of mathematical optimisation. This paper focuses on the linear problem \(f(\lambda) = \min\, c^{\top}x\) such that \((A+\lambda D)x \le b\), where \(\lambda\) is an unknown parameter that varies within an interval and \(D\) is a matrix modifying the … Read more
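
As a minimal numerical sketch of the object being studied (not the paper's analysis), one can trace \(f(\lambda)\) on a grid with an off-the-shelf LP solver; the instance below is made up for illustration and adds bounds \(x \ge 0\) to keep it bounded:

```python
import numpy as np
from scipy.optimize import linprog

# Toy illustration (not the paper's analysis): evaluate
#   f(lambda) = min c^T x  s.t.  (A + lambda*D) x <= b,  x >= 0,
# on a grid of lambda values. The data below are made up.
c = np.array([-1.0, -2.0])                 # minimize -x1 - 2*x2
A = np.array([[1.0, 1.0],
              [1.0, 3.0]])
D = np.array([[0.0, 0.0],
              [0.0, 1.0]])                 # lambda perturbs a single entry of A
b = np.array([4.0, 6.0])

for lam in np.linspace(0.0, 2.0, 5):
    res = linprog(c, A_ub=A + lam * D, b_ub=b)   # default bounds give x >= 0
    if res.success:
        print(f"lambda = {lam:.2f}   f(lambda) = {res.fun:.4f}")
    else:
        print(f"lambda = {lam:.2f}   LP infeasible or unbounded")
```

Already on this toy instance \(f(\lambda)\) is a rational rather than piecewise-linear function of \(\lambda\), which hints at why perturbing the constraint matrix is harder to analyse than perturbing \(b\) or \(c\).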

Accessible Theoretical Complexity of the Restarted Primal-Dual Hybrid Gradient Method for Linear Programs with Unique Optima

The restarted primal-dual hybrid gradient method (rPDHG) has recently emerged as an important tool for solving large-scale linear programs (LPs). For LPs with unique optima, we present an iteration bound of \(\widetilde{O}\left(\kappa\Phi\cdot\ln\left(\frac{\|w^*\|}{\varepsilon}\right)\right)\), where \(\varepsilon\) is the target tolerance, \(\kappa\) is the standard matrix condition number, \(\|w^*\|\) is the norm of the optimal solution, and \(\Phi\) … Read more
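
For readers new to rPDHG, the underlying (unrestarted) PDHG iteration for the saddle-point form \(\min_{x\ge 0}\max_{y}\; c^{\top}x + y^{\top}(b - Ax)\) of the LP \(\min\{c^{\top}x : Ax = b,\ x \ge 0\}\) is, in the form commonly used in this literature (the restart scheme and the definition of \(\Phi\) are the paper's own):

\[
x^{k+1} = \Pi_{\ge 0}\!\big(x^{k} - \tau\,(c - A^{\top}y^{k})\big), \qquad
y^{k+1} = y^{k} + \sigma\,\big(b - A(2x^{k+1} - x^{k})\big), \qquad \tau\sigma\,\|A\|^{2} \le 1 .
\]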

Immunity to Increasing Condition Numbers of Linear Superiorization versus Linear Programming

Given a family of linear constraints and a linear objective function, one can consider whether to apply a Linear Programming (LP) algorithm or a Linear Superiorization (LinSup) algorithm to this data. In the LP methodology one aims at finding a point that fulfills the constraints and has the minimal value of the objective function … Read more
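
As hedged background (one generic statement of the superiorization scheme, not necessarily the exact LinSup variant compared here), a superiorized iteration interleaves a feasibility-seeking operator \(\mathcal{A}\) (e.g., a sweep of projections onto the linear constraints) with summable objective-reducing perturbations; for a linear objective \(c^{\top}x\) this reads

\[
x^{k+1} = \mathcal{A}\big(x^{k} - \beta_{k}\,c/\|c\|\big), \qquad \beta_{k} > 0, \quad \sum_{k}\beta_{k} < \infty .
\]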

The Role of Level-Set Geometry on the Performance of PDHG for Conic Linear Optimization

We consider solving huge-scale instances of (convex) conic linear optimization problems, at the scale where matrix-factorization-free methods are attractive or necessary. The restarted primal-dual hybrid gradient method (rPDHG), with heuristic enhancements and a GPU implementation, has been very successful in solving huge-scale linear programming (LP) problems; however, its application to more general conic convex … Read more
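
For concreteness (standard background rather than the paper's setup), the conic problems in question have the form

\[
\min_{x}\; c^{\top}x \quad \text{s.t.}\ \; Ax = b,\ x \in K,
\qquad\text{with saddle-point form}\qquad
\min_{x \in K}\;\max_{y}\; c^{\top}x + y^{\top}(b - Ax),
\]

so the PDHG step differs from the LP case essentially by replacing the projection onto \(\{x \ge 0\}\) with the projection onto the cone \(K\).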

On the integrality gap of the Complete Metric Steiner Tree Problem via a novel formulation

In this work, we study the metric Steiner Tree problem on graphs, focusing on computing lower bounds for the integrality gap of the bi-directed cut (DCUT) formulation and on introducing a novel formulation, the Complete Metric (CM) model, specifically designed to address the weakness of the DCUT formulation on metric instances. A key contribution of our … Read more
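
For reference, the bi-directed cut relaxation they start from (the CM model is the paper's contribution and is not reproduced here) bidirects each edge, fixes a root terminal \(r\), and requires one unit of incoming capacity on every cut separating a terminal from the root:

\[
\min \sum_{a} c_{a} x_{a} \quad \text{s.t.}\ \; \sum_{a \in \delta^{-}(S)} x_{a} \ge 1 \ \ \forall\, S \subseteq V\setminus\{r\},\ S \cap T \ne \emptyset, \qquad x \ge 0,
\]

and the integrality gap is the worst-case ratio between the optimal integral and fractional values over the instance class.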

Counterfactual Explanations for Linear Optimization

The concept of counterfactual explanations (CE) has emerged as an important tool for understanding the inner workings of complex AI systems. In this paper, we translate the idea of CEs to linear optimization and propose, motivate, and analyze three different types of CEs: strong, weak, and relative. While deriving strong and weak CEs … Read more
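
One generic way to make this concrete (the paper's strong, weak, and relative notions are its own precise definitions, not reproduced here): given \(\min\{c^{\top}x : Ax \le b\}\), a counterfactual explanation can be posed as a smallest data change under which a prescribed alternative \(\hat{x}\) becomes optimal, e.g.

\[
\min_{\delta}\;\|\delta\| \quad \text{s.t.}\ \; \hat{x} \in \arg\min\{(c+\delta)^{\top}x : Ax \le b\},
\]

where the optimality condition can be rewritten via LP duality and complementary slackness.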

Sensitivity Analysis in Dantzig-Wolfe Decomposition

Dantzig-Wolfe decomposition is a well-known classical method for solving huge linear optimization problems with a block-angular structure. The most computationally expensive step of the method is pricing: solving the block subproblems, for given values of the dual variables, to produce new columns. Therefore, when we want to solve a slightly perturbed problem in which the block-angular structure is preserved … Read more
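
As background (standard column generation, not the paper's sensitivity results), for a block-angular problem with linking constraints \(\sum_{k} A_{k}x_{k} = b\) and block polyhedra \(X_{k}\), the restricted master problem optimizes over convex combinations of generated extreme points of the \(X_{k}\), and pricing for block \(k\), given master duals \(\pi\) (linking constraints) and \(\sigma_{k}\) (convexity constraint), solves

\[
\min_{x_{k} \in X_{k}} \; (c_{k} - A_{k}^{\top}\pi)^{\top} x_{k} \;-\; \sigma_{k},
\]

adding the resulting extreme point as a new column whenever the optimal value is negative.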

An inexact infeasible arc-search interior-point method for linear programming problems

Inexact interior-point methods (IPMs) are IPMs that solve the linear equation system for the search direction only inexactly. Arc-search IPMs, on the other hand, approximate the central path with an ellipsoidal arc obtained by solving two linear equation systems in each iteration, whereas conventional line-search IPMs solve one linear system. Therefore, … Read more
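
For orientation (one common infeasible arc-search variant; the inexactness analysis is the paper's contribution), with residuals \(r_{b} = Ax - b\) and \(r_{c} = A^{\top}y + s - c\) the two systems define the first and second derivatives of the approximating arc,

\[
\begin{aligned}
&A\dot{x} = r_{b}, \quad A^{\top}\dot{y} + \dot{s} = r_{c}, \quad S\dot{x} + X\dot{s} = x \circ s,\\
&A\ddot{x} = 0, \quad A^{\top}\ddot{y} + \ddot{s} = 0, \quad S\ddot{x} + X\ddot{s} = -2\,\dot{x} \circ \dot{s},
\end{aligned}
\]

and the iterate then moves along the ellipsoidal arc \((x,y,s)(\alpha) = (x,y,s) - (\dot{x},\dot{y},\dot{s})\sin\alpha + (\ddot{x},\ddot{y},\ddot{s})(1-\cos\alpha)\).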