On Augmentation Algorithms for Linear and Integer-Linear Programming: From Edmonds-Karp to Bland and Beyond

Motivated by Bland’s linear-programming generalization of the renowned Edmonds-Karp efficient refinement of the Ford-Fulkerson maximum-flow algorithm, we discuss three closely related natural augmentation rules for linear and integer-linear optimization. In several nice situations, we show that polynomially many augmentation steps suffice to reach an optimum. In particular, when using “discrete steepest-descent augmentations” (i.e., directions with the best …
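To fix ideas, here is a minimal sketch (not the paper’s algorithm) of a generic augmentation loop for an integer-linear program max c^T x subject to Ax = b, l <= x <= u, x integer, assuming integer data and finite variable bounds. The choice of improving direction, which is exactly what augmentation rules govern, is left to a pluggable oracle; the names augment, max_step, and improving_direction are invented for illustration.

def max_step(x, z, l, u):
    # Largest integer t >= 0 keeping l <= x + t*z <= u (finite bounds assumed).
    caps = []
    for xi, zi, li, ui in zip(x, z, l, u):
        if zi > 0:
            caps.append((ui - xi) // zi)
        elif zi < 0:
            caps.append((xi - li) // (-zi))
    return min(caps) if caps else 0

def augment(x, l, u, improving_direction):
    # Generic augmentation loop: repeatedly take a maximal feasible step along
    # an improving direction z (with A z = 0 and c^T z > 0) returned by the
    # oracle; the oracle encodes whichever augmentation rule is being studied.
    x = list(x)
    while True:
        z = improving_direction(x)
        if z is None:                     # no improving direction: x is optimal
            return x
        t = max_step(x, z, l, u)
        if t == 0:                        # blocked by the variable bounds
            return x
        x = [xi + t * zi for xi, zi in zip(x, z)]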

A First Course in Linear Optimization, version 3.0

This is the “front matter” of a new open-source book on Linear Optimization. The book and associated Matlab/AMPL/Mathematica programs are freely available from: https://sites.google.com/site/jonleewebpage/home/publications/#book Citation: Jon Lee, “A First Course in Linear Optimization”, Third Edition, Reex Press, 2013-2017.

On feasibility based bounds tightening

Mathematical programming problems involving nonconvexities are usually solved to optimality using a (spatial) Branch-and-Bound algorithm. Algorithmic efficiency depends on many factors, among which the widths of the bounding box for the problem variables at each Branch-and-Bound node naturally play a critical role. The fastest box-tightening algorithm in practice is known as FBBT (Feasibility-Based Bounds Tightening): an …
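As a rough illustration of the idea (a sketch of one linear-constraint pass, not the paper’s contribution), FBBT-style propagation replaces all but one variable in a constraint by their best-case interval endpoints and reads off a bound on the remaining variable; real implementations iterate such passes to a fixpoint and handle general expression trees. The function name below is made up for the sketch.

def fbbt_pass(constraints, l, u):
    # One pass of feasibility-based bounds tightening on linear constraints
    # sum_j a[j]*x[j] <= b over the box [l, u]; returns tightened (l, u).
    l, u = list(l), list(u)
    for a, b in constraints:
        # minimal possible contribution of each term over the current box
        mins = [aj * (l[j] if aj >= 0 else u[j]) for j, aj in enumerate(a)]
        total_min = sum(mins)
        for i, ai in enumerate(a):
            if ai == 0:
                continue
            rest_min = total_min - mins[i]       # best case for the other terms
            if ai > 0:
                u[i] = min(u[i], (b - rest_min) / ai)
            else:
                l[i] = max(l[i], (b - rest_min) / ai)
    return l, u

# Example: x0 + x1 <= 5 on [0,10]^2 tightens both upper bounds to 5:
# fbbt_pass([([1, 1], 5)], [0, 0], [10, 10])  ->  ([0, 0], [5.0, 5.0])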

More Branch-and-Bound Experiments in Convex Nonlinear Integer Programming

Branch-and-Bound (B&B) is perhaps the most fundamental algorithm for the global solution of convex Mixed-Integer Nonlinear Programming (MINLP) problems. It is well known that carrying out branching in a non-simplistic manner can greatly enhance the practicality of B&B in the context of Mixed-Integer Linear Programming (MILP). No detailed study of branching has heretofore been carried out …
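For context only, a baseline “simplistic” rule that more sophisticated branching strategies are measured against is most-fractional branching: branch on the integer-constrained variable whose relaxation value is farthest from an integer. The sketch below is illustrative; the function name and tolerance are assumptions, and it is not the branching scheme studied in the paper.

import math

def most_fractional(x_relax, integer_vars, tol=1e-6):
    # Return the index of the integer-constrained variable whose relaxation
    # value is most fractional, or None if all of them are (nearly) integral.
    best, best_score = None, tol
    for i in integer_vars:
        frac = x_relax[i] - math.floor(x_relax[i])
        score = min(frac, 1.0 - frac)        # distance to the nearest integer
        if score > best_score:
            best, best_score = i, score
    return best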

A Probing Algorithm for MINLP with Failure Prediction by SVM

Bound tightening is an important component of algorithms for solving nonconvex Mixed Integer Nonlinear Programs. A “probing” algorithm is a bound-tightening procedure that explores the consequences of restricting a variable to a subinterval with the goal of tightening its bounds. We propose a variant of probing where exploration is based on iteratively applying a …
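To illustrate the basic mechanism only (not the variant proposed in the paper), probing on a variable tentatively restricts it to part of its domain and lets a propagation routine draw conclusions; if a subinterval is proven infeasible, the variable can be confined to the rest. The propagate callback (e.g., an FBBT-style routine returning None on proven infeasibility) and the midpoint split are assumptions made for this sketch.

def probe_variable(i, l, u, propagate):
    # Probe variable i by splitting its domain at the midpoint; any half that
    # propagation proves infeasible is removed from the variable's domain.
    l, u = list(l), list(u)
    mid = 0.5 * (l[i] + u[i])

    lo, hi = list(l), list(u)                # test the lower half [l[i], mid]
    hi[i] = mid
    if propagate(lo, hi) is None:            # lower half infeasible: raise l[i]
        l[i] = mid

    lo, hi = list(l), list(u)                # test the upper half [mid, u[i]]
    lo[i] = mid
    if propagate(lo, hi) is None:            # upper half infeasible: lower u[i]
        u[i] = mid

    return l, u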

Intractability of approximate multi-dimensional nonlinear optimization on independence systems

We consider optimization of nonlinear objective functions that balance $d$ linear criteria over $n$-element independence systems presented by linear-optimization oracles. For $d=1$, we have previously shown that an $r$-best approximate solution can be found in polynomial time. Here, using an extended Erdős-Ko-Rado theorem of Frankl, we show that for $d=2$, finding a $\rho n$-best solution …
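In symbols (a hedged restatement assembled from the abstract, with notation chosen here rather than taken from the paper), the problem is to optimize a nonlinear balancing function of $d$ linear criteria over an independence system:
\[
\max \; f(Wx) \quad \text{subject to} \quad x \in \mathcal{S} \subseteq \{0,1\}^n,
\]
where $\mathcal{S}$ is an $n$-element independence system presented by a linear-optimization oracle, $W$ is a $d \times n$ matrix whose rows give the $d$ linear criteria, and $f:\mathbb{R}^d \to \mathbb{R}$ is the nonlinear function balancing them.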

Nonlinear Integer Programming

Research efforts of the past fifty years have led to the development of linear integer programming as a mature discipline of mathematical optimization. Such a level of maturity has not been reached when one considers nonlinear systems subject to integrality requirements for the variables. This chapter is dedicated to this topic. The primary goal is …

An algorithmic framework for MINLP with separable non-convexity

Global optimization algorithms, e.g., spatial branch-and-bound approaches like those implemented in codes such as BARON and COUENNE, have had substantial success in tackling complicated, but generally small-scale, non-convex MINLPs (i.e., mixed-integer nonlinear programs having non-convex continuous relaxations). Because they are aimed at a rather general class of problems, the possibility remains that larger instances …

On convex relaxations of quadrilinear terms

The best known method to find exact or at least epsilon-approximate solutions to polynomial programming problems is the spatial Branch-and-Bound algorithm, which rests on computing lower bounds on the value of the objective function to be minimized on each region that it explores. These lower bounds are often computed by solving convex relaxations of the …
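For background (an illustration of the standard building block, not the specific quadrilinear relaxations compared in the paper), recall the McCormick envelope of a single bilinear term $w = xy$ over the box $[x^L, x^U] \times [y^L, y^U]$; relaxations of trilinear and quadrilinear terms are typically assembled by composing and strengthening such envelopes:
\[
\begin{aligned}
w &\ge x^L y + y^L x - x^L y^L, & \qquad w &\ge x^U y + y^U x - x^U y^U,\\
w &\le x^U y + y^L x - x^U y^L, & \qquad w &\le x^L y + y^U x - x^L y^U.
\end{aligned}
\]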