Optimized choice of parameters in interior-point methods for linear programming

In this work, we propose a predictor-corrector interior-point method for linear programming in a primal-dual context, where the next iterate is chosen by minimizing a polynomial merit function of three variables: the first is the steplength, the second defines the central path, and the third models the weight of a corrector direction. … Read more
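
To make the three-variable parametrization concrete, the Python sketch below shows how a trial iterate depending on a steplength, a centering weight, and a corrector weight can be scored by a simple complementarity merit and selected over a coarse grid. The direction names (dx_aff, dx_cen, dx_cor and their dual counterparts), the way they are combined, and the merit choice are assumptions for illustration, not the paper's actual merit function or minimization procedure.

    # Minimal sketch (not the authors' formulation): the trial iterate combines
    # predictor (affine-scaling), centering and corrector directions, and a
    # merit that is polynomial in (alpha, sigma, omega) is minimized on a grid.
    import itertools
    import numpy as np

    def choose_parameters(x, s, dx_aff, ds_aff, dx_cen, ds_cen, dx_cor, ds_cor,
                          alphas, sigmas, omegas):
        best, best_merit = None, np.inf
        for a, sg, w in itertools.product(alphas, sigmas, omegas):
            x_new = x + a * (dx_aff + sg * dx_cen + w * dx_cor)
            s_new = s + a * (ds_aff + sg * ds_cen + w * ds_cor)
            if (x_new <= 0).any() or (s_new <= 0).any():
                continue                      # keep strict positivity
            merit = x_new @ s_new             # complementarity as a simple merit
            if merit < best_merit:
                best, best_merit = (a, sg, w), merit
        return best, best_merit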

The Dynamic Dispatch Waves Problem for Same-Day Delivery

We study same-day delivery systems by formulating the Dynamic Dispatch Waves Problem (DDWP), which models a distribution center where delivery orders at known geographic locations are realized dynamically throughout the day. At each decision epoch (wave), the system’s operator chooses whether to dispatch a vehicle route loaded with orders ready for service, to minimize vehicle travel … Read more
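
The fragment below sketches only the wave structure of such a problem, under simplifying assumptions: a single vehicle, a hypothetical threshold dispatch rule, and a user-supplied route_cost callback. It is not the dispatch policy studied in the paper.

    # Illustrative wave loop: orders realize at each wave; the operator either
    # dispatches the orders that are ready or waits for more to accumulate.
    def simulate_waves(wave_times, arrivals_by_wave, route_cost, threshold=5):
        ready, total_cost = [], 0.0
        for t in wave_times:
            ready += arrivals_by_wave.get(t, [])   # orders realized by wave t
            if len(ready) >= threshold:            # dispatch decision at this epoch
                total_cost += route_cost(ready)    # vehicle leaves with ready orders
                ready = []                         # assume it returns before a later wave
            # otherwise wait for more orders to accumulate
        if ready:                                  # final mandatory dispatch
            total_cost += route_cost(ready)
        return total_cost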

Linear superiorization for infeasible linear programming

Linear superiorization (abbreviated: LinSup) considers linear programming (LP) problems wherein the constraints as well as the objective function are linear. It allows one to steer the iterates of a feasibility-seeking iterative process toward feasible points that have lower (not necessarily minimal) values of the objective function than points that would have been reached by the same … Read more
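
As a rough illustration of the superiorization idea (not the LinSup algorithm as specified by the authors), the sketch below perturbs a cyclic-projection feasibility-seeking sweep for Ax <= b with bounded, summable steps along -c, so that the iterates drift toward lower objective values while feasibility-seeking proceeds.

    # Minimal superiorization sketch: feasibility-seeking (projections onto the
    # half-spaces a_i^T x <= b_i) interleaved with summable objective-reducing
    # perturbations beta0 * gamma**k along -c.
    import numpy as np

    def linsup_sketch(A, b, c, x0, iters=200, beta0=1.0, gamma=0.99):
        x = x0.astype(float)
        c_dir = -c / np.linalg.norm(c)             # objective-reducing direction
        for k in range(iters):
            x = x + beta0 * gamma**k * c_dir       # superiorization perturbation
            for ai, bi in zip(A, b):               # one feasibility-seeking sweep
                viol = ai @ x - bi
                if viol > 0:                       # project onto violated half-space
                    x = x - viol * ai / (ai @ ai)
        return x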

Can linear superiorization be useful for linear optimization problems?

Linear superiorization considers linear programming problems but, instead of attempting to solve them with linear optimization methods, it employs perturbation-resilient feasibility-seeking algorithms and steers them toward reduced (not necessarily minimal) target function values. The two questions that we set out to explore experimentally are: (i) Does linear superiorization provide a feasible point whose linear … Read more

An Augmented Lagrangian Filter Method for Real-Time Embedded Optimization

We present a filter line-search algorithm for nonconvex continuous optimization that combines an augmented Lagrangian function and a constraint violation metric to accept or reject steps. The approach is motivated by real-time optimization applications that need to be executed on embedded computing platforms with limited memory and processor speed. In particular, the proposed method enables … Read more
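
A schematic version of such an acceptance test is given below. The augmented Lagrangian form, the violation metric, the acceptance margins, and the filter update are assumptions chosen for illustration, not the method's actual rules.

    # Filter acceptance sketch: a trial point is accepted if it sufficiently
    # reduces either the constraint violation or the augmented Lagrangian value
    # relative to every pair already stored in the filter.
    import numpy as np

    def aug_lagrangian(f, cons, x, lam, rho):
        c = cons(x)
        return f(x) + lam @ c + 0.5 * rho * (c @ c)

    def filter_accepts(filt, theta_new, phi_new, g_theta=1e-5, g_phi=1e-5):
        return all(theta_new <= th - g_theta * th or phi_new <= ph - g_phi * th
                   for th, ph in filt)

    def try_step(filt, f, cons, x_trial, lam, rho):
        theta = np.linalg.norm(cons(x_trial))      # constraint violation metric
        phi = aug_lagrangian(f, cons, x_trial, lam, rho)
        if filter_accepts(filt, theta, phi):
            filt.append((theta, phi))              # augment the filter, accept step
            return True
        return False                               # reject; backtrack the line search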

Block BFGS Methods

We introduce a quasi-Newton method with block updates called Block BFGS. We show that this method, performed with inexact Armijo-Wolfe line searches, converges globally and superlinearly under the same convexity assumptions as BFGS. We also show that Block BFGS is globally convergent to a stationary point when applied to non-convex functions with bounded Hessian, and … Read more
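
For reference, a generic block (multiple-secant) BFGS-type update with q curvature pairs can be written as in the sketch below, assuming S^T Y is symmetric positive definite so that the block secant condition B+ S = Y holds; the exact update and safeguards analyzed in the paper may differ.

    # Generic block BFGS-type update: replace the curvature of B on span(S)
    # with the curvature implied by the secant pairs collected in S and Y.
    import numpy as np

    def block_bfgs_update(B, S, Y):
        """B: current n x n Hessian approximation; S, Y: n x q blocks of steps
        and gradient differences. Returns B_new satisfying B_new @ S = Y
        when S.T @ Y is symmetric positive definite."""
        BS = B @ S
        B_new = (B
                 - BS @ np.linalg.solve(S.T @ BS, BS.T)   # drop old curvature on span(S)
                 + Y @ np.linalg.solve(S.T @ Y, Y.T))     # enforce block secant condition
        return B_new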