A Unified Funnel Restoration SQP Algorithm

We consider nonlinearly constrained optimization problems and discuss a generic double-loop framework consisting of four algorithmic ingredients that unifies a broad range of nonlinear optimization solvers. This framework has been implemented in the open-source solver Uno, a Swiss-army-knife-like C++ framework that unifies many nonlinearly constrained nonconvex optimization solvers. We illustrate the framework with … Read more
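
As a rough illustration of the double-loop idea only (not of Uno's actual interfaces or of its funnel restoration strategy), the sketch below composes two interchangeable ingredients, a direction-finding subproblem and a globalization test, with an Armijo line search standing in for the funnel acceptance mechanism; all function names and the toy instantiation are hypothetical.

```python
import numpy as np

def double_loop(x, subproblem, accept, max_outer=50, tol=1e-8):
    for _ in range(max_outer):
        d = subproblem(x)                      # ingredient: direction-finding subproblem
        if np.linalg.norm(d) < tol:            # nothing left to do: stop
            break
        alpha = 1.0
        while not accept(x, d, alpha):         # ingredient: globalization (inner loop)
            alpha *= 0.5
        x = x + alpha * d
    return x

# Toy instantiation: Newton direction plus an Armijo acceptance test
# (standing in for the funnel mechanism) on a strongly convex quadratic.
f = lambda x: 0.5 * x @ x - x.sum()
grad = lambda x: x - 1.0
newton_dir = lambda x: -grad(x)                          # Hessian is the identity here
armijo = lambda x, d, a: f(x + a * d) <= f(x) + 1e-4 * a * grad(x) @ d

print(double_loop(np.zeros(3), newton_dir, armijo))      # converges to [1. 1. 1.]
```

Swapping in a different subproblem or acceptance test changes the resulting method without touching the outer loop, which is the kind of unification the abstract describes.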

Probabilistic Iterative Hard Thresholding for Sparse Learning

For statistical modeling in which the data regime is unfavorable in terms of dimensionality relative to the sample size, finding hidden sparsity in the ground truth can be critical to formulating an accurate statistical model. The so-called “\(\ell_0\) norm”, which counts the number of non-zero components in a vector, is a strong and reliable mechanism for enforcing … Read more
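
For background, the classical (deterministic) iterative hard thresholding iteration is one standard way an \(\ell_0\) constraint is enforced in practice; the minimal sketch below shows only that baseline, not the probabilistic variant proposed in the paper, and the problem sizes and stepsize choice are illustrative assumptions.

```python
import numpy as np

# Classical iterative hard thresholding (IHT) for sparse least squares:
# a gradient step followed by an l0 projection that keeps the s
# largest-magnitude entries.  Baseline only; not the paper's method.

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x and zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(A, b, s, step=None, iters=300):
    """Approximately minimize ||Ax - b||^2 subject to ||x||_0 <= s."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative stepsize from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + step * A.T @ (b - A @ x), s)
    return x

# Small demo: recover a 3-sparse vector from noiseless random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [2.0, -1.5, 3.0]
print(np.nonzero(iht(A, A @ x_true, s=3))[0])    # typically recovers [5, 50, 120]
```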

Time-dependent Stackelberg Protection Location Games

We study a Stackelberg game in which a government positions rapid response teams and thereafter a terrorist attacks a location on a line segment. We assume that the damage associated with such an attack is time-dependent. We show that there exists a subgame perfect Nash equilibrium that balances the possible damage on all intervals … Read more

Optimizing with Column Generation: Advanced Branch-Cut-and-Price Algorithms (Part I)

We are excited to present the early release of Part I of our book “Optimizing with Column Generation: Advanced Branch-Cut-and-Price Algorithms”. While the book’s ultimate goal, as suggested by its subtitle, is to describe cutting-edge techniques in these algorithms, this objective is primarily addressed in the forthcoming Part II. However, we feel that the completed … Read more

Minimum-Peak-Cost Flows Over Time

Peak cost is a novel objective for flows over time that describes the amount of workforce necessary to run a system. We focus on minimising peak costs in the context of maximum temporally repeated flows and formulate the corresponding MPC-MTRF problem. First, we discuss the limitations that emerge when restricting the solution space to integral … Read more

A Two Stepsize SQP Method for Nonlinear Equality Constrained Stochastic Optimization

We develop a Sequential Quadratic Optimization (SQP) algorithm for minimizing a stochastic objective function subject to deterministic equality constraints. The method utilizes two different stepsizes: one that exclusively scales the component of the step corrupted by the variance of the stochastic gradient estimates, and a second that scales the entire step. We prove that this … Read more
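
One plausible reading of the two-stepsize idea, sketched below for an equality-constrained quadratic model with a noisy gradient estimate, splits the step into a constraint-driven normal component and a gradient-driven tangential component and lets the second stepsize scale only the latter; the decomposition, function names, and toy data are assumptions made for illustration, not the paper's exact update rule.

```python
import numpy as np
from scipy.linalg import null_space

def two_stepsize_step(x, g_stoch, c_val, J, H, alpha, beta):
    # Normal component: least-norm step that reduces the constraint violation c(x);
    # it depends only on deterministic constraint information.
    v = -J.T @ np.linalg.solve(J @ J.T, c_val)
    # Tangential component: minimize the quadratic model within the null space of J;
    # this is the part corrupted by the noise in the gradient estimate g_stoch.
    Z = null_space(J)
    u = Z @ np.linalg.solve(Z.T @ H @ Z, -Z.T @ (g_stoch + H @ v))
    # beta scales only the noise-corrupted part, alpha scales the entire step.
    return x + alpha * (v + beta * u)

# Toy data: f(x) = 0.5*||x||^2 subject to c(x) = x1 + x2 - 1 = 0, evaluated at x = 0.
x = np.zeros(2)
J = np.array([[1.0, 1.0]])                  # constraint Jacobian
H = np.eye(2)                               # Hessian model of the objective
g_noisy = x + 0.05 * np.random.default_rng(1).standard_normal(2)
print(two_stepsize_step(x, g_noisy, np.array([-1.0]), J, H, alpha=1.0, beta=0.5))
```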

Single-Loop Deterministic and Stochastic Interior-Point Algorithms for Nonlinearly Constrained Optimization

An interior-point algorithm framework is proposed, analyzed, and tested for solving nonlinearly constrained continuous optimization problems. The main setting of interest is when the objective and constraint functions may be nonlinear and/or nonconvex, and when constraint values and derivatives are tractable to compute, but objective function values and derivatives can only be estimated. The algorithm … Read more

Fast Unconstrained Optimization via Hessian Averaging and Adaptive Gradient Sampling Methods

We consider minimizing finite-sum and expectation objective functions via Hessian-averaging based subsampled Newton methods. These methods allow for gradient inexactness and have fixed per-iteration Hessian approximation costs. The recent work (Na et al. 2023) demonstrated that Hessian averaging can be utilized to achieve fast \(\mathcal{O}\left(\sqrt{\frac{\log k}{k}}\right)\) local superlinear convergence for strongly convex functions in high … Read more
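
The core averaging idea, in the spirit of Na et al. (2023), keeps a running average of the sampled Hessians so that the Hessian error decays across iterations while each step still costs one linear solve; the toy quadratic, noise model, and parameter choices below are illustrative assumptions, and the paper's adaptive gradient sampling is not shown.

```python
import numpy as np

# Hessian-averaging Newton sketch on a noisy quadratic (illustrative only).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
H_true = A @ A.T + n * np.eye(n)            # true (well-conditioned) Hessian
b = rng.standard_normal(n)
grad = lambda x: H_true @ x - b             # exact gradient of 0.5 x'Hx - b'x

x = np.zeros(n)
H_bar = np.eye(n)                           # running Hessian average
for k in range(1, 51):
    H_hat = H_true + 0.5 * rng.standard_normal((n, n))   # noisy Hessian sample
    H_hat = 0.5 * (H_hat + H_hat.T)                      # symmetrize the sample
    H_bar = (1 - 1.0 / k) * H_bar + (1.0 / k) * H_hat    # average over iterations
    x = x - np.linalg.solve(H_bar, grad(x))              # Newton-type step
print(np.linalg.norm(grad(x)))              # gradient norm driven toward zero
```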

Cover-based inequalities for the single-source capacitated facility location problem with customer preferences

The single-source capacitated facility location problem with customer preferences (SSCFLPCP) is known to be strongly NP-hard. Computational tests indicate that state-of-the-art solvers struggle to compute exact solutions. In this paper, we contribute two novel preprocessing methods which reduce the size of the considered integer programming formulation, and introduce sets of valid inequalities which decrease the … Read more
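
As standard background (and not the paper's SSCFLPCP-specific cuts), cover-based inequalities build on the classical cover inequality for a binary knapsack constraint, such as the capacity constraint of a single facility:

\[
  \sum_{j \in N} a_j x_j \le b, \;\; x \in \{0,1\}^{N}
  \qquad\Longrightarrow\qquad
  \sum_{j \in C} x_j \le |C| - 1
  \quad \text{for every cover } C \subseteq N,\ \sum_{j \in C} a_j > b,
\]

since the items of a cover cannot all be selected simultaneously without exceeding the capacity \(b\).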

Refining asymptotic complexity bounds for nonconvex optimization methods, including why steepest descent is \(o(\epsilon^{-2})\) rather than \(O(\epsilon^{-2})\)

We revisit the standard “telescoping sum” argument ubiquitous in the final steps of analyzing evaluation complexity of algorithms for smooth nonconvex optimization, and obtain a refined formulation of the resulting bound as a function of the requested accuracy eps. While bounds obtained using the standard argument typically are of the form \(O(\epsilon^{-\alpha})\) for some positive … Read more
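
For concreteness, the standard argument referred to here, for steepest descent with stepsize \(1/L\) on an \(L\)-smooth objective bounded below by \(f_{\mathrm{low}}\), telescopes the per-iteration decrease

\[
  f(x_k) - f(x_{k+1}) \;\ge\; \frac{1}{2L}\,\|\nabla f(x_k)\|^2
  \qquad\Longrightarrow\qquad
  f(x_0) - f_{\mathrm{low}} \;\ge\; \sum_{k=0}^{K-1}\bigl(f(x_k) - f(x_{k+1})\bigr) \;\ge\; \frac{K\,\epsilon^{2}}{2L},
\]

so at most \(K \le 2L\,(f(x_0) - f_{\mathrm{low}})\,\epsilon^{-2} = O(\epsilon^{-2})\) iterations can have \(\|\nabla f(x_k)\| > \epsilon\).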