Learning To Scale Mixed-Integer Programs

Many practical applications require the solution of numerically challenging linear programs (LPs) and mixed-integer programs (MIPs). Scaling is a widely used preconditioning technique that aims at reducing the error propagation of the involved linear systems, thereby improving the numerical behavior of the dual simplex algorithm and, consequently, LP-based branch-and-bound. A reliable scaling method often makes …
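As a point of reference for what scaling does, here is a minimal numpy sketch of classical geometric-mean row/column scaling, one standard scheme of this kind; it is not the method studied in the paper, and the function name and number of passes are illustrative choices.

```python
import numpy as np

def geometric_mean_scaling(A, n_passes=3):
    """Geometric-mean row/column scaling: choose diagonal factors r, c so that the
    nonzero magnitudes of diag(r) @ A @ diag(c) cluster around 1."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    r, c = np.ones(m), np.ones(n)
    for _ in range(n_passes):
        B = np.abs(A) * r[:, None] * c[None, :]
        for i in range(m):                      # row pass
            nz = B[i, B[i] > 0]
            if nz.size:
                r[i] /= np.sqrt(nz.min() * nz.max())
        B = np.abs(A) * r[:, None] * c[None, :]
        for j in range(n):                      # column pass
            nz = B[B[:, j] > 0, j]
            if nz.size:
                c[j] /= np.sqrt(nz.min() * nz.max())
    return r, c

A = np.array([[1e4, 2.0, 0.0],
              [0.0, 3e-3, 5.0]])
r, c = geometric_mean_scaling(A)
print(np.diag(r) @ A @ np.diag(c))   # entries are much closer to 1 in magnitude
```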

A Distributed and Secure Algorithm for Computing Dominant SVD Based on Projection Splitting

In this paper, we propose and study a distributed and secure algorithm for computing dominant (or truncated) singular value decompositions (SVD) of large and distributed data matrices. We consider the scenario where each node privately holds a subset of columns and only exchanges “safe” information with other nodes in a collaborative effort to calculate a …
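For intuition about column-distributed SVD where only aggregate quantities are exchanged, the following is a hedged sketch of a plain block power iteration, not the projection-splitting algorithm of the paper; each node shares only an m-by-k product formed from its own columns, never the columns themselves.

```python
import numpy as np

def distributed_dominant_svd(blocks, k, iters=200, seed=0):
    """Block power iteration for the top-k left singular subspace of A = [A_1 ... A_p].
    Each node holds one column block A_i and only shares the m-by-k aggregate
    A_i @ (A_i.T @ U) per round, never its raw columns."""
    rng = np.random.default_rng(seed)
    m = blocks[0].shape[0]
    U, _ = np.linalg.qr(rng.standard_normal((m, k)))
    for _ in range(iters):
        Z = sum(Ai @ (Ai.T @ U) for Ai in blocks)   # equals (A @ A.T) @ U
        U, _ = np.linalg.qr(Z)
    # Approximate singular values from Rayleigh quotients of A @ A.T on the columns of U.
    sigma = np.sqrt([U[:, j] @ sum(Ai @ (Ai.T @ U[:, j]) for Ai in blocks) for j in range(k)])
    return U, np.sort(sigma)[::-1]

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 120))
blocks = np.split(A, 3, axis=1)                    # three nodes, 40 columns each
U, sigma = distributed_dominant_svd(blocks, k=5)
print(sigma)
print(np.linalg.svd(A, compute_uv=False)[:5])      # centralized reference values
```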

Finding the Sequence of Largest Small n-Polygons by Numerical Optimization

LSP(n), the largest small polygon with n vertices, is the polygon of unit diameter that has maximal area A(n). It is known that for all odd values n≥3, LSP(n) is the regular n-polygon; however, this statement is not valid for even values of n. Finding the polygon LSP(n) and A(n) for even values n≥6 has …
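The generic formulation behind LSP(n) can be sketched directly: maximize the polygon area over the vertex coordinates subject to all pairwise distances being at most 1. The scipy-based local solve below only illustrates that formulation, not the paper's numerical approach, and may return the regular polygon or a better local optimum depending on the starting point.

```python
import numpy as np
from scipy.optimize import minimize

def largest_small_polygon(n, seed=0):
    """Maximize the area of an n-gon subject to unit diameter (all pairwise
    vertex distances at most 1), starting from a perturbed regular n-gon."""
    def area(x):                     # signed shoelace area, positive for CCW vertices
        p = x.reshape(n, 2)
        return 0.5 * (np.dot(p[:, 0], np.roll(p[:, 1], -1)) -
                      np.dot(p[:, 1], np.roll(p[:, 0], -1)))

    cons = [{'type': 'ineq',
             'fun': lambda x, i=i, j=j: 1.0 - np.sum((x[2*i:2*i+2] - x[2*j:2*j+2])**2)}
            for i in range(n) for j in range(i + 1, n)]

    rng = np.random.default_rng(seed)
    t = 2 * np.pi * np.arange(n) / n
    x0 = 0.45 * np.column_stack([np.cos(t), np.sin(t)]).ravel() + 0.01 * rng.standard_normal(2 * n)
    res = minimize(lambda x: -area(x), x0, constraints=cons, method='SLSQP')
    return area(res.x), res.x.reshape(n, 2)

# For n = 6 the regular hexagon has area ~0.6495, while the true optimum
# (Graham's hexagon) has area ~0.6750; a local solve may return either.
A6, vertices = largest_small_polygon(6)
print(A6)
```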

JuDGE.jl: a Julia package for optimizing capacity expansion

We present JuDGE.jl, an open-source Julia package for solving multistage stochastic capacity expansion problems using Dantzig-Wolfe decomposition. Models for JuDGE.jl are built using JuMP, the algebraic modelling language in Julia, and solved by repeatedly applying mixed-integer programming. We illustrate JuDGE.jl by formulating and solving a toy knapsack problem, and demonstrate the performance of JuDGE.jl on …
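JuDGE.jl's own JuMP-based API is not reproduced here; purely as a stand-in for the toy knapsack subproblem mentioned above, a plain dynamic-programming knapsack solve looks like this (the instance data is made up, and JuDGE.jl itself wraps such subproblems in a multistage stochastic tree).

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack by dynamic programming: best[c] = best value achievable with capacity c."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):   # go downward so each item is used at most once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Made-up toy instance: pick items maximizing value within a capacity of 10.
print(knapsack(values=[10, 7, 12, 3], weights=[4, 3, 6, 2], capacity=10))   # -> 22
```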

An improved randomized algorithm with noise level tuning for large-scale noisy unconstrained DFO problems

In this paper, a new randomized solver (called VRDFON) for noisy unconstrained derivative-free optimization (DFO) problems is discussed. A complexity result in the presence of noise is established for nonconvex functions. Two effective ingredients of VRDFON are an improved derivative-free line search algorithm with many heuristic enhancements and quadratic models in adaptively determined subspaces. Numerical results …
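To make the line-search ingredient concrete, here is a hedged sketch of a generic randomized derivative-free search with a noise-aware acceptance test; it is a simplification, not VRDFON itself, and the noise threshold and step-update rules are illustrative choices.

```python
import numpy as np

def random_dfo_search(f, x0, noise_level=1e-3, iters=200, seed=0):
    """Randomized derivative-free search with a noise-aware line search: sample a random
    direction, accept and extrapolate the step only if the observed decrease exceeds a
    threshold proportional to the noise level; shrink the trial step otherwise."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    fx = f(x)
    alpha = 1.0
    for _ in range(iters):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)
        step, fstep = alpha, f(x + alpha * d)
        if fstep < fx - 2 * noise_level:            # noise-aware sufficient decrease
            while True:                             # extrapolate while progress continues
                ftrial = f(x + 2 * step * d)
                if ftrial < fstep - 2 * noise_level:
                    step, fstep = 2 * step, ftrial
                else:
                    break
            x, fx, alpha = x + step * d, fstep, step
        else:
            alpha *= 0.5                            # unsuccessful direction: reduce the step
    return x, fx

noisy = lambda x: np.sum((x - 1.0) ** 2) + 1e-3 * np.random.randn()
x, fx = random_dfo_search(noisy, np.zeros(5))
print(x)   # approaches the noiseless minimizer (1, ..., 1)
```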

Limited-memory Common-directions Method for Large-scale Optimization: Convergence, Parallelization, and Distributed Optimization

In this paper, we present a limited-memory common-directions method for smooth optimization that interpolates between first- and second-order methods. At each iteration, a subspace of limited dimension is constructed using first-order information from previous iterations, and an efficient Newton method is deployed to find an approximate minimizer within this subspace. With …
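The subspace idea can be illustrated on a strongly convex quadratic, where the Newton step restricted to the span of recent gradients can be computed exactly; this sketch follows the general common-directions idea but is not the paper's algorithm and omits all parallel and distributed machinery.

```python
import numpy as np

def subspace_newton_quadratic(A, b, m=5, iters=50):
    """Common-directions illustration on the strongly convex quadratic
    f(x) = 0.5 x'Ax - b'x: keep the last m gradients as a subspace P and take
    the exact Newton step of f restricted to span(P)."""
    n = A.shape[0]
    x = np.zeros(n)
    dirs = []
    for _ in range(iters):
        g = A @ x - b
        dirs.append(g)
        P = np.column_stack(dirs[-m:])           # limited-memory subspace
        H = P.T @ A @ P                          # reduced Hessian (at most m x m)
        t = np.linalg.lstsq(H, -P.T @ g, rcond=None)[0]
        x = x + P @ t                            # step confined to span(P)
    return x

rng = np.random.default_rng(1)
M = rng.standard_normal((60, 40))
A = M.T @ M + np.eye(40)                         # well-conditioned SPD matrix
b = rng.standard_normal(40)
x = subspace_newton_quadratic(A, b)
print(np.linalg.norm(x - np.linalg.solve(A, b)))  # distance to the exact minimizer
```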

A Two-level ADMM Algorithm for AC OPF with Convergence Guarantees

This paper proposes a two-level distributed algorithmic framework for solving the AC optimal power flow (OPF) problem with convergence guarantees. The presence of highly nonconvex constraints in OPF poses significant challenges to distributed algorithms based on the alternating direction method of multipliers (ADMM). In particular, convergence is not provably guaranteed for nonconvex network optimization problems …
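As background for the splitting structure, here is a sketch of vanilla single-level consensus ADMM on a convex distributed least-squares problem; the paper's two-level scheme and the nonconvex AC OPF constraints are not reproduced here.

```python
import numpy as np

def consensus_admm(As, bs, rho=1.0, iters=300):
    """Consensus ADMM for min_x sum_i ||A_i x - b_i||^2: region i keeps a local copy x_i
    and a dual y_i; a coordination step averages the copies into the global z."""
    n = As[0].shape[1]
    xs = [np.zeros(n) for _ in As]
    ys = [np.zeros(n) for _ in As]
    z = np.zeros(n)
    for _ in range(iters):
        for i, (A, b) in enumerate(zip(As, bs)):      # local augmented-Lagrangian solves
            xs[i] = np.linalg.solve(2 * A.T @ A + rho * np.eye(n),
                                    2 * A.T @ b + rho * z - ys[i])
        z = np.mean([x + y / rho for x, y in zip(xs, ys)], axis=0)   # coordination step
        for i in range(len(As)):                      # dual updates enforce consensus
            ys[i] += rho * (xs[i] - z)
    return z

rng = np.random.default_rng(0)
As = [rng.standard_normal((30, 5)) for _ in range(3)]
bs = [rng.standard_normal(30) for _ in range(3)]
z = consensus_admm(As, bs)
A_all, b_all = np.vstack(As), np.concatenate(bs)
print(np.linalg.norm(z - np.linalg.lstsq(A_all, b_all, rcond=None)[0]))  # should be small
```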

The confined primal integral

It is a challenging task to fairly compare local solvers and heuristics against each other and against global solvers. How does one weigh a faster termination time against a better quality of the found solution? In this paper, we introduce the confined primal integral, a new performance measure that rewards a balance of speed and …
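For context, the classical (unconfined) primal integral of a minimization run can be computed from the incumbent history as below; the confined variant introduced in the paper modifies this quantity, and the event data in the example is made up.

```python
def primal_gap(incumbent, opt):
    """Scaled primal gap in [0, 1]: 1 with no incumbent or a sign mismatch, 0 at the optimum."""
    if incumbent is None:
        return 1.0
    if incumbent == opt == 0.0:
        return 0.0
    if incumbent * opt < 0:
        return 1.0
    return abs(incumbent - opt) / max(abs(incumbent), abs(opt))

def primal_integral(events, opt, t_end):
    """Integrate the primal gap of a minimization run over [0, t_end];
    `events` lists (time, incumbent objective) pairs in chronological order."""
    total, t_prev, inc = 0.0, 0.0, None
    for t, val in events:
        total += primal_gap(inc, opt) * (t - t_prev)
        t_prev, inc = t, val
    total += primal_gap(inc, opt) * (t_end - t_prev)
    return total

# Incumbents 12.0 at t=1s and 10.5 at t=4s; proven optimum 10.0; time limit 10s.
print(primal_integral([(1.0, 12.0), (4.0, 10.5)], opt=10.0, t_end=10.0))
```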

ROC++: Robust Optimization in C++

Over the last two decades, robust optimization has emerged as a popular means to address decision-making problems affected by uncertainty. This includes single- and multi-stage problems involving real-valued and/or binary decisions, and affected by exogenous (decision-independent) and/or endogenous (decision-dependent) uncertain parameters. Robust optimization techniques rely on duality theory potentially augmented with approximations to transform a …
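This is not the ROC++ API; as a small illustration of the reformulation idea the abstract alludes to, the sketch below builds the classical robust counterpart of a single linear constraint under box (interval) uncertainty and solves it with scipy, using made-up data.

```python
import numpy as np
from scipy.optimize import linprog

# Robust constraint a'x <= b for all a with |a - a_bar| <= delta (componentwise).
# Classical reformulation: introduce u >= |x| and require a_bar'x + delta'u <= b.
c = np.array([-1.0, -2.0])               # minimize c'x, i.e. maximize x1 + 2*x2
a_bar = np.array([1.0, 1.0])
delta = np.array([0.2, 0.5])
b = 4.0

n = len(c)
c_ext = np.concatenate([c, np.zeros(n)])          # decision vector is [x, u]
A_ub = np.vstack([
    np.concatenate([a_bar, delta]),               # a_bar'x + delta'u <= b
    np.hstack([np.eye(n), -np.eye(n)]),           #  x - u <= 0
    np.hstack([-np.eye(n), -np.eye(n)]),          # -x - u <= 0
])
b_ub = np.concatenate([[b], np.zeros(2 * n)])
res = linprog(c_ext, A_ub=A_ub, b_ub=b_ub,
              bounds=[(-3, 3)] * n + [(0, None)] * n)
print(res.x[:n], -res.fun)                        # robust-feasible x and its objective value
```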

Manifold Identification for Ultimately Communication-Efficient Distributed Optimization

This work proposes a progressive manifold identification approach for distributed optimization with sound theoretical justifications to greatly reduce both the rounds of communication and the bytes communicated per round for partly-smooth regularized problems such as the $\ell_1$- and group-LASSO-regularized ones. Our two-stage method first uses an inexact proximal quasi-Newton method to iteratively identify a sequence …
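To give a feel for manifold (support) identification, here is a hedged two-stage sketch for a LASSO problem: proximal gradient iterations identify the sparsity pattern, after which the smooth reduced problem on that support is solved; in a distributed setting this is what allows communication to shrink to the identified coordinates. It is a simplification, not the paper's inexact proximal quasi-Newton method.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def two_stage_lasso(A, b, lam, stage1_iters=500):
    """Two-stage sketch for min 0.5*||Ax - b||^2 + lam*||x||_1: stage 1 (proximal gradient)
    identifies the sparsity pattern; stage 2 solves the smooth reduced problem on that
    support, which is the only part a distributed solver would still communicate about."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the smooth part
    x = np.zeros(n)
    for _ in range(stage1_iters):                  # stage 1: identification
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    support = np.flatnonzero(x)                    # identified manifold (support and signs)
    signs = np.sign(x[support])
    # Stage 2: first-order optimality on the fixed support gives a small linear system.
    AS = A[:, support]
    xS = np.linalg.lstsq(AS.T @ AS, AS.T @ b - lam * signs, rcond=None)[0]
    x_final = np.zeros(n)
    x_final[support] = xS
    return x_final, support

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = 3.0 * rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat, support = two_stage_lasso(A, b, lam=5.0)
print(support)                                     # ideally the indices of the true nonzeros
```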