libDIPS — Discretization-Based Semi-Infinite and Bilevel Programming Solvers

We consider several hierarchical optimization programs: (generalized) semi-infinite and existence-constrained semi-infinite programs, minmax, and bilevel programs. Multiple adaptive discretization-based algorithms have been published for these program classes in recent decades. However, rigorous numerical performance comparisons between these algorithms are lacking. Indeed, if numerical comparisons are provided at all, they usually compare a small selection of … Read more
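
For orientation, and as standard background rather than a result of the paper, the program class and the adaptive discretization idea can be sketched as follows: a semi-infinite program minimizes f over x subject to infinitely many constraints,

    \min_{x \in X} f(x) \quad \text{s.t.} \quad g(x, y) \le 0 \ \ \text{for all } y \in Y,

and a discretization method solves the finite program obtained by replacing Y with a finite subset Y_k, then adaptively enlarges the discretization with a point of maximal violation,

    y_{k+1} \in \arg\max_{y \in Y} g(x_k, y), \qquad Y_{k+1} = Y_k \cup \{ y_{k+1} \},

terminating once \max_{y \in Y} g(x_k, y) falls below a tolerance.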

ODTlearn: A Package for Learning Optimal Decision Trees for Prediction and Prescription

ODTLearn is an open-source Python package that provides methods for learning optimal decision trees for high-stakes predictive and prescriptive tasks based on the mixed-integer optimization (MIO) framework proposed in Aghaei et al. (2019) and several of its extensions. The current version of the package provides implementations for learning optimal classification trees, optimal fair classification trees, … Read more
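
The following usage sketch is an assumption-laden illustration only: the import path, class name, and parameter names are recalled from the ODTLearn project documentation and should be verified against it before use.

    import numpy as np
    from odtlearn.flow_oct import FlowOCT  # assumed import path and class name; verify against the docs

    # Tiny toy data set with binary features and binary labels.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 1, 1, 0])

    # The parameter names (solver, depth, time_limit) are assumptions based on the documentation.
    clf = FlowOCT(solver="cbc", depth=2, time_limit=60)
    clf.fit(X, y)          # solves the underlying mixed-integer program
    print(clf.predict(X))  # predictions from the learned tree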

MOSDEX: A New Standard for Data Exchange with Optimization Solvers

This paper offers a new standard, called MOSDEX (Mathematical Optimization Solver Data EXchange), for managing the interaction of data with solvers for mathematical optimization. The rationale for this standard is to take advantage of modern software tools that can efficiently handle very large datasets that have become the norm for data analytics in the past … Read more

Asynchronous Iterations in Optimization: New Sequence Results and Sharper Algorithmic Guarantees

We introduce novel convergence results for asynchronous iterations that appear in the analysis of parallel and distributed optimization algorithms. The results are simple to apply and give explicit estimates for how the degree of asynchrony impacts the convergence rates of the iterates. Our results shorten, streamline and strengthen existing convergence proofs for several asynchronous optimization … Read more
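
As a generic illustration of the kind of iteration such results cover (not the paper's specific model), a gradient step executed with stale information and a bounded delay \tau_k \le \tau reads

    x_{k+1} = x_k - \gamma\, \nabla f(x_{k - \tau_k}), \qquad 0 \le \tau_k \le \tau,

and the convergence rate then deteriorates as an explicit function of the maximal delay \tau, which quantifies the degree of asynchrony.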

Solving low-rank semidefinite programs via manifold optimization

We propose a manifold optimization approach to solve linear semidefinite programs (SDPs) with low-rank solutions. This approach incorporates the augmented Lagrangian method and the Burer-Monteiro factorization, and features adaptive strategies for updating the factorization size and the penalty parameter. We prove that the proposed algorithm can solve SDPs to global optimality, despite the … Read more
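
For context, and as standard background rather than the paper's contribution, the Burer-Monteiro factorization replaces the matrix variable of a linear SDP by a low-rank factor, and an augmented Lagrangian is formed for the factorized problem:

    \min_{X \succeq 0} \langle C, X \rangle \ \ \text{s.t.}\ \mathcal{A}(X) = b
    \quad\longrightarrow\quad
    \min_{Y \in \mathbb{R}^{n \times p}} \langle C, YY^{\top} \rangle \ \ \text{s.t.}\ \mathcal{A}(YY^{\top}) = b,

    L_{\rho}(Y, \lambda) = \langle C, YY^{\top} \rangle - \lambda^{\top}\bigl(\mathcal{A}(YY^{\top}) - b\bigr) + \tfrac{\rho}{2}\,\bigl\| \mathcal{A}(YY^{\top}) - b \bigr\|^2,

where the factorization size p and the penalty parameter \rho are the quantities updated adaptively.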

Effective matrix adaptation strategy for noisy derivative-free optimization

In this paper, we introduce a new effective matrix adaptation evolution strategy (MADFO) for noisy derivative-free optimization problems. Like every MAES solver, MADFO consists of three phases: mutation, selection and recombination. MADFO improves the mutation phase by generating good step sizes, neither too small nor too large, that increase the probability of selecting mutation points … Read more
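
To make the three phases concrete, here is a deliberately simplified, generic (mu, lambda) evolution-strategy loop in Python; it is not the MADFO algorithm and omits its matrix adaptation, step-size control, and noise handling.

    import numpy as np

    def sphere(x):
        # Simple smooth test objective.
        return float(np.dot(x, x))

    def simple_es(f, x0, sigma=0.3, lam=12, mu=4, iters=200, seed=0):
        """Generic (mu, lambda) evolution strategy: mutation -> selection -> recombination."""
        rng = np.random.default_rng(seed)
        mean = np.asarray(x0, dtype=float)
        for _ in range(iters):
            # Mutation: sample lam candidates around the current mean.
            candidates = mean + sigma * rng.standard_normal((lam, mean.size))
            # Selection: keep the mu best candidates by objective value.
            order = np.argsort([f(c) for c in candidates])
            parents = candidates[order[:mu]]
            # Recombination: the new mean is the average of the selected parents.
            mean = parents.mean(axis=0)
            sigma *= 0.98  # crude step-size decay; MADFO adapts step sizes far more carefully
        return mean

    x_best = simple_es(sphere, np.ones(5))
    print(x_best, sphere(x_best))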

Model-Based Derivative-Free Optimization Methods and Software

This thesis studies derivative-free optimization (DFO), particularly model-based methods and software. These methods are motivated by optimization problems for which it is impossible or prohibitively expensive to access first-order information about the objective function and possibly the constraint functions. In particular, this thesis presents PDFO, a package we develop to provide both MATLAB and Python … Read more

PDFO: A Cross-Platform Package for Powell’s Derivative-Free Optimization Solvers

The late Professor M. J. D. Powell devised five trust-region derivative-free optimization methods, namely COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. He also carefully implemented them into publicly available solvers, which are renowned for their robustness and efficiency. However, the solvers were implemented in Fortran 77 and hence may not be easily accessible to some users. … Read more
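
A minimal usage sketch follows, assuming the Python interface exposes a pdfo entry point with a scipy.optimize.minimize-style signature as described in the package documentation; the method and option names should be checked against the PDFO docs.

    import numpy as np
    from pdfo import pdfo  # assumes the `pdfo` package from PyPI is installed

    def chained_rosenbrock(x):
        # Smooth test objective; the solvers use only function values, no derivatives.
        x = np.asarray(x, dtype=float)
        return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

    x0 = np.zeros(4)
    # The `method` and `options` keys below are assumptions to verify against the documentation.
    res = pdfo(chained_rosenbrock, x0, method="newuoa", options={"maxfev": 2000})
    print(res.x, res.fun)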

Strengthening SONC Relaxations with Constraints Derived from Variable Bounds

Nonnegativity certificates can be used to obtain tight dual bounds for polynomial optimization problems. Hierarchies of certificate-based relaxations ensure convergence to the global optimum, but higher levels of such hierarchies can become very computationally expensive, and the well-known sums of squares hierarchies scale poorly with the degree of the polynomials. This has motivated research into … Read more
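
As standard background (not specific to this paper), a nonnegativity certificate yields a dual bound for polynomial minimization: for p^{*} = \min_{x \in \mathbb{R}^n} p(x),

    p^{*} \ \ge\ \sup\{\, \lambda \in \mathbb{R} \ :\ p - \lambda \in \mathcal{C} \,\},

where \mathcal{C} is a cone of certifiably nonnegative polynomials such as sums of squares (SOS) or sums of nonnegative circuits (SONC); the choice of cone governs the trade-off between the tightness of the bound and the cost of computing it.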

Orbital Crossover

Symmetry has long been known to wreak havoc in optimization algorithms, and some of the hardest instances are highly symmetric. This is not the case in linear programming, as symmetry allows one to reduce the size of the problem, possibly dramatically, while still maintaining the same optimal objective value. This is done by aggregating … Read more
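
The aggregation rests on a standard observation (background, not the paper's contribution): if a linear program \min\{ c^{\top} x : Ax \ge b \} is invariant under a permutation group G acting on the variables, then the group average of any optimal solution x^{*},

    \bar{x} = \frac{1}{|G|} \sum_{g \in G} g \cdot x^{*},

is feasible by convexity and attains the same objective value, so the LP can be restricted to the fixed subspace where all variables in the same orbit take a single aggregated value.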