Stochastic mesh adaptive direct search for blackbox optimization using probabilistic estimates

We present a stochastic extension of the mesh adaptive direct search (MADS) algorithm originally developed for deterministic blackbox optimization. The algorithm, called StoMADS, considers the unconstrained optimization of an objective function f whose values can be computed only through a blackbox corrupted by some random noise following an unknown distribution. The proposed method is based … Read more
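As a rough illustration of the kind of probabilistic estimate such a method can rely on, the sketch below averages repeated noisy blackbox evaluations and accepts a candidate only under a sufficient-decrease test tied to the frame size. The sample size `n_samples`, the constant `gamma`, and the toy blackbox are illustrative assumptions, not the StoMADS construction itself.

```python
import numpy as np

def estimate(noisy_blackbox, x, n_samples=30, rng=None):
    """Monte-Carlo estimate of f(x) from repeated noisy blackbox calls.
    Illustrative stand-in for a probabilistic estimate, not the paper's."""
    rng = np.random.default_rng() if rng is None else rng
    values = [noisy_blackbox(x, rng) for _ in range(n_samples)]
    return float(np.mean(values))

def sufficient_decrease(f_est_new, f_est_old, frame_size, gamma=0.1):
    """Accept the candidate only if the estimated decrease is large relative
    to the square of the frame size (hypothetical acceptance test)."""
    return f_est_new <= f_est_old - gamma * frame_size**2

# Toy noisy blackbox: a quadratic plus Gaussian noise of unknown scale.
def noisy_blackbox(x, rng):
    return float(np.sum(np.asarray(x)**2) + rng.normal(scale=0.5))

rng = np.random.default_rng(0)
x_old, x_new = np.array([1.0, 1.0]), np.array([0.5, 0.8])
f_old = estimate(noisy_blackbox, x_old, rng=rng)
f_new = estimate(noisy_blackbox, x_new, rng=rng)
print(sufficient_decrease(f_new, f_old, frame_size=0.25))
```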

Performance indicators in multiobjective optimization

In recent years, the development of new algorithms for multiobjective optimization has grown considerably. A large number of performance indicators have been introduced to measure the quality of Pareto front approximations produced by these algorithms. In this work, we propose a review of a total of 63 performance indicators partitioned into four groups according to … Read more
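As a concrete example of what such an indicator measures, the sketch below computes the hypervolume of a two-objective (minimization) Pareto front approximation with respect to a reference point. It illustrates one classical indicator only and is not code taken from the survey.

```python
import numpy as np

def hypervolume_2d(points, ref):
    """Hypervolume of a two-objective minimization front with respect to a
    reference point dominated by every point (larger is better).
    Simple O(n^2) sketch restricted to two objectives."""
    pts = np.asarray(points, dtype=float)
    # Keep only non-dominated points.
    keep = [p for p in pts
            if not any(np.all(q <= p) and np.any(q < p) for q in pts)]
    front = np.array(sorted(keep, key=lambda p: p[0]))   # ascending in f1
    hv = 0.0
    for i, (f1, f2) in enumerate(front):
        next_f1 = front[i + 1, 0] if i + 1 < len(front) else ref[0]
        hv += (next_f1 - f1) * (ref[1] - f2)
    return hv

front = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
print(hypervolume_2d(front, ref=(5.0, 5.0)))   # -> 11.0
```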

Selection of variables in parallel space decomposition for the mesh adaptive direct search algorithm

The parallel space decomposition of the Mesh Adaptive Direct Search algorithm (PSDMADS, proposed in 2008) is an asynchronous parallel method for constrained derivative-free optimization with a large number of variables. It uses a simple generic strategy to decompose a problem into subproblems of smaller dimension. The present work explores new strategies for selecting the subsets of variables defining … Read more
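The sketch below illustrates the generic decomposition idea: all variables outside a chosen subset are fixed at the incumbent values, yielding a lower-dimensional subproblem. The random selection rule shown is only one hypothetical strategy, not one of those studied in this work.

```python
import numpy as np

def make_subproblem(full_obj, x_incumbent, subset):
    """Build a lower-dimensional objective in which only the variables in
    `subset` are free and all others are fixed at the incumbent values.
    Illustrative sketch of the decomposition idea, not the PSDMADS code."""
    x_fixed = np.array(x_incumbent, dtype=float)

    def sub_obj(y):
        x = x_fixed.copy()
        x[subset] = y
        return full_obj(x)

    return sub_obj

# Toy usage: a 10-variable quadratic; a worker optimizes 3 random coordinates.
rng = np.random.default_rng(1)
full_obj = lambda x: float(np.sum((x - 1.0) ** 2))
x_inc = np.zeros(10)
subset = rng.choice(10, size=3, replace=False)   # one possible selection rule
sub_obj = make_subproblem(full_obj, x_inc, subset)
print(sub_obj(np.ones(3)))   # 7.0: the three free variables are now optimal
```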

The Mesh Adaptive Direct Search Algorithm for Granular and Discrete Variables

The mesh adaptive direct search (Mads) algorithm is designed for blackbox optimization problems for which the functions defining the objective and the constraints are typically the outputs of a simulation seen as a blackbox. It is a derivative-free optimization method designed for continuous variables and is supported by a convergence analysis based on the Clarke … Read more
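A granular variable is restricted to multiples of a fixed granularity (for example, a price in cents, or an integer). As a minimal illustration of the notion, the sketch below projects a trial point onto such a granular mesh; the granularities used are made up for the example and this is not the paper's mesh update.

```python
import numpy as np

def project_to_granular_mesh(x, granularity):
    """Round each coordinate to the nearest multiple of its granularity
    (a granularity of 0 means the variable is continuous and left untouched)."""
    x = np.asarray(x, dtype=float)
    g = np.asarray(granularity, dtype=float)
    out = x.copy()
    mask = g > 0
    out[mask] = np.round(x[mask] / g[mask]) * g[mask]
    return out

# Variable 0 is continuous, variable 1 has granularity 0.01 (e.g. a price),
# variable 2 is an integer (granularity 1).
print(project_to_granular_mesh([0.12345, 0.12345, 2.7], [0.0, 0.01, 1.0]))
# -> [0.12345, 0.12, 3.0]
```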

Efficient solution of quadratically constrained quadratic subproblems within the MADS algorithm

The Mesh Adaptive Direct Search algorithm (MADS) is an iterative method for constrained blackbox optimization problems. One of the optional MADS features is a versatile search step in which quadratic models are built leading to a series of quadratically constrained quadratic subproblems. This work explores different algorithms that exploit the structure of the quadratic models: … Read more
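As an illustration of what one such subproblem looks like, the sketch below minimizes a quadratic model of the objective subject to a quadratic model of a constraint inside a box trust region, handing it to a generic SLSQP solver rather than the structure-exploiting algorithms explored in the paper; all model coefficients are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def quad_model(c, g, H):
    """q(x) = c + g.T x + 0.5 x.T H x, a quadratic model around the incumbent."""
    g, H = np.asarray(g, float), np.asarray(H, float)
    return lambda x: float(c + g @ x + 0.5 * x @ H @ x)

# Hypothetical quadratic models of the objective and one constraint c(x) <= 0.
obj = quad_model(0.0, [-1.0, -2.0], [[2.0, 0.0], [0.0, 4.0]])
con = quad_model(-1.0, [1.0, 1.0], [[1.0, 0.0], [0.0, 1.0]])

# Quadratically constrained quadratic subproblem inside a box trust region.
res = minimize(obj, x0=np.zeros(2), method="SLSQP",
               constraints=[{"type": "ineq", "fun": lambda x: -con(x)}],
               bounds=[(-1.0, 1.0), (-1.0, 1.0)])
print(res.x, res.fun)
```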

Locally weighted regression models for surrogate-assisted design optimization

Locally weighted regression combines the advantages of polynomial regression and kernel smoothing. We present three ideas for appropriate and effective use of LOcally WEighted Scatterplot Smoothing (LOWESS) models for surrogate optimization. First, a method is proposed to reduce the computational cost of LOWESS models. Second, a local scaling coefficient is introduced to adapt LOWESS models … Read more
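A minimal sketch of the LOWESS idea follows: tricube-kernel weights on nearby sample points, then a weighted least-squares affine fit at the query point. It does not reproduce the cost-reduction or scaling ideas of the paper; the bandwidth and toy data are assumptions.

```python
import numpy as np

def lowess_predict(x_query, X, y, bandwidth=1.0):
    """Locally weighted linear regression at one query point: tricube-kernel
    weights on nearby samples, then a weighted least-squares affine fit."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    x_query = np.asarray(x_query, float)
    d = np.linalg.norm(X - x_query, axis=1) / bandwidth
    w = np.where(d < 1.0, (1.0 - d**3) ** 3, 0.0)          # tricube kernel
    sw = np.sqrt(w)                                        # weighted LSQ scaling
    A = np.hstack([np.ones((len(X), 1)), X])               # affine basis [1, x]
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return float(np.concatenate(([1.0], x_query)) @ beta)

# Toy usage: noisy samples of sin(x) on [0, 3].
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.05, size=40)
print(lowess_predict([1.5], X, y))   # close to sin(1.5) ≈ 0.997
```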

Robust optimization of noisy blackbox problems using the Mesh Adaptive Direct Search algorithm

Blackbox optimization problems are often contaminated with numerical noise, and direct search methods such as the Mesh Adaptive Direct Search (MADS) algorithm may get stuck at solutions artificially created by the noise. We propose a way to smooth out the objective function of an unconstrained problem using function values from previously evaluated points, rather than resampling points. … Read more
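The sketch below conveys the general flavor of reusing cached evaluations: the smoothed objective at a point is a kernel-weighted average of function values already stored in the cache, with no new blackbox calls. The Gaussian kernel and bandwidth are illustrative choices, not the paper's smoothing scheme.

```python
import numpy as np

def smoothed_value(x, cache_points, cache_values, bandwidth=0.5):
    """Gaussian-kernel-weighted average of previously evaluated blackbox
    values near x; reuses the evaluation cache instead of resampling."""
    P = np.asarray(cache_points, float)
    v = np.asarray(cache_values, float)
    d2 = np.sum((P - np.asarray(x, float)) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / bandwidth**2)
    return float(w @ v / np.sum(w))

# Toy cache built from a noisy quadratic blackbox.
rng = np.random.default_rng(0)
pts = rng.uniform(-2.0, 2.0, size=(200, 2))
vals = np.sum(pts**2, axis=1) + rng.normal(scale=0.3, size=200)
print(smoothed_value([0.0, 0.0], pts, vals))   # smoothed estimate at the origin
```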

A derivative-free trust-region augmented Lagrangian algorithm

We present a new derivative-free trust-region (DFTR) algorithm to solve general nonlinear constrained problems with the use of an augmented Lagrangian method. No derivatives are used, either for the objective function or for the constraints. An augmented Lagrangian method, known as an effective tool to solve equality and inequality constrained optimization problems with derivatives, is … Read more
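For reference, the sketch below writes out the classical augmented Lagrangian merit function and the first-order multiplier update for equality constraints, the standard device the abstract alludes to; the derivative-free trust-region machinery itself is not shown, and the toy problem is made up.

```python
import numpy as np

def augmented_lagrangian(f, c_eq, lam, mu):
    """L_A(x) = f(x) + lam.T c(x) + (mu/2) ||c(x)||^2 for equality
    constraints c(x) = 0: the classical augmented Lagrangian merit function."""
    def L(x):
        cx = np.asarray(c_eq(x), float)
        return float(f(x) + lam @ cx + 0.5 * mu * cx @ cx)
    return L

def update_multipliers(lam, mu, cx):
    """First-order multiplier update lam <- lam + mu * c(x_k) applied
    between (derivative-free) subproblem solves; sketch only."""
    return lam + mu * np.asarray(cx, float)

# Toy problem: minimize x0 + x1 subject to x0^2 + x1^2 - 2 = 0.
f = lambda x: x[0] + x[1]
c = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0])
lam, mu = np.zeros(1), 10.0
L = augmented_lagrangian(f, c, lam, mu)
print(L(np.array([-1.0, -1.0])))   # at the solution the penalty term vanishes
```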

A progressive barrier derivative-free trust-region algorithm for constrained optimization

We study derivative-free constrained optimization problems and propose a trust-region method that builds linear or quadratic models around the best feasible and the best infeasible solutions found so far. These models are optimized within a trust region, and the progressive barrier methodology handles the constraints by progressively pushing the infeasible solutions toward the … Read more
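The sketch below illustrates two ingredients named above: an aggregate constraint-violation function h and the pair of incumbents (best feasible, least infeasible under a barrier threshold). The acceptance rule shown is a simplified stand-in; the actual progressive barrier relies on nondominated comparisons of objective and violation.

```python
import numpy as np

def violation(cx):
    """Aggregate constraint violation h(x) = sum_j max(0, c_j(x))^2 for
    constraints c_j(x) <= 0; h(x) = 0 exactly when x is feasible."""
    return float(np.sum(np.maximum(0.0, np.asarray(cx, float)) ** 2))

def update_incumbents(points, f, c, h_max):
    """Track the best feasible point and the least infeasible point among
    those whose violation stays below the current barrier threshold h_max.
    Simplified stand-in for the progressive barrier bookkeeping."""
    best_feas, best_infeas = None, None
    for x in points:
        fx, hx = f(x), violation(c(x))
        if hx == 0.0:
            if best_feas is None or fx < best_feas[1]:
                best_feas = (x, fx)
        elif hx <= h_max:
            if best_infeas is None or (hx, fx) < (best_infeas[2], best_infeas[1]):
                best_infeas = (x, fx, hx)
    return best_feas, best_infeas

# Toy problem: minimize x0 + x1 subject to 0.5 - x0 <= 0 (i.e. x0 >= 0.5).
f = lambda x: x[0] + x[1]
c = lambda x: np.array([0.5 - x[0]])
pts = [np.array(p) for p in [(0.6, 0.1), (0.2, -0.5), (0.9, -0.3)]]
print(update_incumbents(pts, f, c, h_max=0.2))
```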

Order-based error for managing ensembles of surrogates in derivative-free optimization

We investigate surrogate-assisted strategies for derivative-free optimization using the mesh adaptive direct search (MADS) blackbox optimization algorithm. In particular, we build an ensemble of surrogate models to be used within the search step of MADS, and examine different methods for selecting the best model for the problem at hand. To do so, we introduce … Read more
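As a simplified stand-in for the metrics introduced here, the sketch below scores a surrogate by the fraction of point pairs it orders differently from the true objective on the cache of already-evaluated points, and picks the ensemble member with the smallest such error. The toy ensemble and data are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations

def order_error(surrogate_vals, true_vals):
    """Fraction of point pairs whose relative order under the surrogate
    disagrees with their order under the true (blackbox) values."""
    s, t = np.asarray(surrogate_vals, float), np.asarray(true_vals, float)
    pairs = list(combinations(range(len(t)), 2))
    wrong = sum(1 for i, j in pairs if (s[i] - s[j]) * (t[i] - t[j]) < 0)
    return wrong / len(pairs)

def select_surrogate(ensemble, X_cache, f_cache):
    """Pick the ensemble member with the smallest order-based error on the
    cache of already-evaluated points (illustrative selection rule)."""
    errors = [order_error([m(x) for x in X_cache], f_cache) for m in ensemble]
    return int(np.argmin(errors)), errors

# Toy ensemble: two crude surrogates of f(x) = ||x||^2 on cached points.
rng = np.random.default_rng(0)
X = list(rng.uniform(-1.0, 1.0, size=(30, 2)))
fvals = [float(x @ x) for x in X]
ensemble = [lambda x: float(np.sum(np.abs(x))),        # model 1
            lambda x: float(x[0])]                     # model 2
print(select_surrogate(ensemble, X, fvals))
```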