Linear equalities in blackbox optimization

The Mesh Adaptive Direct Search (Mads) algorithm is designed for blackbox optimization problems subject to general inequality constraints. Currently, Mads does not support equalities, either in theory or in practice. The present work proposes extensions to treat problems with linear equalities whose expression is known. The main idea consists of reformulating the optimization problem into …
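
The abstract is truncated, but one standard way to recast min f(x) subject to known linear equalities Ax = b is a null-space parameterization x = x0 + Zy, after which only inequality constraints remain and the problem lives in a lower-dimensional space. A minimal sketch of that device, assuming numpy and a rank tolerance chosen for illustration (not necessarily the paper's exact construction):

```python
import numpy as np

def nullspace_reformulation(f, A, b, tol=1e-12):
    """Reduce min f(x) s.t. Ax = b to a problem in fewer variables via
    the parameterization x = x0 + Z y. Sketch only, not the paper's code."""
    # Particular solution of Ax = b (least-squares solve).
    x0 = np.linalg.lstsq(A, b, rcond=None)[0]
    # Orthonormal basis Z for the null space of A, from the SVD.
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    Z = Vt[rank:].T                    # columns span {x : Ax = 0}
    # Every y maps to an x satisfying the equalities exactly.
    reduced_f = lambda y: f(x0 + Z @ y)
    return reduced_f, x0, Z
```

Any iterate y produced on the reduced problem corresponds to a feasible x = x0 + Zy, so the equalities never have to be handled by the direct-search solver itself.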

Dynamic scaling in the Mesh Adaptive Direct Search algorithm for blackbox optimization

Blackbox optimization deals with situations in which the objective function and constraints are typically computed by launching a time-consuming computer simulation. The subject of this work is the Mesh Adaptive Direct Search (MADS) class of algorithms for blackbox optimization. We propose a way to dynamically scale the mesh, which is the discrete spatial structure …
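
The mesh is the discrete grid on which trial points are generated; scaling it amounts to using a different spacing per variable. A toy illustration of what per-coordinate mesh scaling looks like at the poll step (the paper's contribution is the dynamic update of the scaling vector, which is not reproduced here):

```python
import numpy as np

def scaled_poll(x, delta, scale, directions):
    """Poll points on a mesh whose spacing differs per coordinate.
    `scale` is a positive vector of per-variable factors and `delta`
    the mesh size parameter. Illustrative sketch only."""
    return [x + delta * scale * d for d in directions]

# Example: coordinate directions, with variable 0 on a 100x finer grid
x = np.array([1.0, 5.0])
dirs = [np.array(d, dtype=float) for d in ([1, 0], [-1, 0], [0, 1], [0, -1])]
points = scaled_poll(x, 0.5, np.array([0.01, 1.0]), dirs)
```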

Direct search based on probabilistic descent

Direct-search methods are a class of popular derivative-free algorithms characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets which in turn must have at least n+1 vectors in an n-dimensional …
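
For contrast with the deterministic requirement of at least n+1 polling directions, the probabilistic-descent idea is to poll only a few random directions per iteration, which yield descent with sufficiently high probability. A minimal sketch with illustrative parameter choices (the forcing function `rho` and the direction distribution are assumptions, not the paper's exact choices):

```python
import numpy as np

def probabilistic_direct_search(f, x, alpha=1.0, m=2, max_iter=200,
                                rho=lambda a: 1e-4 * a ** 2):
    """Direct search polling m random unit directions per iteration
    instead of a full positive spanning set. Sketch only."""
    n = x.size
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for _ in range(m):
            d = np.random.randn(n)
            d /= np.linalg.norm(d)          # random unit direction
            trial = x + alpha * d
            ft = f(trial)
            if ft < fx - rho(alpha):        # sufficient decrease test
                x, fx = trial, ft
                alpha *= 2.0                # expand step on success
                improved = True
                break
        if not improved:
            alpha *= 0.5                    # contract step on failure
    return x, fx
```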

A globally convergent trust-region algorithm for unconstrained derivative-free optimization

In this work we present a derivative-free trust-region algorithm for unconstrained optimization based on the algorithm proposed by Powell (Computational Optimization and Applications 53: 527-555, 2012). The objective function is approximated by quadratic models obtained by polynomial interpolation. The number of points in the interpolation set is fixed. In each iteration only one interpolation point …
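
As background, a fully determined quadratic model in n variables interpolates f at (n+1)(n+2)/2 points; the algorithm above keeps that set fixed in size and swaps a single point per iteration. A sketch of the model-building step alone (the trust-region step and the point-replacement rule are omitted):

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_interpolation_model(points, fvals):
    """Interpolate f by a full quadratic model; `points` must contain
    exactly (n+1)(n+2)/2 well-poised points. Sketch only."""
    n = points.shape[1]
    def basis(x):
        feats = [1.0] + list(x)
        feats += [x[i] * x[j]
                  for i, j in combinations_with_replacement(range(n), 2)]
        return np.array(feats)
    M = np.array([basis(p) for p in points])
    coef = np.linalg.solve(M, fvals)    # fails if the set is not poised
    return lambda x: basis(np.asarray(x)) @ coef
```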

A derivative-free trust-funnel method for equality-constrained nonlinear optimization

In this work, we look into new derivative-free methods to solve equality-constrained optimization problems. Of particular interest are trust-region techniques, which have been investigated for the unconstrained and bound-constrained cases. We introduce a derivative-free adaptation of the trust-funnel method combined with a self-correcting geometry scheme and present some encouraging …
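
For context, the "funnel" in trust-funnel methods (due to Gould and Toint) is a monotonically tightening bound on the constraint violation: for the problem min f(x) s.t. c(x) = 0, every iterate x_k must satisfy roughly

\[
\|c(x_k)\| \le v_k^{\max}, \qquad v_{k+1}^{\max} \le v_k^{\max},
\]

with \(v_k^{\max}\) driven toward zero, so feasibility is enforced asymptotically without a penalty function or a filter. This is a sketch of the general mechanism, not the derivative-free adaptation introduced in the paper.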

A Linesearch-based Derivative-free Approach for Nonsmooth Optimization

In this paper, we propose new linesearch-based methods for nonsmooth optimization problems when first-order information on the problem functions is not available. In the first part, we describe a general framework for bound-constrained problems and analyze its convergence towards stationary points, using the Clarke-Jahn directional derivative. In the second part, we consider inequality constrained optimization …
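
Stationarity for nonsmooth functions is measured here through the Clarke-Jahn generalized directional derivative which, for a feasible set X, takes roughly the form

\[
f^{\circ}(x; d) \;=\; \limsup_{\substack{y \to x,\; y \in X \\ t \downarrow 0,\; y + t d \in X}} \frac{f(y + t d) - f(y)}{t},
\]

and a point x is Clarke-Jahn stationary when \(f^{\circ}(x; d) \ge 0\) for every direction d in the tangent cone to X at x.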

Convergence of trust-region methods based on probabilistic models

In this paper we consider the use of probabilistic or random models within a classical trust-region framework for optimization of deterministic smooth general nonlinear functions. Our method and setting differ from many stochastic optimization approaches in two principal ways. Firstly, we assume that the value of the function itself can be computed without noise, in …
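
One common way this literature formalizes "good enough" random models is to require them to be probabilistically fully linear: with probability at least p, conditioned on the history, the model \(m_k\) satisfies

\[
|f(y) - m_k(y)| \le \kappa_{ef}\,\delta_k^2
\quad\text{and}\quad
\|\nabla f(y) - \nabla m_k(y)\| \le \kappa_{eg}\,\delta_k
\qquad \text{for all } y \in B(x_k, \delta_k),
\]

where \(\delta_k\) is the trust-region radius. This is a sketch of the standard definition, not necessarily the exact condition used in the paper.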

Worst case complexity of direct search under convexity

In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same global rate or worst-case complexity bound as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, it will be shown that …
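
For reference, the gradient method's global rate on a smooth convex function is \(f(x_k) - f_* = \mathcal{O}(1/k)\); matching it means that direct search also needs on the order of \(1/\epsilon\) iterations to reach \(\epsilon\)-accuracy:

\[
f(x_k) - f_* \le \frac{C}{k}
\quad\Longrightarrow\quad
f(x_k) - f_* \le \epsilon \ \text{ after } \ k = \mathcal{O}(1/\epsilon) \text{ iterations},
\]

with the constant C (and the cost in function evaluations per iteration) carrying the dependence on the dimension n.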

A merit function approach for direct search

In this paper we propose to equip direct-search methods with a general procedure to minimize an objective function, possibly nonsmooth, without using derivatives and subject to constraints on the variables. We aim to handle constraints, most likely nonlinear or nonsmooth, for which the derivatives of the corresponding functions are also unavailable. The novelty of …
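
The abstract does not specify the merit function, but a classic example of the general device is the l1 penalty, which aggregates the objective and the constraint violation into a single quantity used to compare trial points. A generic sketch (the weight `mu` and the constraint convention g_i(x) <= 0 are assumptions for illustration, not the paper's choices):

```python
import numpy as np

def l1_merit(f, cons, mu):
    """Classic l1 penalty merit: objective plus weighted violation of
    inequality constraints g_i(x) <= 0. Generic illustration only."""
    return lambda x: f(x) + mu * sum(max(0.0, g(x)) for g in cons)

# Rank trial points by merit instead of by the objective alone
phi = l1_merit(lambda x: x[0] ** 2 + x[1] ** 2,
               [lambda x: 1.0 - x[0]],   # feasible iff x[0] >= 1
               mu=10.0)
best = min([np.array([0.0, 0.0]), np.array([1.0, 0.0])], key=phi)
```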

Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms

The mesh adaptive direct search (MADS) class of algorithms is designed for nonsmooth optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. Each iteration of a MADS algorithm attempts to improve the current best-known solution by launching the simulation at a finite number of trial points. Common implementations …
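
The truncation cuts off which common implementations are meant, but a standard evaluation-saving device in MADS implementations is the opportunistic strategy: evaluate the trial points one at a time and abandon the poll at the first improvement. A minimal sketch (the ordering of trial points, a natural lever for further savings, is left unspecified):

```python
def opportunistic_poll(f, x, fx, trial_points):
    """Evaluate trial points sequentially; stop at the first improvement,
    saving the remaining simulation calls. Sketch of a standard MADS
    implementation detail, not the paper's specific strategy."""
    for t in trial_points:
        ft = f(t)
        if ft < fx:
            return t, ft, True    # poll success: skip remaining points
    return x, fx, False           # complete (unsuccessful) poll
```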