Stochastic trust-region and direct-search methods: A weak tail bound condition and reduced sample sizing

Using tail bounds, we introduce a new probabilistic condition for function estimation in stochastic derivative-free optimization that reduces the required number of samples and eases algorithmic analyses. Moreover, we develop simple stochastic direct-search and trust-region methods for the optimization of a potentially non-smooth function whose values can only be estimated via stochastic …
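
As a point of reference for the claimed reduction in sample sizes, here is a minimal sketch of the classical sub-Gaussian (Hoeffding-type) sizing rule that such a weaker tail-bound condition would improve upon; the blackbox, noise scale, and all parameter names are hypothetical.

```python
import numpy as np

def estimate_f(blackbox, x, eps, delta, sigma, rng):
    """Sample-mean estimate of f(x), accurate to eps with probability >= 1 - delta.

    Classical sub-Gaussian (Hoeffding/Chernoff) sizing: with noise scale sigma,
    n >= (2 * sigma**2 / eps**2) * log(2 / delta) samples guarantee
    P(|mean - f(x)| > eps) <= delta.  A weaker tail-bound condition is designed
    to get by with fewer samples than this.
    """
    n = int(np.ceil(2.0 * sigma**2 / eps**2 * np.log(2.0 / delta)))
    return float(np.mean([blackbox(x, rng) for _ in range(n)])), n

# Hypothetical noisy blackbox: f(x) = ||x||^2 plus Gaussian noise of scale 0.1.
rng = np.random.default_rng(0)
noisy = lambda x, rng: float(x @ x) + rng.normal(0.0, 0.1)
fbar, n = estimate_f(noisy, np.array([1.0, 2.0]), eps=0.05, delta=0.01, sigma=0.1, rng=rng)
print(f"estimate {fbar:.3f} from n = {n} samples")   # n = 43 here
```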

Retraction based Direct Search Methods for Derivative Free Riemannian Optimization

Direct search methods represent a robust and reliable class of algorithms for solving black-box optimization problems. In this paper, we explore the application of these strategies to Riemannian optimization, wherein minimization is performed over variables restricted to lie on a manifold. More specifically, we consider classic and line-search extrapolated variants …
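
As a hedged illustration of the retraction idea, the sketch below runs a toy poll-only direct search on the simplest manifold, the unit sphere, where the metric-projection retraction is just renormalization; the tangent-basis construction, step-size rule, and eigenvalue test problem are illustrative choices, not the classic or extrapolated variants analyzed in the paper.

```python
import numpy as np

def retract_sphere(x, v):
    """Metric-projection retraction on the unit sphere: normalize x + v."""
    y = x + v
    return y / np.linalg.norm(y)

def tangent_basis(x):
    """Orthonormal basis of the tangent space {v : v @ x = 0} at unit x."""
    n = x.size
    # QR of [x | e_1 ... e_{n-1}]: the first column of Q is +/- x,
    # the remaining columns span the tangent space.
    q, _ = np.linalg.qr(np.column_stack([x, np.eye(n)[:, :n - 1]]))
    return q[:, 1:]

def ds_sphere(f, x, alpha=1.0, tol=1e-8, max_iter=500):
    """Toy retraction-based direct search on the sphere (poll step only)."""
    fx = f(x)
    for _ in range(max_iter):
        if alpha < tol:
            break
        B = tangent_basis(x)
        improved = False
        for d in np.column_stack([B, -B]).T:    # positive spanning set of T_x
            y = retract_sphere(x, alpha * d)
            fy = f(y)
            if fy < fx - 1e-4 * alpha**2:       # sufficient decrease
                x, fx, improved = y, fy, True
                break
        alpha = min(2.0 * alpha, 1.0) if improved else 0.5 * alpha
    return x, fx

# Smallest eigenvalue of a symmetric matrix as min of the Rayleigh quotient.
A = np.diag([3.0, 1.0, 2.0])
x0 = np.ones(3) / np.sqrt(3.0)
x_star, f_star = ds_sphere(lambda z: z @ A @ z, x0)
print(f_star)   # approaches 1.0, the smallest eigenvalue
```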

Expected complexity analysis of stochastic direct-search

This work presents the convergence rate analysis of stochastic variants of the broad class of direct-search methods of directional type. It introduces an algorithm designed to optimize differentiable objective functions $f$ whose values can only be computed through a stochastically noisy blackbox. The proposed stochastic directional direct-search (SDDS) algorithm accepts new iterates by imposing a …
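
Since the acceptance mechanism is the concrete ingredient here, the sketch below shows a bare-bones version of it: a trial point is accepted only when its noisy estimate beats the incumbent's estimate by a forcing term c * alpha**2. Direction generation and the control of estimate accuracy are simplified away, and all names and parameter values are illustrative.

```python
import numpy as np

def sdds_sketch(fhat, x0, alpha0=1.0, c=1e-2, tol=1e-6, max_iter=1000, seed=0):
    """Bare-bones stochastic directional direct search (illustrative only).

    fhat(x) returns a noisy estimate of f(x); a trial point is accepted only
    if its estimate improves on the incumbent's by c * alpha**2, a forcing
    function of the step size.
    """
    rng = np.random.default_rng(seed)
    x, alpha = np.asarray(x0, float), alpha0
    for _ in range(max_iter):
        if alpha <= tol:
            break
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)                         # random unit poll direction
        improved = False
        for trial in (x + alpha * d, x - alpha * d):
            if fhat(trial) < fhat(x) - c * alpha**2:   # sufficient decrease
                x, improved = trial, True
                break
        alpha = min(2 * alpha, alpha0) if improved else 0.5 * alpha
    return x

# Hypothetical noisy quadratic: estimates contaminated by Gaussian noise.
noise = np.random.default_rng(1)
fhat = lambda x: float(np.sum(x**2)) + 1e-3 * noise.normal()
print(sdds_sketch(fhat, np.array([2.0, -1.5])))   # drifts toward the origin
```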

Exploiting problem structure in derivative free optimization

A structured version of derivative-free random pattern search optimization algorithms is introduced, able to exploit the coordinate partially separable structure (typically associated with sparsity) often present in unconstrained and bound-constrained optimization problems. This technique improves performance by orders of magnitude and makes it possible to solve large problems that are otherwise intractable by …
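
The source of the speed-up can be made concrete: when f is a sum of element functions, each depending on only a few coordinates, a pattern-search move along one coordinate only requires re-evaluating the elements that touch that coordinate. A minimal sketch of this bookkeeping follows, with a hypothetical chained-quadratic objective; it is not the paper's algorithm, just the structure it exploits.

```python
import numpy as np

def make_sum(elements):
    """f(x) = sum_k f_k(x[idx_k]), each element touching few coordinates.

    elements is a list of (indices, callable) pairs.  When a move changes
    only coordinate j, only the elements whose index sets contain j need
    to be re-evaluated.
    """
    touching = {}
    for k, (idx, _) in enumerate(elements):
        for j in idx:
            touching.setdefault(j, []).append(k)

    def full_eval(x):
        return np.array([fk(x[idx]) for idx, fk in elements])

    def update(x_new, vals, j):
        """Refresh element values after a move along coordinate j only."""
        vals = vals.copy()
        for k in touching.get(j, []):
            idx, fk = elements[k]
            vals[k] = fk(x_new[idx])
        return vals

    return full_eval, update

# Hypothetical chained quadratic: n - 1 elements, each on two adjacent coords.
n = 1000
elements = [([i, i + 1], lambda z: (z[0] - z[1])**2 + z[0]**2) for i in range(n - 1)]
full_eval, update = make_sum(elements)
x = np.ones(n)
vals = full_eval(x)          # all n - 1 elements, evaluated once
x[5] += 0.1
vals = update(x, vals, 5)    # only elements 4 and 5 are re-evaluated
print(vals.sum())            # current value of f(x)
```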

Optimization of noisy blackboxes with adaptive precision

In derivative-free and blackbox optimization, the objective function is often evaluated through the execution of a computer program seen as a blackbox. It can be noisy, in the sense that its outputs are contaminated by random errors. Sometimes the source of these errors is identified and controllable, meaning that it is possible to …
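
A minimal sketch of what "identified and controllable" errors might look like: a hypothetical Monte-Carlo-style blackbox whose error standard deviation shrinks like 1/sqrt(n) in the number of samples n, polled by a toy direct search that requests more precision as its step size shrinks. The coupling n ~ 1/alpha**2 (so the estimation error tracks the step size) is one simple choice, not the paper's strategy.

```python
import numpy as np

def controllable_blackbox(x, n_samples, rng):
    """Monte-Carlo-style blackbox: the error standard deviation decays like
    1 / sqrt(n_samples), so precision is controllable at a computational cost."""
    noise = rng.normal(0.0, 1.0, size=n_samples).mean()
    return float(np.sum(x**2)) + noise

def adaptive_precision_ds(x0, alpha0=1.0, tol=1e-2, seed=0):
    """Toy coordinate direct search tying the requested precision to the step
    size: smaller steps demand more samples, hence tighter estimates."""
    rng = np.random.default_rng(seed)
    x, alpha = np.asarray(x0, float), alpha0
    D = np.vstack([np.eye(x.size), -np.eye(x.size)])   # coordinate poll set
    while alpha > tol:
        n = min(int(np.ceil(1.0 / alpha**2)), 10_000)  # target error ~ alpha
        fx = controllable_blackbox(x, n, rng)
        improved = False
        for d in D:
            if controllable_blackbox(x + alpha * d, n, rng) < fx - alpha**2:
                x, improved = x + alpha * d, True
                break
        alpha = min(2 * alpha, alpha0) if improved else 0.5 * alpha
    return x

print(adaptive_precision_ds(np.array([1.0, -2.0])))   # moves toward the origin
```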

On the use of polynomial models in multiobjective directional direct search

Polynomial interpolation or regression models are an important tool in derivative-free optimization, acting as surrogates for the true function. In this work, we propose the use of these models in the multiobjective framework of directional direct search, namely that of Direct Multisearch. Previously evaluated points are used to build quadratic polynomial models, which are …
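
As a hedged sketch of the surrogate-building step, the snippet below fits a quadratic model by least squares to previously evaluated points and returns it as a callable; the monomial basis, sample set, and test objective are illustrative, and none of the paper's model management or multiobjective machinery is reproduced.

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_model(X, f_vals):
    """Least-squares quadratic surrogate built from evaluated points.

    The basis holds 1, the x_i, and the products x_i * x_j, i.e.
    (n + 1)(n + 2) / 2 coefficients; with fewer points than that, lstsq
    returns a minimum-norm (regression) fit.
    """
    X = np.asarray(X, float)
    n = X.shape[1]
    monomials = [()] + [(i,) for i in range(n)] \
        + list(combinations_with_replacement(range(n), 2))
    basis = lambda x: np.array([np.prod(x[list(m)]) for m in monomials])
    Phi = np.array([basis(x) for x in X])
    coef, *_ = np.linalg.lstsq(Phi, np.asarray(f_vals, float), rcond=None)
    return lambda x: float(coef @ basis(np.asarray(x, float)))

# Fit to samples of a hypothetical objective and query the surrogate.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 2))
f = lambda x: (x[0] - 1.0)**2 + 2.0 * x[1]**2
m = quadratic_model(X, [f(x) for x in X])
print(m([0.5, 0.5]), f([0.5, 0.5]))   # surrogate vs. true value, ~0.75 each
```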

Direct search based on probabilistic feasible descent for bound and linearly constrained problems

Direct search is a methodology for derivative-free optimization whose iterations are characterized by evaluating the objective function using a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must somehow conform to the geometry of the feasible region and typically consist of positive generators of approximate tangent cones (which then …
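
For the special case of bound constraints, the conforming directions are easy to exhibit: the coordinate directions +/- e_i, dropping any that would step outside the box near an active bound. The sketch below is this much-simplified construction, under the assumption of a plain box; the paper's treatment of general linear constraints and probabilistic direction generation is not attempted.

```python
import numpy as np

def feasible_poll_directions(x, lower, upper, alpha):
    """Coordinate directions +/- e_i such that x + alpha * d stays in the box.

    For bound constraints the coordinate directions are positive generators
    of the tangent cone; near an active bound the infeasible one is dropped.
    """
    dirs = []
    for i in range(x.size):
        e = np.zeros(x.size)
        e[i] = 1.0
        if x[i] + alpha <= upper[i]:
            dirs.append(e)
        if x[i] - alpha >= lower[i]:
            dirs.append(-e)
    return dirs

x = np.array([0.0, 0.95])
lower, upper = np.zeros(2), np.ones(2)
for d in feasible_poll_directions(x, lower, upper, alpha=0.1):
    print(d)   # -e_0 and +e_1 are dropped: both would leave the box
```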

MultiGLODS: Global and Local Multiobjective Optimization using Direct Search

The optimization of multimodal functions is a challenging task, in particular when derivatives are not available. Recently, in a directional direct search framework, a clever multistart strategy was proposed for the global derivative-free optimization of single-objective functions. The goal of the current work is to generalize this approach to the computation of global …
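
The common bookkeeping step in any multiobjective extension is maintaining a list of nondominated points; a minimal sketch of that filter follows (the multistart, merging, and step-size logic of the actual method is not attempted here).

```python
import numpy as np

def nondominated(points):
    """Keep only the nondominated objective vectors (minimization).

    q dominates p if q <= p componentwise with at least one strict
    inequality; the survivors approximate a Pareto front.
    """
    P = np.asarray(points, float)
    keep = []
    for i, p in enumerate(P):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(P) if j != i)
        if not dominated:
            keep.append(p)
    return np.array(keep)

pts = [[1.0, 3.0], [2.0, 2.0], [3.0, 1.0], [2.5, 2.5]]
print(nondominated(pts))   # [2.5, 2.5] is dominated by [2.0, 2.0]
```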

BFO, a trainable derivative-free Brute Force Optimizer for nonlinear bound-constrained optimization and equilibrium computations with continuous and discrete variables

A direct-search derivative-free Matlab optimizer for bound-constrained problems is described, whose remarkable features are its ability to handle a mix of continuous and discrete variables, a versatile interface, and a novel self-training option. Its performance compares favourably with that of NOMAD, a state-of-the-art package. It is also applicable to multilevel equilibrium- or …

A second-order globally convergent direct-search method and its worst-case complexity

Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate the steepest …
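
The second-order ingredient can be made concrete with a central difference along symmetric polling directions +/- d: (f(x + alpha d) - 2 f(x) + f(x - alpha d)) / alpha**2 estimates the curvature d' H d, so a negative value flags a direction of negative curvature. A minimal sketch on a hypothetical saddle, where first-order information alone is blind:

```python
import numpy as np

def curvature_along(f, x, d, alpha):
    """Central-difference estimate of d' H d along a unit direction d;
    negative values expose directions of negative curvature."""
    return (f(x + alpha * d) - 2.0 * f(x) + f(x - alpha * d)) / alpha**2

# At the saddle of f(x, y) = x^2 - y^2 the gradient vanishes, yet polling
# along the coordinate directions reveals the escape direction.
f = lambda z: z[0]**2 - z[1]**2
x = np.zeros(2)
print(curvature_along(f, x, np.array([1.0, 0.0]), 0.1))   # ~ +2
print(curvature_along(f, x, np.array([0.0, 1.0]), 0.1))   # ~ -2
```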