Escaping local minima with derivative-free methods: a numerical investigation

We apply a state-of-the-art, local derivative-free solver, Py-BOBYQA, to global optimization problems, and propose an algorithmic improvement that is beneficial in this context. Our numerical findings are illustrated on a commonly used test set of global optimization problems and associated noisy variants, and on hyperparameter tuning for a machine learning test set. As Py-BOBYQA is a model-based trust-region method, we compare mostly (but not exclusively) with other global optimization methods for which (global) models are important, such as Bayesian optimization and response surface methods; we also consider state-of-the-art representative deterministic and stochastic codes, such as DIRECT and CMA-ES. We find Py-BOBYQA to be competitive with global solvers that carry provable guarantees of finding global optima, across all accuracy/budget regimes, in both smooth and noisy settings. In particular, Py-BOBYQA variants are among the best-performing solvers for smooth and multiplicative-noise problems in high-accuracy regimes. As a by-product, some preliminary conclusions can be drawn on the relative performance of the global solvers we have tested with default settings.
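As a minimal sketch of the kind of usage the report investigates, the snippet below calls Py-BOBYQA's documented solve interface with its global-minimum-seeking restarts option enabled. The Rastrigin objective, starting point, bounds, and budget are illustrative choices, not taken from the report.

```python
import numpy as np
import pybobyqa

# Illustrative multimodal test objective (2D Rastrigin); global minimum f=0 at the origin.
def rastrigin(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

x0 = np.array([3.0, 4.0])            # start deliberately far from the global minimum
lower = np.array([-5.12, -5.12])     # standard Rastrigin box bounds (illustrative)
upper = np.array([5.12, 5.12])

# seek_global_minimum=True turns on the multiple-restarts heuristic for escaping
# local minima; for noisy objectives one would also pass objfun_has_noise=True.
soln = pybobyqa.solve(rastrigin, x0, bounds=(lower, upper),
                      seek_global_minimum=True, maxfun=500)

print(soln.x, soln.f)
```

With restarts disabled, a local trust-region method started at such a point would typically stall in a nearby local minimum of the Rastrigin landscape; the restarts option trades extra evaluations for a chance to escape it.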

Citation

Technical Report, Mathematical Institute, University of Oxford.
