Efficient global unconstrained black box optimization

For the unconstrained optimization of black box functions, this paper introduces a new randomized algorithm called VRBBO. In practice, VRBBO matches the quality of other state-of-the-art algorithms in finding, in small and large dimensions, a local minimizer with reasonable accuracy. Although our theory guarantees only local minimizers, our heuristic techniques turn VRBBO into an efficient global solver. In very thorough numerical experiments, we found in most cases either a global minimizer or, where this could not be checked, at least a point of quality comparable to that found by the best competing global solvers. For smooth, everywhere defined functions, it is proved that, with probability arbitrarily close to 1, a basic version of our algorithm finds, with $O(n\epsilon^{-2})$ function evaluations, a point whose unknown exact gradient 2-norm is below a given threshold $\epsilon>0$, where $n$ is the dimension. In the smooth convex case, this number improves to $O(n\log \epsilon^{-1})$, and in the smooth strongly convex case to $O(n\log n\epsilon^{-1})$. This matches known recent complexity results for reaching a slightly different goal, namely that the expected exact gradient 2-norm is below a given threshold $\epsilon>0$. Numerical results show that VRBBO is effective and robust in comparison with state-of-the-art local and global solvers on the unconstrained CUTEst test problems of Gould et al. for optimization and the test problems of Jamil and Yang for global optimization, with 2 to 5000 variables.
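For illustration only, the sketch below shows a generic random-direction line search of the kind studied in this line of work: draw a random unit direction, try a step of size sigma in both orientations, and expand or shrink sigma depending on success. It is not the VRBBO algorithm itself; the function name `random_direction_search`, the step-size update factors, and the stopping rule are assumptions made for this example.

```python
import numpy as np

def random_direction_search(f, x0, max_nf=10_000, sigma0=1.0, rng=None):
    """Minimal sketch of a randomized direction search for black-box minimization.

    Draws a random unit direction, tries a step of size sigma along +/- that
    direction, and expands or shrinks sigma depending on whether the step
    improved the objective.  This is a generic illustration of the class of
    methods discussed above, not the VRBBO algorithm; all names, update
    factors, and tolerances here are assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    nf, sigma = 1, sigma0
    while nf < max_nf and sigma > 1e-12:
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)                # random unit direction
        improved = False
        for s in (sigma, -sigma):             # try both orientations
            xt = x + s * d
            ft = f(xt)
            nf += 1
            if ft < fx:
                x, fx, improved = xt, ft, True
                break
        sigma = 2.0 * sigma if improved else 0.5 * sigma  # expand/shrink step
    return x, fx, nf

# Usage example on a simple quadratic test function
if __name__ == "__main__":
    f = lambda x: np.sum((x - 1.0) ** 2)
    x_best, f_best, nf = random_direction_search(f, np.zeros(5))
    print(x_best, f_best, nf)
```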
