Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers

We present DFO-LS, a software package for derivative-free optimization (DFO) of nonlinear least-squares (LS) problems, with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear regression models for the residuals. DFO-LS allows flexible initialization for expensive problems, whereby it can begin making progress after as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems with fewer objective evaluations than are needed for a single gradient evaluation. DFO-LS has improved robustness to noise, allowing sample averaging, regression-based model construction, and multiple restart strategies with an auto-detection mechanism. Our extensive numerical experimentation shows that restarting the solver when stagnation is detected is a cheap and effective mechanism for achieving robustness, with superior performance over both sampling and regression techniques. We also present our package Py-BOBYQA, a Python implementation of BOBYQA (Powell, 2009), which also implements these noise-robustness strategies. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers for noisy problems. In our comparisons, we introduce a new adaptive accuracy measure for the data profiles of noisy functions that strikes a balance between measuring improvement in the true and the noisy objective.
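As a concrete illustration of the solver interface described above, the sketch below minimizes a small noisy least-squares problem with DFO-LS. It is a minimal example following the documented dfols.solve interface; the residual function, noise level, bounds, and evaluation budget are illustrative assumptions, not taken from the paper. Setting objfun_has_noise=True enables the noise-robust defaults, including the restart mechanism.

import numpy as np
import dfols

rng = np.random.default_rng(0)

def residuals(x):
    # Residuals of fitting exp(-a*t) + b to data generated from exp(-1.5*t),
    # perturbed by additive Gaussian noise (illustrative toy problem).
    t = np.linspace(0.0, 1.0, 10)
    data = np.exp(-1.5 * t)
    return np.exp(-x[0] * t) + x[1] - data + 1e-3 * rng.standard_normal(t.shape)

x0 = np.array([1.0, 0.0])                                # starting point
bounds = (np.array([0.0, -1.0]), np.array([5.0, 1.0]))   # optional bound constraints

# objfun_has_noise=True switches on the noise-robust defaults
# (e.g. restarting when stagnation is detected).
soln = dfols.solve(residuals, x0, bounds=bounds,
                   objfun_has_noise=True, maxfun=200)
print(soln)

The analogous call for a general scalar objective f is pybobyqa.solve(f, x0, objfun_has_noise=True), which uses Py-BOBYQA's corresponding noise-robustness strategies.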

Citation

Technical Report, Mathematical Institute, Oxford University, 2018.
