This paper addresses nonconvex derivative-free optimization problems in which only evaluations of smooth objective functions, or of their noisy approximations, are available. General derivative-free methods are proposed for minimizing differentiable (not necessarily convex) functions with globally Lipschitz continuous gradients, where the accuracy of the approximate gradients is coupled with the stepsizes and the exact gradient values. Analysis in the noiseless case guarantees convergence of the gradient sequence to the origin as well as global convergence, with constructive convergence rates, of the sequence of iterates under the Kurdyka-\L ojasiewicz property. In the noisy case, without requiring any information on the noise level, the designed algorithms reach near-stationary points, and estimates are provided for the required numbers of iterations and function evaluations. For functions with locally Lipschitzian gradients, two algorithms are introduced to handle the noiseless and noisy cases, respectively. The noiseless version is based on the standard backtracking linesearch and achieves fundamental convergence properties similar to those in the globally Lipschitzian case. The noisy version is based on a novel bidirectional linesearch and is shown to reach near-stationary points after a finite number of iterations when the Polyak-\L ojasiewicz inequality is imposed. Numerical experiments on a diverse set of test problems demonstrate the greater robustness of the newly proposed algorithms in comparison with other finite-difference-based schemes and with highly efficient, production-ready codes from the SciPy library.
Globally Convergent Derivative-Free Methods in Nonconvex Optimization with and without Noise
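To illustrate the general idea described in the abstract, the following is a minimal Python sketch, not the authors' exact method, of a finite-difference-based descent scheme that combines forward-difference gradient approximations with a standard Armijo backtracking linesearch for a smooth objective f. All function names, parameter values (difference parameter h, Armijo constant sigma, reduction factor beta, tolerance tol), and the test function are illustrative assumptions, not taken from the paper.

import numpy as np

def fd_gradient(f, x, h):
    # Forward-difference approximation of the gradient of f at x.
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def df_descent(f, x0, h=1e-6, sigma=1e-4, beta=0.5, tol=1e-5, max_iter=1000):
    # Derivative-free descent: finite-difference gradients plus backtracking.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = fd_gradient(f, x, h)
        if np.linalg.norm(g) <= tol:          # approximate stationarity test
            break
        t = 1.0
        # Armijo backtracking along the negative approximate gradient direction
        while t > 1e-12 and f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Example usage on the smooth, nonconvex Rosenbrock function (an illustrative test problem).
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_star = df_descent(rosen, np.array([-1.2, 1.0]))

This sketch only reflects the standard backtracking variant for the noiseless case; the paper's noisy-case algorithms and the bidirectional linesearch require additional mechanisms not shown here.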