We consider the problem of optimizing the sum of a smooth, nonconvex function for which derivatives are unavailable, and a convex, nonsmooth function with an easy-to-evaluate proximal operator. Of particular focus is the case where the smooth part has a nonlinear least-squares structure. We adapt two existing approaches for derivative-free optimization of nonsmooth compositions of smooth functions to this setting. Our main contribution is adapting our algorithm to handle inexactly computed stationary measures, with the inexactness adjusted adaptively as required by the algorithm; previous approaches assumed access to exact stationary measures, which is not realistic in this setting. Numerically, we provide two extensions of the state-of-the-art DFO-LS solver for nonlinear least-squares problems and demonstrate their strong practical performance.
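For concreteness, the problem class described above can be sketched as follows; the notation ($\Phi$, $f$, $h$, $r$) is illustrative and not necessarily that used in the paper:
\[
  \min_{x \in \mathbb{R}^n} \; \Phi(x) := f(x) + h(x),
  \qquad
  f(x) = \frac{1}{2}\,\|r(x)\|_2^2 = \frac{1}{2}\sum_{i=1}^{m} r_i(x)^2,
\]
where the residuals $r_i$ are smooth but only evaluations of $r$ (not its derivatives) are available, and $h$ is convex, possibly nonsmooth, with an easy-to-evaluate proximal operator
\[
  \operatorname{prox}_{\alpha h}(y) := \arg\min_{z \in \mathbb{R}^n} \Big\{ h(z) + \frac{1}{2\alpha}\,\|z - y\|_2^2 \Big\}, \qquad \alpha > 0.
\]
A standard example is the regularizer $h(x) = \lambda \|x\|_1$, whose proximal operator is the componentwise soft-thresholding map.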