In this work we present TRFD, a derivative-free trust-region method based on finite differences for minimizing composite functions of the form \(f(x)=h(F(x))\), where \(F\) is a black-box function assumed to have a Lipschitz continuous Jacobian, and \(h\) is a known convex Lipschitz function, possibly nonsmooth. The method approximates the Jacobian of \(F\) via forward finite differences. We establish an upper bound on the number of evaluations of \(F\) that TRFD requires to find an \(\epsilon\)-approximate stationary point. For L1 and Minimax problems, we show that our complexity bound reduces to \(\mathcal{O}(n\epsilon^{-2})\) for specific instances of TRFD, where \(n\) is the number of variables. Assuming that \(h\) is monotone and that the components of \(F\) are convex, we also establish a worst-case complexity bound, which reduces to \(\mathcal{O}(n\epsilon^{-1})\) for Minimax problems. Numerical results are provided to illustrate the relative efficiency of TRFD compared with existing derivative-free solvers for composite nonsmooth optimization.
TRFD: A derivative-free trust-region method based on finite differences for composite nonsmooth optimization
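As a minimal illustration of the forward finite-difference Jacobian approximation described above (a sketch under stated assumptions, not the paper's implementation of TRFD; the function name and the fixed step size \(t\) are illustrative choices), each column of the Jacobian is estimated as \((F(x+t e_i)-F(x))/t\), at a cost of \(n\) extra evaluations of \(F\) per approximation — consistent with the factor \(n\) appearing in the complexity bounds:

```python
import numpy as np

def forward_difference_jacobian(F, x, t=1e-8):
    """Approximate the Jacobian of F: R^n -> R^m at x via forward differences.

    Costs n additional evaluations of F beyond F(x) itself.
    The step size t is a hypothetical fixed choice for illustration.
    """
    x = np.asarray(x, dtype=float)
    Fx = np.asarray(F(x), dtype=float)
    n = x.size
    J = np.empty((Fx.size, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = t
        # i-th column: directional difference along the i-th coordinate
        J[:, i] = (np.asarray(F(x + e), dtype=float) - Fx) / t
    return J
```

For a Jacobian that is Lipschitz continuous, as assumed for \(F\), the approximation error of each column is \(\mathcal{O}(t)\), which is the property the method's analysis relies on.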