On the Complexity of Lower-Order Implementations of Higher-Order Methods

In this work, we propose a method for minimizing non-convex functions with Lipschitz continuous \(p\)th-order derivatives, for any \(p \geq 1\). The method requires derivative information only up to order \((p-1)\), since the \(p\)th-order derivatives are approximated via finite differences. To keep the oracle cost low, instead of computing a finite-difference approximation at every iteration, we reuse each approximation for \(m\) consecutive iterations before recomputing it, where \(m \geq 1\) is a key parameter. The result is an adaptive method of order \((p-1)\) that requires no more than \(O(\epsilon^{-\frac{p+1}{p}})\) iterations to find an \(\epsilon\)-approximate stationary point of the objective function and that, for the choice \(m = (p-1)n + 1\), where \(n\) is the problem dimension, makes no more than \(O(n^{1/p}\epsilon^{-\frac{p+1}{p}})\) calls to the oracle of order \((p-1)\). This improves the previously known dimension dependence of tensor methods with finite-difference approximations.
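To make the reuse scheme concrete in the simplest nontrivial case \(p = 2\) (a first-order implementation of a second-order method), the sketch below approximates the Hessian by finite differences of the gradient and recomputes it only every \(m = n + 1\) iterations. The fixed regularization parameter \(\sigma\) (`sigma` in the code), the quadratically regularized step, and the test problem are illustrative assumptions, not the paper's adaptive procedure.

```python
# A minimal sketch of the finite-difference reuse scheme for p = 2: a method
# that only calls the first-order oracle (gradients) and approximates the
# Hessian by forward differences of the gradient, reusing each approximation
# for m consecutive iterations. The fixed sigma, the step rule, and the test
# problem are illustrative assumptions, not the paper's adaptive procedure.
import numpy as np

def fd_hessian(grad, x, h=1e-5):
    """Approximate the Hessian at x using n extra gradient calls."""
    n = x.size
    g0 = grad(x)
    H = np.empty((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        H[:, i] = (grad(x + e) - g0) / h  # i-th column ~ d(grad)/dx_i
    return 0.5 * (H + H.T)  # symmetrize the approximation

def reuse_method(grad, x, m, sigma=10.0, eps=1e-6, max_iter=1000):
    """Recompute the FD Hessian only every m iterations; in between, take
    regularized Newton steps with the stale approximation."""
    H = None
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:  # eps-approximate stationary point
            return x, k
        if k % m == 0:                # refresh the approximation
            H = fd_hessian(grad, x)
        # Regularized Newton step: solve (H + sigma I) s = -g
        s = np.linalg.solve(H + sigma * np.eye(x.size), -g)
        x = x + s
    return x, max_iter

if __name__ == "__main__":
    # Non-convex test function f(x) = sum(x_i**4 / 4 - x_i**2 / 2)
    grad = lambda x: x**3 - x
    x0 = 2.0 * np.ones(3)
    n = x0.size
    x_star, iters = reuse_method(grad, x0, m=(2 - 1) * n + 1)  # m = n + 1
    print(iters, np.linalg.norm(grad(x_star)))
```

Note that in this sketch each Hessian refresh costs \(n\) extra gradient calls, so with \(m = n + 1\) the amortized overhead is at most one extra gradient call per iteration.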