In this work we consider unconstrained optimization problems. The objective
function is known through a zeroth-order stochastic oracle that gives an estimate
of the true objective function. To solve these problems, we propose a derivative-free
algorithm based on extrapolation techniques. Under reasonable assumptions,
we prove convergence properties for the proposed algorithm. Furthermore,
we give a worst-case complexity result stating that the total number of
iterations at which the expected value of the norm of the objective function gradient
exceeds a prescribed \(\epsilon > 0\) is \({\cal O}(n^2\epsilon^{-2}/\beta^2)\).
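
To make the oracle access model concrete, below is a minimal Python sketch of the setting the abstract describes: a zeroth-order stochastic oracle that returns only noisy function values, paired with a finite-difference gradient estimate and a plain descent step. This illustrates the oracle model only, not the extrapolation-based algorithm proposed in the paper; the names (`stochastic_zeroth_order_oracle`, `fd_gradient_estimate`) and parameters (`noise_std`, `h`) are hypothetical.

```python
import numpy as np

def stochastic_zeroth_order_oracle(f, x, rng, noise_std=1e-3):
    """Return a noisy estimate of f(x); the true objective is never
    observed directly, only through samples like this one."""
    return f(x) + noise_std * rng.standard_normal()

def fd_gradient_estimate(f, x, rng, h=1e-2, noise_std=1e-3):
    """Forward finite-difference gradient estimate built from n + 1
    oracle calls (n = dimension); this per-iteration dimensional cost
    is one source of the n^2 factor seen in bounds of this type."""
    n = x.size
    g = np.empty(n)
    fx = stochastic_zeroth_order_oracle(f, x, rng, noise_std)
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        fxh = stochastic_zeroth_order_oracle(f, x + h * e, rng, noise_std)
        g[i] = (fxh - fx) / h
    return g

# Toy usage on a quadratic objective: the iterate norm shrinks toward
# a noise floor determined by noise_std and h.
rng = np.random.default_rng(0)
f = lambda x: 0.5 * np.dot(x, x)
x = rng.standard_normal(5)
for _ in range(200):
    x = x - 0.1 * fd_gradient_estimate(f, x, rng)
print(np.linalg.norm(x))
```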
Citation
arXiv:2508.00495 [math.OC], DOI: 10.48550/arXiv.2508.00495