A stochastic adaptive regularization algorithm allowing random noise in derivatives and inexact function values is proposed for computing strong approximate minimizers of any order for inexpensively constrained smooth optimization problems. For an objective function with Lipschitz continuous $p$-th derivative in a convex neighbourhood of the feasible set and given an arbitrary optimality order $q$, it is shown that this algorithm will, in expectation, compute such a point in at most $O\big[\big(\min_{1\le j\le q}\epsilon_j\big)^{-(p+1)/(p-q+1)}\big]$ inexact evaluations of $f$ and its derivatives whenever $q=1$ and the feasible set is convex, or $q=2$ and the problem is unconstrained, where $\epsilon_j$ is the tolerance for $j$-th order accuracy. This bound becomes at most $O\big[\big(\min_{1\le j\le q}\epsilon_j\big)^{-q(p+1)/p}\big]$ inexact evaluations in the other cases if all derivatives are Lipschitz continuous. Moreover, these bounds are sharp in the order of the accuracy tolerances.
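To make the first bound concrete, here is a brief worked instantiation (the particular values $p=2$ and $q=1$ are chosen for illustration and are not singled out in the abstract): with a Lipschitz continuous second derivative ($p=2$) and first-order optimality ($q=1$), the exponent $(p+1)/(p-q+1)$ evaluates to $3/2$, so the bound reduces to the familiar $O(\epsilon_1^{-3/2})$ evaluation-complexity order known for adaptive cubic regularization.
\[
  \left(\min_{1\le j\le 1}\epsilon_j\right)^{-\frac{p+1}{p-q+1}}
  = \epsilon_1^{-\frac{2+1}{2-1+1}}
  = \epsilon_1^{-3/2},
  \qquad\text{i.e., at most } O\!\left(\epsilon_1^{-3/2}\right) \text{ inexact evaluations.}
\]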
arXiv:2005.04639