Sequential test sampling for stochastic derivative-free optimization

In many derivative-free optimization algorithms, a sufficient decrease condition decides
whether to accept a trial step at each iteration. This condition typically requires that the
potential decrease of the trial step, i.e., the true reduction in the objective function value
that would be achieved by moving from the current point to the trial point, be larger than
a multiple of the squared stepsize. When the objective function is stochastic, accurately
evaluating such a condition can incur a large estimation cost.
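To make the condition concrete, it can be stated in symbols as follows (the notation here is generic and assumed for illustration, not taken from the paper):

```latex
% Sufficient decrease condition (illustrative notation):
% accept the trial point x_k + \delta_k d_k if and only if
f(x_k) - f(x_k + \delta_k d_k) \;\ge\; c\,\delta_k^2, \qquad c > 0,
% where \delta_k is the stepsize and d_k a (unit) polling direction.
```

When f can only be observed with noise, the left-hand side must be estimated from samples, which is the estimation cost the paper addresses.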

In this paper, we frame the evaluation of the sufficient decrease condition in a stochastic
setting as a hypothesis test problem and solve it through a sequential hypothesis test. The
two hypotheses considered in the problem correspond to accepting or rejecting the trial step.
This test sequentially collects noisy sample observations of the potential decrease until their
sum crosses either a lower or an upper boundary, both of which depend on the noise variance
and the stepsize. When the observation noise is Gaussian, we derive a novel sample-size result,
showing that the effort to evaluate the condition depends explicitly on the potential decrease,
and that the sequential test terminates early whenever the sufficient decrease condition is
far from being satisfied. Furthermore, when the potential decrease is Theta(delta^r) for some r in (0, 2], the expected sample size decreases from Theta(delta^{-4}) to O(delta^{-2-r}).
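A sequential test of this kind can be sketched as follows. This is an illustrative sketch only, not the paper's exact procedure: the boundary width and its dependence on the noise variance and stepsize are placeholder choices.

```python
def sequential_decrease_test(sample_decrease, sigma, delta, c=1.0,
                             max_samples=10_000):
    """Sequentially test a stochastic sufficient decrease condition.

    Illustrative sketch: accumulate noisy observations of the potential
    decrease until the running sum drifts far enough above or below the
    target n * c * delta^2. The half-width used here (scaling with the
    noise variance sigma^2 over the target) is a hypothetical choice,
    not the boundary derived in the paper.
    """
    threshold = c * delta ** 2            # sufficient decrease target
    half_width = 2.0 * sigma ** 2 / threshold
    s = 0.0
    for n in range(1, max_samples + 1):
        s += sample_decrease()            # one noisy observation
        if s - n * threshold >= half_width:
            return True, n                # accept the trial step
        if s - n * threshold <= -half_width:
            return False, n               # reject the trial step
    # Fallback decision if neither boundary was crossed:
    return s >= max_samples * threshold, max_samples
```

Note how a decrease far above (or far below) the threshold pushes the sum across a boundary after very few samples, which is the early-termination behavior described above.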

We apply this sequential test sampling framework to probabilistic-descent direct search.
To analyze its convergence rate, we extend a renewal-reward, supermartingale-based
convergence rate analysis framework to arbitrary probability thresholds. By doing so, we
show that probabilistic-descent direct search has an iteration complexity of O(n/epsilon^2) for driving the gradient norm below epsilon. Our numerical experiments indicate the superiority of sequential hypothesis
testing over fixed sampling for evaluating stochastic sufficient decrease conditions.
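For context, the overall direct-search loop being analyzed can be sketched as below. This is a generic sketch of probabilistic-descent direct search with illustrative constants; for simplicity it estimates the decrease by fixed-size averaging, standing in for the sequential test discussed above.

```python
import math
import random

def probabilistic_descent_search(f_noisy, x0, delta0=1.0, c=1e-3,
                                 gamma=2.0, theta=0.5, n_iters=200,
                                 n_samples=30):
    """Sketch of probabilistic-descent direct search on a noisy objective.

    Illustrative only: constants (c, gamma, theta) and the fixed-sample
    decrease estimate are placeholder choices, not the paper's algorithm.
    """
    x, delta = list(x0), delta0
    n = len(x)
    for _ in range(n_iters):
        # Draw one random unit polling direction.
        d = [random.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(di * di for di in d))
        d = [di / norm for di in d]
        trial = [xi + delta * di for xi, di in zip(x, d)]
        # Estimate the potential decrease by averaging noisy samples.
        est = sum(f_noisy(x) - f_noisy(trial)
                  for _ in range(n_samples)) / n_samples
        if est > c * delta ** 2:          # sufficient decrease holds
            x, delta = trial, gamma * delta
        else:
            delta *= theta                # shrink the stepsize
    return x
```

Replacing the fixed-size average with the sequential test lets the per-iteration sampling effort adapt to how far the condition is from being satisfied, which is the source of the savings reported in the paper's experiments.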

Citation

A. Ding, F. Rinaldi, and L. N. Vicente, Sequential test sampling for stochastic derivative-free optimization, ISE Technical Report 25T-016, Lehigh University
