Level bundle methods for oracles with on-demand accuracy

For nonsmooth convex optimization, we consider level bundle methods built with an oracle that computes values for the objective function and a subgradient at any given feasible point. For the problems of interest, exact oracle information is computable but difficult to obtain. In order to save computational effort, the oracle can provide estimates with an accuracy that depends on two additional parameters, passed to the oracle together with the evaluation point. The first of these parameters is a descent target, while the second is a bound on the inexactness. If the oracle can reach the target with its function estimate, then the corresponding error is bounded by the second parameter. Otherwise, if the oracle detects that the target cannot be met, the function and subgradient estimates may be rough and of unknown accuracy. The considered methods drive the inexactness parameter to zero, thus ensuring that an exact solution to the optimization problem is found asymptotically. The approach is comprehensive and covers known exact and inexact level methods, as well as some novel variants that can handle inaccuracy in a variable manner. In particular, when the feasible set is also compact, some of the new on-demand accuracy methods have the same rate of convergence as exact level variants known in the literature. A numerical benchmark on a battery of two-stage stochastic linear programs shows the interest of the approach: it is substantially faster than the L-shaped method, without any loss of accuracy.
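
The following is a minimal, hypothetical sketch (not the paper's implementation) of the oracle interface described above: the oracle receives the evaluation point together with a descent target and an inexactness bound, and signals whether the target was reached. The function name `on_demand_oracle`, the toy quadratic objective, and the way rough estimates are produced are all illustrative assumptions.

```python
import numpy as np

def on_demand_oracle(x, f_target, tol, rng=np.random.default_rng(0)):
    """Toy oracle with on-demand accuracy (illustrative only).

    Inputs: evaluation point x, descent target f_target, inexactness bound tol.
    Contract sketched in the abstract:
      - if the estimated value reaches the target, the returned value is
        within tol of the exact one;
      - otherwise the oracle reports that the target was missed, and the
        returned estimates may be rough, with unknown accuracy.
    The underlying function here is simply f(x) = ||x||^2 with gradient 2x,
    so exact values are available to emulate the inexact behaviour.
    """
    f_exact = float(np.dot(x, x))
    g_exact = 2.0 * np.asarray(x, dtype=float)

    if f_exact <= f_target:
        # Target reachable: return an estimate whose error is bounded by tol.
        noise = rng.uniform(-tol, 0.0)   # keep the estimate a lower estimate
        return f_exact + noise, g_exact, True
    else:
        # Target missed: estimates may be rough; only the flag is reliable.
        rough = 0.5 * f_exact            # arbitrary coarse estimate
        return rough, g_exact, False


# Example call, as it might appear inside a (hypothetical) level bundle iteration:
x_trial = np.array([1.0, -2.0])
f_est, g_est, target_met = on_demand_oracle(x_trial, f_target=3.0, tol=1e-3)
print(target_met, f_est)
```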

Citation

To appear in Optimization Methods & Software, 2013.