Probabilistic Guarantees in Robust Optimization

We develop a general methodology to derive probabilistic guarantees for solutions of robust optimization problems. Our analysis applies broadly to any convex compact uncertainty set and to any constraint affected by uncertainty in a concave manner, under minimal assumptions on the underlying stochastic process. Namely, we assume that the coordinates of the noise vector are light-tailed (sub-Gaussian) but not necessarily independent. We introduce the notion of robust complexity of an uncertainty set, a robust analog of the Rademacher or Gaussian complexity encountered in high-dimensional statistics, which connects the geometry of the uncertainty set to the a priori probabilistic guarantee. Interestingly, the robust complexity involves the support function of the uncertainty set, which also plays a crucial role in the robust counterpart theory for robust linear and nonlinear optimization. For a variety of uncertainty sets of practical interest, we compute the robust complexity in closed form or derive valid approximations. To the best of our knowledge, our methodology recovers and extends all the results available in the literature. We also derive improved a posteriori bounds, i.e., significantly tighter bounds that depend on the resulting robust solution.
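As a brief illustration of the role of the support function (this is a standard fact from robust counterpart theory, not a result specific to this paper): for a linear constraint $(a^0 + A\zeta)^\top x \le b$ required to hold for all $\zeta$ in a convex compact uncertainty set $\mathcal{U}$, the robust counterpart can be written via the support function as

\[ (a^0)^\top x + \delta^*\!\left(A^\top x \mid \mathcal{U}\right) \le b, \qquad \text{where } \delta^*(y \mid \mathcal{U}) = \sup_{\zeta \in \mathcal{U}} y^\top \zeta. \]

For instance, for the ellipsoidal set $\mathcal{U} = \{\zeta : \|\zeta\|_2 \le \rho\}$ one has $\delta^*(y \mid \mathcal{U}) = \rho \|y\|_2$, so the robust counterpart becomes the second-order cone constraint $(a^0)^\top x + \rho \|A^\top x\|_2 \le b$.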

Citation

Dimitris Bertsimas, Dick den Hertog, and Jean Pauphilet. Probabilistic Guarantees in Robust Optimization. SIAM Journal on Optimization, 31(4):2893-2920, 2021.
