Sinkhorn Distributionally Robust Optimization

We study distributionally robust optimization (DRO) with the Sinkhorn distance---a variant of the Wasserstein distance based on entropic regularization. We provide a convex programming dual reformulation for a general nominal distribution. Compared with Wasserstein DRO, the formulation is computationally tractable for a larger class of loss functions, and its worst-case distribution is more reasonable. We propose an efficient first-order algorithm with bisection search to solve the dual reformulation. We demonstrate that our proposed algorithm finds a δ-optimal solution of the new DRO formulation with computation cost Õ(1/δ³) and memory cost Õ(1/δ²), and the computation cost further improves to Õ(1/δ²) when the loss function is smooth. Finally, we provide various numerical examples using both synthetic and real data to demonstrate its competitive performance and low computational cost.
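The abstract refers to a dual reformulation that is minimized over a scalar dual variable λ via bisection search. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's algorithm: it assumes a Gaussian reference kernel around each data point, approximates the inner expectation by Monte Carlo sampling, and bisects on the sign of a finite-difference slope of the dual objective in λ. The function names (`sinkhorn_dro_dual`, `bisection_min`), the kernel choice, and the exact form of the objective are assumptions for illustration only.

```python
import numpy as np

def sinkhorn_dro_dual(lam, data, loss, rho, eps, n_samples=256, sigma=1.0, rng=None):
    """Sketch of a Sinkhorn-DRO dual objective at dual variable lam > 0.

    Assumes a Gaussian reference kernel of bandwidth sigma around each data
    point; the inner log-expectation is approximated by Monte Carlo sampling.
    (Hypothetical form; the paper's exact dual may include other constants.)
    """
    rng = np.random.default_rng() if rng is None else rng
    inner = []
    for x in data:
        # Sample perturbed points z ~ N(x, sigma^2) around the data point.
        z = x + sigma * rng.standard_normal((n_samples,) + np.shape(x))
        vals = loss(z) / (lam * eps)
        # log E[exp(.)] computed stably via log-sum-exp.
        m = vals.max()
        inner.append(m + np.log(np.mean(np.exp(vals - m))))
    return lam * rho + lam * eps * np.mean(inner)

def bisection_min(objective, lo=1e-3, hi=1e3, tol=1e-4):
    """Bisection over lambda, assuming the dual objective is convex in lambda:
    shrink the bracket according to the sign of a finite-difference slope."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        h = 1e-6 * mid
        slope = objective(mid + h) - objective(mid - h)
        if slope > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Usage sketch: robustify a simple quadratic loss on 1-D toy data.
# A fixed seed gives common random numbers across evaluations, which keeps
# the finite-difference slope estimate stable.
data = np.array([0.5, 1.0, 1.5])
loss = lambda z: (z - 1.0) ** 2
objective = lambda lam: sinkhorn_dro_dual(
    lam, data, loss, rho=0.1, eps=0.05, rng=np.random.default_rng(0)
)
lam_star = bisection_min(objective)
print(lam_star, objective(lam_star))
```

In this toy setup the dual objective is convex in λ (it involves a perspective of a log-sum-exp), so a one-dimensional bisection on the slope sign suffices; the paper's first-order algorithm addresses the full stochastic-optimization version of this problem.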
