Sample average approximation-based stochastic dynamic programming and model predictive control are two different methods of approaching multistage stochastic optimization. Despite a lack of theoretical backing, model predictive control is often used instead of stochastic dynamic programming out of computational necessity. For settings where the stage reward is a convex function of the random terms, the stage dynamics are deterministic, and the random variables are stage-wise independent, we show that model predictive control is equivalent to a distributional robustification of stochastic dynamic programming with an ambiguity set consisting of distributions with matched means. This motivates tools for comparing the out-of-sample performance of the two methods. We study a simple inventory control problem that illustrates their differences, and find that model predictive control can outperform stochastic dynamic programming when the distribution of the underlying random variable is skewed or heavy-tailed. The results are supported by analytical and numerical examples.
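The contrast between the two approaches can be sketched in a minimal single-stage (newsvendor-style) inventory toy. This is an illustrative assumption, not the paper's actual model: the cost parameters, the lognormal demand, and the grid search are all hypothetical choices. The "MPC-style" plan replaces the random demand by its mean and solves the resulting deterministic problem, while the SAA plan minimizes the sample-average cost directly.

```python
import random
from statistics import mean

random.seed(0)

# Hypothetical cost parameters (not from the paper): order q units at
# unit cost c; leftover stock costs h per unit, unmet demand costs b.
c, h, b = 1.0, 0.5, 3.0

def cost(q, d):
    """Single-stage inventory cost for order q and realized demand d."""
    return c * q + h * max(q - d, 0.0) + b * max(d - q, 0.0)

def sample_demand():
    # A skewed demand distribution, as in the abstract's discussion.
    return random.lognormvariate(0.0, 1.0)

train = [sample_demand() for _ in range(2000)]

# MPC-style plan: substitute the mean demand into the deterministic
# problem; its optimum in this toy is simply q = E[D].
q_mpc = mean(train)

# SAA plan: minimize the sample-average cost over a grid of orders.
grid = [i * 0.05 for i in range(200)]
q_saa = min(grid, key=lambda q: mean(cost(q, d) for d in train))

# Out-of-sample comparison on fresh demand samples.
test = [sample_demand() for _ in range(20000)]
def oos(q):
    return mean(cost(q, d) for d in test)

print(f"MPC order {q_mpc:.2f}, out-of-sample cost {oos(q_mpc):.3f}")
print(f"SAA order {q_saa:.2f}, out-of-sample cost {oos(q_saa):.3f}")
```

For a skewed demand the two plans differ noticeably: the mean sits above the cost-minimizing quantile, so the two orders (and their out-of-sample costs) separate. Which plan wins out of sample depends on the distribution and sample size, which is exactly the comparison the paper studies in its multistage setting.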

Sample Average Approximation and Model Predictive Control for Multistage Stochastic Optimization