Robust Markov Decision Processes for Medical Treatment Decisions

Medical treatment decisions involve complex tradeoffs between the risks and benefits of various treatment options. The diversity of treatment options that patients can choose over time, together with uncertainty in future health outcomes, results in a difficult sequential decision-making problem. Markov decision processes (MDPs) are commonly used to study medical treatment decisions; however, the optimal policies obtained by solving MDPs may be affected by uncertainty in the model parameter estimates. In this article, we present a robust Markov decision process treatment model (RMDP-TM) with an uncertainty set that incorporates an uncertainty budget into the formulation of the transition probability matrices (TPMs) of the underlying Markov chain. We show that the addition of an uncertainty budget can control the tradeoff between the mean performance and the worst-case performance of the resulting policies. Further, we present theoretical analysis that establishes computationally efficient methods for solving the RMDP-TM, and we provide conditions under which the policy of nature is stationary. Finally, we present an application of the models to a medical treatment decision problem: optimizing the sequence and timing of medication initiation for glycemic control in patients with type 2 diabetes.
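
To illustrate the idea of a budgeted uncertainty set on the transition probabilities, the sketch below shows robust value iteration for a generic finite MDP in which nature may perturb each nominal transition row within an L1 ball of radius `budget`. This is only a simplified stand-in for the RMDP-TM formulation in the paper, not the authors' actual model or algorithm; the array names (`P_hat`, `R`) and the L1 form of the uncertainty set are assumptions made for the example.

```python
import numpy as np

def worst_case_transition(p_hat, v, budget):
    """Inner problem of robust value iteration (assumed L1 uncertainty set):
       minimize p . v  subject to  ||p - p_hat||_1 <= budget,  p a probability vector.
       Solved greedily by shifting mass toward the lowest-value next state,
       taking it away from the highest-value next states first."""
    p = p_hat.copy()
    order = np.argsort(v)                    # next states from worst to best value
    worst = order[0]
    shift = min(budget / 2.0, 1.0 - p[worst])  # L1 distance counts mass twice
    p[worst] += shift
    for s in reversed(order):                # remove the same mass from the best states
        if shift <= 0:
            break
        if s == worst:
            continue
        take = min(p[s], shift)
        p[s] -= take
        shift -= take
    return p

def robust_value_iteration(P_hat, R, gamma, budget, tol=1e-8, max_iter=1000):
    """Robust value iteration with a per-row uncertainty budget.
       P_hat[s, a, :] are nominal transition probabilities, R[s, a] rewards,
       gamma the discount factor."""
    n_states, n_actions, _ = P_hat.shape
    v = np.zeros(n_states)
    q = np.zeros((n_states, n_actions))
    for _ in range(max_iter):
        for s in range(n_states):
            for a in range(n_actions):
                p = worst_case_transition(P_hat[s, a], v, budget)
                q[s, a] = R[s, a] + gamma * p @ v
        v_new = q.max(axis=1)
        if np.max(np.abs(v_new - v)) < tol:
            v = v_new
            break
        v = v_new
    return v, q.argmax(axis=1)
```

Setting `budget = 0` recovers standard value iteration on the nominal TPMs, while increasing the budget makes the policy hedge against less favorable transitions, which is the mean-versus-worst-case tradeoff described in the abstract.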

Citation

Zhang, Y., Steimle, L. N., and Denton, B. T. Robust Markov Decision Processes for Medical Treatment Decisions. Optimization Online, updated September 21, 2017.
