Stochastic nested primal-dual method for nonconvex constrained composition optimization

In this paper we study nonconvex constrained composition optimization, in which the objective contains a composition of two expected-value functions whose exact values and gradients are typically expensive to compute. We propose a STochastic nEsted Primal-dual (STEP) method for such problems. In each iteration, with an auxiliary variable introduced to track the inner-layer function values, we compute stochastic gradients of the nested function via a subsampling strategy. To alleviate the difficulties caused by possibly nonconvex constraints, we construct a stochastic approximation to the linearized augmented Lagrangian function to update the primal variable, which in turn motivates a weighted-average update of the dual variable. Moreover, to better understand the asymptotic dynamics of the update schemes, we study an associated deterministic continuous-time system from an ODE perspective. We analyze the KKT measure at the output of the STEP method and establish its iteration and sample complexities for finding an $\epsilon$-stationary point, i.e., a point whose expected stationarity, feasibility, and complementary slackness are all below accuracy $\epsilon$. Numerical results on a risk-averse portfolio optimization problem demonstrate the effectiveness of the proposed algorithm.
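To illustrate the flavor of such a nested primal-dual scheme, the following is a minimal toy sketch, not the paper's STEP method: the inner map, constraint, step sizes, and averaging weights below are all illustrative assumptions. It shows the three ingredients the abstract describes — an auxiliary variable tracking the inner-layer function values, a stochastic chain-rule gradient of the composition, and a primal augmented-Lagrangian step paired with a weighted-average dual update — on a toy problem $\min_x f(\mathbb{E}[g(x;\xi)])$ s.t. $c(x) \le 0$ with $f(u) = \|u\|^2/2$, a noisy linear inner map, and a ball constraint.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nested problem (illustrative assumptions, not the paper's setup):
#   minimize  f(E[g(x; xi)])  subject to  c(x) <= 0,
# with f(u) = ||u||^2 / 2, g(x; xi) = A x + 0.1*xi (noisy linear inner map),
# and a single ball constraint c(x) = ||x||^2 - 1.
d, m = 3, 5
A = rng.standard_normal((m, d))

def sample_g(x):             # one noisy sample of the inner-layer function
    return A @ x + 0.1 * rng.standard_normal(m)

def jac_g():                 # Jacobian of the inner map (noiseless for simplicity)
    return A

def grad_f(u):               # gradient of the outer function f(u) = ||u||^2/2
    return u

def c(x):                    # constraint value
    return x @ x - 1.0

def grad_c(x):               # constraint gradient
    return 2.0 * x

x = rng.standard_normal(d)   # primal variable
u = sample_g(x)              # auxiliary variable tracking inner function values
lam, rho = 0.0, 1.0          # dual variable and penalty parameter
eta, beta, theta = 0.05, 0.5, 0.5  # primal step, tracking and dual-averaging weights

for k in range(2000):
    # Refresh the tracking variable with a new inner-layer sample.
    u = (1 - beta) * u + beta * sample_g(x)
    # Stochastic gradient of the composition f(g(x)) via the chain rule,
    # using the tracked value u in place of the exact inner function value.
    g_comp = jac_g().T @ grad_f(u)
    # Primal step on a stochastic augmented-Lagrangian model:
    # the constraint term is activated only when lam + rho*c(x) > 0.
    mult = max(lam + rho * c(x), 0.0)
    x = x - eta * (g_comp + mult * grad_c(x))
    # Weighted-average dual update.
    lam = (1 - theta) * lam + theta * max(lam + rho * c(x), 0.0)

print("final constraint value c(x):", c(x))
print("final dual variable lam:", lam)
```

On this toy instance the unconstrained minimizer $x = 0$ is strictly feasible, so the iterates drift toward the interior, the multiplier term deactivates, and the dual variable averages down to zero, giving a small feasibility and complementary-slackness residual.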