Retrospective Approximation Sequential Quadratic Programming for Stochastic Optimization with General Deterministic Nonlinear Constraints

In this paper, we propose a framework based on the Retrospective Approximation (RA) paradigm to solve optimization problems with a stochastic objective function and general nonlinear deterministic constraints. This framework sequentially constructs increasingly accurate approximations of the true problem, each solved to a specified accuracy via a deterministic solver, thereby decoupling the uncertainty from the optimization. Such frameworks retain the advantages of deterministic optimization methods, such as fast convergence, while achieving the optimal performance of stochastic methods without the need to redesign algorithmic components. For problems with general nonlinear equality constraints, we present a framework that can employ any deterministic solver, and we analyze its theoretical work complexity. We then present an instance of the framework that employs a deterministic Sequential Quadratic Programming (SQP) method and achieves optimal complexity in terms of gradient evaluations and linear system solves for this class of problems. For problems with general nonlinear constraints, we present an RA-based algorithm that employs an SQP method with robust subproblems. Finally, we demonstrate the empirical performance of the proposed framework on multi-class logistic regression problems and benchmark instances from the CUTEst test set, comparing its results to established methods from the literature.
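To illustrate the RA paradigm described above, the following is a minimal sketch of the outer loop: each retrospective iteration builds a sample-average approximation of the stochastic objective with a larger sample size and solves it to a tighter tolerance with a deterministic SQP-type solver, warm-starting from the previous solution. The toy problem (minimizing E[||x - xi||^2] subject to a unit-norm equality constraint), the geometric schedules, and the use of SciPy's SLSQP solver are all illustrative assumptions, not the paper's actual algorithm or test problems.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
mu = np.array([2.0, 0.5])  # mean of the random parameter xi (assumed for this toy problem)

def saa_objective(x, xi_samples):
    # Sample-average approximation of the stochastic objective E[||x - xi||^2]
    return np.mean(np.sum((x - xi_samples) ** 2, axis=1))

def constraint(x):
    # Deterministic nonlinear equality constraint: ||x||^2 - 1 = 0
    return x @ x - 1.0

x = np.array([1.0, 0.0])   # initial iterate
n_samples, tol = 100, 1e-2
for k in range(5):
    # Draw a fresh, larger sample and form the k-th approximate problem
    xi = rng.normal(mu, 1.0, size=(n_samples, 2))
    # Solve the deterministic approximation with an SQP-type solver (SLSQP)
    res = minimize(saa_objective, x, args=(xi,), method="SLSQP",
                   constraints={"type": "eq", "fun": constraint}, tol=tol)
    x = res.x          # warm-start the next, more accurate problem
    n_samples *= 4     # increase the sample size geometrically
    tol /= 4           # tighten the deterministic solver's tolerance

print("solution:", x, "constraint residual:", constraint(x))
```

The key design choice the sketch highlights is the decoupling: the inner solver sees only a deterministic problem, so any off-the-shelf constrained-optimization method can be plugged in, while the outer schedule of sample sizes and tolerances controls the stochastic accuracy.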
