Stochastic model-based minimization of weakly convex functions

We consider an algorithm that successively samples and minimizes stochastic models of the objective function. We show that under weak convexity and Lipschitz conditions, the algorithm drives the expected norm of the gradient of the Moreau envelope to zero at the rate $O(k^{-1/4})$. Our result yields the first complexity guarantees for the stochastic proximal point algorithm on weakly convex problems and for the stochastic prox-linear algorithm for minimizing compositions of convex functions with smooth maps. Our general framework also recovers the recently obtained complexity estimate for the stochastic proximal subgradient method on weakly convex problems.
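For context, the stationarity measure referenced above is $\|\nabla \varphi_\lambda(x)\|$, where $\varphi_\lambda(x) = \min_y \{\varphi(y) + \tfrac{1}{2\lambda}\|y - x\|^2\}$ is the Moreau envelope of the objective $\varphi$. As a rough illustration, not taken from the paper, the sketch below runs the simplest instance of the model-based template: the stochastic subgradient method, which corresponds to choosing a linear model of each sampled function, applied to a synthetic robust phase-retrieval objective $\varphi(x) = \mathbb{E}_{a,b}\,|\langle a, x\rangle^2 - b|$, a standard weakly convex example. The problem data, step-size schedule, and all function names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic robust phase retrieval: phi(x) = E |<a, x>^2 - b|.
# This objective is weakly convex and Lipschitz on bounded sets.
d, n = 10, 1000
A = rng.normal(size=(n, d))          # measurement vectors a_i
x_star = rng.normal(size=d)          # ground-truth signal
b = (A @ x_star) ** 2                # noiseless measurements

def sample_subgrad(x):
    """Sample one term and return a stochastic subgradient at x."""
    i = rng.integers(n)
    a, bi = A[i], b[i]
    r = a @ x
    # A subgradient of |r^2 - bi| with respect to x is sign(r^2 - bi) * 2 r a.
    return np.sign(r * r - bi) * 2.0 * r * a

def stochastic_subgradient(subgrad_oracle, x0, steps, alpha0):
    """Model-based method with a linear model of the sampled function,
    i.e., the stochastic subgradient method with decaying step size."""
    x = x0.copy()
    for k in range(steps):
        alpha = alpha0 / np.sqrt(k + 1)   # diminishing O(k^{-1/2}) steps
        x = x - alpha * subgrad_oracle(x)
    return x

x = stochastic_subgradient(sample_subgrad, rng.normal(size=d), 20000, 0.1)
# Distance to +/- x_star (the sign is unidentifiable; recovery from a
# random start is not guaranteed for this nonconvex problem).
print(min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star)))
```

The paper's other instances, the stochastic proximal point and prox-linear algorithms, replace the linear model with a stronger convex model of each sampled function and minimize it plus a quadratic penalty at each step.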

Citation

arXiv:1803.06523
