A two-phase stochastic momentum-based algorithm for nonconvex expectation-constrained optimization
The stochastic gradient method and its variants are simple yet effective for minimizing an expectation function over a closed convex set. However, none of these methods applies to stochastic programs with expectation constraints, since the projection onto the feasible set is prohibitively expensive. To deal with expectation-constrained stochastic convex optimization problems, we propose …
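As a quick illustration of why expectation constraints break projection-based schemes, the sketch below runs projected SGD on a toy problem over a norm ball, where the projection is cheap. The problem, step size, sampling, and helper names (stoch_grad, project_ball) are illustrative assumptions; the paper's two-phase momentum method is not reproduced here.

```python
# Sketch under illustrative assumptions: projected SGD for
# min_x E[f(x; xi)] over a norm ball, where projection is cheap.
# With an expectation constraint E[g(x; xi)] <= 0, the feasible set is
# known only through samples, so the projection below has no closed
# form -- the difficulty the paper targets.
import numpy as np

rng = np.random.default_rng(0)
d = 5

def stoch_grad(x):
    # Unbiased gradient sample of f(x) = 0.5 * E||x - xi||^2, xi ~ N(1, I)
    xi = rng.normal(1.0, 1.0, size=d)
    return x - xi

def project_ball(x, radius=2.0):
    # Euclidean projection onto {x : ||x|| <= radius} -- cheap for simple sets
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

x = np.zeros(d)
for k in range(2000):
    x = project_ball(x - 0.01 * stoch_grad(x))
print(x)  # near the projection of the all-ones minimizer onto the ball
```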
In this work, we present a globalized stochastic semismooth Newton method for solving stochastic optimization problems whose objective function is the sum of a smooth nonconvex term and a nonsmooth convex term. We assume that only noisy gradient and Hessian information of the smooth part of the objective is available, by calling stochastic first- and second-order oracles. The …
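A minimal sketch of the semismooth Newton idea on the natural residual of a composite problem, assuming a toy quadratic-plus-l1 objective and synthetic noisy oracles. This is not the paper's globalized method (no line search or trust region is included), and all problem data, noise levels, step sizes, and helper names (grad_oracle, hess_oracle, prox_l1) below are assumptions.

```python
# One semismooth Newton pass on the natural residual
# F(x) = x - prox_{t*phi}(x - t*grad_f(x)) for psi = f + lam*||.||_1,
# with noisy oracles standing in for stochastic first/second-order oracles.
import numpy as np

rng = np.random.default_rng(1)
d, lam, t = 10, 0.1, 0.5
A = rng.normal(size=(d, d))
Q = A.T @ A / d + np.eye(d)          # smooth part f(x) = 0.5 x'Qx - b'x
b = rng.normal(size=d)

def grad_oracle(x):                  # noisy gradient of f
    return Q @ x - b + 0.01 * rng.normal(size=d)

def hess_oracle(x):                  # noisy Hessian of f
    return Q + 0.01 * rng.normal(size=(d, d))

def prox_l1(u, s):                   # soft-thresholding: prox of s*lam*||.||_1
    return np.sign(u) * np.maximum(np.abs(u) - s * lam, 0.0)

x = np.zeros(d)
for k in range(20):
    g, H = grad_oracle(x), hess_oracle(x)
    u = x - t * g
    F = x - prox_l1(u, t)                    # natural residual; F(x*) = 0
    D = (np.abs(u) > t * lam).astype(float)  # Clarke Jacobian element of the prox
    J = np.eye(d) - D[:, None] * (np.eye(d) - t * H)
    x = x + np.linalg.solve(J, -F)           # Newton step (no globalization here)
print("residual norm:", np.linalg.norm(x - prox_l1(x - t * (Q @ x - b), t)))
```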
The goal of this paper is to study approaches that bridge the gap between first-order and second-order-type methods for composite convex programs. Our key observations are: i) many well-known operator splitting methods, such as forward-backward splitting (FBS) and Douglas-Rachford splitting (DRS), actually define a possibly semismooth and monotone fixed-point mapping; ii) the optimal solutions …
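To make the fixed-point view concrete, here is a sketch of FBS as a fixed-point iteration on T(x) = prox_{t*g}(x - t*grad_f(x)) for an assumed toy lasso instance; minimizers of f + g are exactly the fixed points of T. The data and step size are illustrative, and the second-order acceleration studied in the paper is only hinted at through the residual x - T(x).

```python
# Fixed-point view of forward-backward splitting on a toy lasso problem:
# f(x) = 0.5||Ax - b||^2 (smooth), g(x) = lam*||x||_1 (nonsmooth convex).
import numpy as np

rng = np.random.default_rng(2)
m, d, lam = 30, 8, 0.05
A, b = rng.normal(size=(m, d)), rng.normal(size=m)
t = 1.0 / np.linalg.norm(A.T @ A, 2)   # step size within 1/L

def T(x):
    u = x - t * (A.T @ (A @ x - b))                           # forward step
    return np.sign(u) * np.maximum(np.abs(u) - t * lam, 0.0)  # backward (prox) step

x = np.zeros(d)
for k in range(500):
    x = T(x)                           # Picard iteration on the fixed-point map
print("fixed-point residual:", np.linalg.norm(x - T(x)))  # ~0 at a minimizer
```

A semismooth Newton method, as the abstract suggests, would instead drive the residual x - T(x) to zero with second-order steps rather than plain Picard iteration.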
This paper studies the second-order directional derivative of a symmetric matrix-valued function of the form $F(X)=P\mbox{diag}[f(\lambda_1(X)),\cdots,f(\lambda_n(X))]P^T$. For this purpose, we first give a direct derivation of the formula of Torki \cite{Tor01} for the second-order directional derivative of any eigenvalue of a symmetric matrix; second, we establish a formula for the (parabolic) …
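A minimal sketch, under illustrative assumptions (f = exp, random symmetric data), of evaluating F(X) through an eigendecomposition and probing its first- and second-order directional derivatives by finite differences; the paper's closed-form derivative formulas are not implemented here.

```python
# Evaluate the spectral operator F(X) = P diag[f(lam_1), ..., f(lam_n)] P^T
# and approximate its directional derivatives along a symmetric direction H.
import numpy as np

rng = np.random.default_rng(3)
n = 4
X = rng.normal(size=(n, n)); X = (X + X.T) / 2   # symmetric test matrix
H = rng.normal(size=(n, n)); H = (H + H.T) / 2   # symmetric direction

def F(X, f=np.exp):
    lam, P = np.linalg.eigh(X)        # X = P diag(lam) P^T
    return P @ np.diag(f(lam)) @ P.T  # apply f to the eigenvalues

# Central-difference surrogates for the first- and second-order directional
# derivatives of F at X along H (the paper derives exact formulas instead).
eps = 1e-4
D1 = (F(X + eps * H) - F(X - eps * H)) / (2 * eps)
D2 = (F(X + eps * H) - 2 * F(X) + F(X - eps * H)) / eps**2
print(np.linalg.norm(D1), np.linalg.norm(D2))
```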