Penalized stochastic gradient methods for stochastic convex optimization with expectation constraints

The stochastic gradient method and its variants are simple yet effective for minimizing an expectation function over a closed convex set. However, none of these methods applies to stochastic programs with expectation constraints, since the projection onto the feasible set is prohibitively expensive. To deal with stochastic convex optimization problems with expectation constraints, we propose …
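As a rough, self-contained illustration of the generic penalization idea (projected stochastic subgradient steps on an exact-penalty reformulation), and not the algorithm proposed in the paper, the sketch below solves a toy instance of min_x E[f(x, xi)] subject to E[g(x, xi)] <= 0 and x in a simple box; the objective, constraint, penalty parameter rho, and step-size rule are all invented for illustration.

# Penalized stochastic subgradient sketch (assumptions: toy data, exact penalty rho*max(0, g)).
import numpy as np

rng = np.random.default_rng(0)
d, rho, steps = 5, 10.0, 20000
a_true = rng.normal(size=d)              # noisy objective: f(x, xi) = 0.5*||x - (a_true + xi)||^2
b = rng.normal(size=d)                   # noisy constraint: g(x, xi) = b.x + xi - 1 <= 0 in expectation

def project_box(x, lo=-2.0, hi=2.0):     # projection onto the simple set X = [-2, 2]^d is cheap
    return np.clip(x, lo, hi)

x = np.zeros(d)
for k in range(1, steps + 1):
    xi_f = rng.normal(size=d)            # sample for the objective
    xi_g = rng.normal()                  # sample for the constraint
    grad_f = x - (a_true + xi_f)         # stochastic gradient of the objective term
    g_val = b @ x + xi_g - 1.0
    grad_pen = rho * b if g_val > 0 else np.zeros(d)   # stochastic subgradient of rho*max(0, g)
    x = project_box(x - (1.0 / np.sqrt(k)) * (grad_f + grad_pen))

print("iterate:", np.round(x, 3), " E[g] estimate:", b @ x - 1.0)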

A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization

In this work, we present a globalized stochastic semismooth Newton method for solving stochastic optimization problems involving smooth nonconvex and nonsmooth convex terms in the objective function. We assume that only noisy gradient and Hessian information of the smooth part of the objective function is available via calling stochastic first- and second-order oracles. The …
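The following is only my own toy sketch of the general template described here (a semismooth Newton step on the proximal-gradient residual, using noisy first- and second-order oracles and a crude residual-decrease acceptance test), not the globalized method of the paper; the objective f + phi, the noise level sigma, and the step size t are made-up choices.

# Stochastic semismooth Newton sketch for min f(x) + lam*||x||_1, f smooth nonconvex (noisy oracles).
import numpy as np

rng = np.random.default_rng(1)
d, lam, t, sigma = 8, 0.1, 0.5, 0.01
A = rng.normal(size=(d, d)); A = 0.5 * (A + A.T)      # f(x) = 0.5 x^T A x + 0.25*||x||^4 (A may be indefinite)

def grad_f(x):                                        # stochastic first-order oracle
    return A @ x + (x @ x) * x + sigma * rng.normal(size=d)
def hess_f(x):                                        # stochastic second-order oracle
    return A + (x @ x) * np.eye(d) + 2.0 * np.outer(x, x) + sigma * rng.normal(size=(d, d))
def prox_l1(u, r):
    return np.sign(u) * np.maximum(np.abs(u) - r, 0.0)

x = np.ones(d)
for _ in range(50):
    u = x - t * grad_f(x)
    Fx = x - prox_l1(u, t * lam)                      # semismooth residual; Fx = 0 at stationarity
    D = (np.abs(u) > t * lam).astype(float)           # element of the generalized Jacobian of prox_l1
    M = np.eye(d) - D[:, None] * (np.eye(d) - t * hess_f(x))
    try:
        dx = np.linalg.solve(M, -Fx)                  # semismooth Newton direction
    except np.linalg.LinAlgError:
        dx = -Fx
    x_trial = x + dx
    u_trial = x_trial - t * grad_f(x_trial)
    if np.linalg.norm(x_trial - prox_l1(u_trial, t * lam)) <= 0.9 * np.linalg.norm(Fx):
        x = x_trial                                   # accept the Newton step (simple globalization test)
    else:
        x = prox_l1(u, t * lam)                       # otherwise fall back to a proximal-gradient step
print("residual norm:", np.linalg.norm(x - prox_l1(x - t * grad_f(x), t * lam)))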

Semi-Smooth Second-order Type Methods for Composite Convex Programs

The goal of this paper is to study approaches to bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: i) Many well-known operator splitting methods, such as forward-backward splitting (FBS) and Douglas-Rachford splitting (DRS), actually define a possibly semi-smooth and monotone fixed-point mapping; ii) The optimal solutions …
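To make observation i) concrete, here is a small self-written sketch (not the paper's algorithm) that treats the FBS fixed-point residual F(x) = x - prox_{t*g}(x - t*grad f(x)) of a LASSO problem as a semismooth equation and applies a regularized Newton-type iteration to it, falling back to a plain FBS step whenever the Newton step does not reduce the residual; the synthetic data, step size t, and regularization rule are arbitrary assumptions.

# Semismooth Newton on the FBS fixed-point residual for min 0.5*||Ax-b||^2 + lam*||x||_1 (toy data).
import numpy as np

rng = np.random.default_rng(2)
m, n, lam, t = 40, 60, 0.1, 0.1
A = rng.normal(size=(m, n)); b = rng.normal(size=m)

def prox_l1(u, r):
    return np.sign(u) * np.maximum(np.abs(u) - r, 0.0)
def residual(x):                                      # F(x) = x - T_FBS(x)
    return x - prox_l1(x - t * A.T @ (A @ x - b), t * lam)

x = np.zeros(n)
for _ in range(50):
    u = x - t * A.T @ (A @ x - b)
    Tx = prox_l1(u, t * lam)                          # FBS fixed-point map T(x)
    Fx = x - Tx                                       # semismooth (piecewise affine) residual
    D = (np.abs(u) > t * lam).astype(float)           # generalized derivative of prox_l1 at u
    J = np.eye(n) - D[:, None] * (np.eye(n) - t * A.T @ A)   # an element of the Clarke Jacobian of F
    mu = max(np.linalg.norm(Fx), 1e-12)               # Levenberg-Marquardt-style regularization
    dx = np.linalg.solve(J + mu * np.eye(n), -Fx)
    x_newton = x + dx
    x = x_newton if np.linalg.norm(residual(x_newton)) < np.linalg.norm(Fx) else Tx   # else fall back to FBS
print("||F(x)|| =", np.linalg.norm(residual(x)))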

The Second Order Directional Derivative of Symmetric Matrix-valued Functions

This paper studies the second-order directional derivative of a symmetric matrix-valued function of the form $F(X)=P\mbox{diag}[f(\lambda_1(X)),\cdots,f(\lambda_n(X))]P^T$. To this end, we first use a direct approach to derive the formula of Torki \cite{Tor01} for the second-order directional derivative of any eigenvalue of a symmetric matrix; second, we establish a formula for the (parabolic) …
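The paper's closed-form formulas are not reproduced here; the snippet below is only a small numerical sketch that assembles $F(X)$ from the spectral decomposition and estimates its first- and (parabolic) second-order directional derivatives by coarse finite differences, with $f=\exp$ and random symmetric $X$, $H$, $W$ chosen purely for illustration.

# Numerical check of directional derivatives of F(X) = P diag[f(lambda_i(X))] P^T (toy example).
import numpy as np

def F(X, f=np.exp):
    lam, P = np.linalg.eigh(X)               # spectral decomposition X = P diag(lam) P^T
    return (P * f(lam)) @ P.T                # P diag(f(lam)) P^T

def sym(rng, n):                             # random symmetric matrix
    M = rng.normal(size=(n, n)); return 0.5 * (M + M.T)

rng = np.random.default_rng(3)
n, t = 4, 1e-3
X, H, W = sym(rng, n), sym(rng, n), sym(rng, n)

dF_H = (F(X + t * H) - F(X - t * H)) / (2 * t)                             # ~ F'(X; H), central difference
d2F = (F(X + t * H + 0.5 * t**2 * W) - F(X) - t * dF_H) / (0.5 * t**2)     # ~ parabolic second-order derivative
print("||F'(X;H)||_F   ~", np.linalg.norm(dF_H))
print("||F''(X;H,W)||_F ~", np.linalg.norm(d2F))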