An Exact Penalty Method for Stochastic Equality-Constrained Optimization

In this paper, we study a penalty method for stochastic equality-constrained optimization, where both the objective and constraints are expressed in general expectation form. We introduce a novel adaptive strategy for updating the penalty parameter, guided by iteration progress to balance reductions in the penalty function with improvements in constraint violation, while each penalty subproblem …
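To make the setting concrete, here is a minimal, hypothetical sketch of a stochastic penalty loop for min E[f(x,ξ)] subject to E[c(x,ξ)] = 0: it takes stochastic subgradient steps on an ℓ1-type exact penalty and raises the penalty parameter whenever the sampled constraint violation stalls. The oracles `f_grad`, `c`, `c_jac` and the specific update rule are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def stochastic_penalty_method(f_grad, c, c_jac, x0, rho0=1.0, step=1e-2,
                              iters=1000, batch=32, grow=2.0, tol_ratio=0.9):
    """Stochastic subgradient steps on the exact penalty f + rho*||c||_1,
    with a hypothetical adaptive rule for the penalty parameter rho."""
    x, rho = x0.copy(), rho0
    prev_viol = np.inf
    for _ in range(iters):
        xi = np.random.randn(batch, x.size)      # sampled scenarios (illustrative)
        g = f_grad(x, xi).mean(axis=0)           # averaged stochastic objective gradient
        cv = c(x, xi).mean(axis=0)               # sampled constraint values, shape (m,)
        J = c_jac(x, xi).mean(axis=0)            # sampled constraint Jacobian, shape (m, n)
        # subgradient step on the l1 exact penalty  f(x) + rho * ||c(x)||_1
        x = x - step * (g + rho * J.T @ np.sign(cv))
        viol = np.abs(cv).sum()
        # hypothetical adaptive update: grow rho when the sampled violation
        # is not decreasing fast enough between iterations
        if viol > tol_ratio * prev_viol:
            rho *= grow
        prev_viol = viol
    return x, rho
```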

Continuous-time Analysis of a Stochastic ADMM Method for Nonconvex Composite Optimization

In this paper, we focus on nonconvex composite optimization, in which the objective is the sum of a smooth but possibly nonconvex function and the composition of a weakly convex function with a linear operator. By leveraging a smoothing technique based on the Moreau envelope, we propose a stochastic proximal linearized ADMM algorithm (SPLA). To understand its …
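As a rough illustration of the algorithmic template (not the SPLA method analyzed in the paper), the sketch below runs a stochastic proximal linearized ADMM on min_x f(x) + h(Ax), taking h = ‖·‖₁ for concreteness so that its proximal operator is soft-thresholding; `grad_f` is an assumed stochastic gradient oracle for the smooth part.

```python
import numpy as np

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def stochastic_prox_linearized_admm(grad_f, A, x0, beta=1.0, eta=1e-2, iters=500):
    """Sketch of a stochastic proximal linearized ADMM for min_x f(x) + ||A x||_1."""
    x = x0.copy()
    z = A @ x                        # splitting variable, z ≈ A x
    lam = np.zeros_like(z)           # dual multiplier
    for _ in range(iters):
        # x-step: linearize f with a stochastic gradient and take a
        # gradient step on the augmented Lagrangian with respect to x
        g = grad_f(x) + A.T @ (lam + beta * (A @ x - z))
        x = x - eta * g
        # z-step: exact proximal step on h = ||.||_1
        z = soft_threshold(A @ x + lam / beta, 1.0 / beta)
        # dual update for the constraint A x - z = 0
        lam = lam + beta * (A @ x - z)
    return x
```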

Penalized stochastic gradient methods for stochastic convex optimization with expectation constraints

The stochastic gradient method and its variants are simple yet effective for minimizing an expectation function over a closed convex set. However, none of these methods are applicable to stochastic programs with expectation constraints, since the projection onto the feasible set is prohibitively expensive. To deal with expectation-constrained stochastic convex optimization problems, we propose …
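A minimal sketch of the penalized idea, under illustrative assumptions rather than the paper's exact scheme, is given below: a stochastic subgradient step on the hinge penalty f + ρ·max(g, 0) for min E[f(x,ξ)] subject to E[g(x,ξ)] ≤ 0 and x ∈ X, where X is taken as a Euclidean ball so that its projection is cheap.

```python
import numpy as np

def project_ball(x, radius=10.0):
    # projection onto a Euclidean ball, standing in for the simple set X
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

def penalized_sgd(f_grad, g_val, g_grad, x0, rho=10.0, step0=0.1, iters=2000):
    """Penalized stochastic subgradient sketch for
    min E[f(x,xi)]  s.t.  E[g(x,xi)] <= 0,  x in X."""
    x = x0.copy()
    avg = np.zeros_like(x)
    for k in range(1, iters + 1):
        xi = np.random.randn(x.size)             # sampled scenario (illustrative)
        d = f_grad(x, xi)
        if g_val(x, xi) > 0:                     # penalize the sampled violation
            d = d + rho * g_grad(x, xi)          # subgradient of rho * max(g, 0)
        x = project_ball(x - (step0 / np.sqrt(k)) * d)
        avg += (x - avg) / k                     # running average of iterates
    return avg
```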

A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization

In this work, we present a globalized stochastic semismooth Newton method for solving stochastic optimization problems whose objective consists of a smooth nonconvex term and a nonsmooth convex term. We assume that only noisy gradient and Hessian information of the smooth part of the objective is available, obtained by calling stochastic first- and second-order oracles. The …
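For orientation only, the snippet below shows one (non-globalized) semismooth Newton step on the natural residual F(x) = x − prox_{tφ}(x − t∇f(x)), with φ = ‖·‖₁ chosen so that an element of the generalized Jacobian of the prox is an explicit 0/1 diagonal matrix; `grad_f` and `hess_f` stand in for the noisy oracles and the regularization is an illustrative choice, not the paper's globalization strategy.

```python
import numpy as np

def prox_l1(v, tau):
    # proximal operator of tau * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def semismooth_newton_step(grad_f, hess_f, x, t=0.1, reg=1e-8):
    """One semismooth Newton step on the residual
    F(x) = x - prox_{t*phi}(x - t*grad f(x)), here with phi = ||.||_1."""
    g = grad_f(x)                                # noisy gradient oracle
    H = hess_f(x)                                # noisy Hessian oracle
    u = x - t * g
    Fx = x - prox_l1(u, t)                       # natural (prox-gradient) residual
    # an element of the generalized Jacobian of prox_l1 at u: 0/1 diagonal
    D = np.diag((np.abs(u) > t).astype(float))
    n = x.size
    M = np.eye(n) - D @ (np.eye(n) - t * H)      # generalized Jacobian of F at x
    d = np.linalg.solve(M + reg * np.eye(n), -Fx)  # regularized Newton system
    return x + d
```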

Semi-Smooth Second-order Type Methods for Composite Convex Programs

The goal of this paper is to study approaches that bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: i) many well-known operator splitting methods, such as forward-backward splitting (FBS) and Douglas-Rachford splitting (DRS), actually define a possibly semi-smooth and monotone fixed-point mapping; ii) the optimal solutions …
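For reference, the standard fixed-point operators behind observation i), written for min_x f(x) + g(x) with f smooth, g proper closed convex, and step size t > 0, are recalled below (standard textbook forms, not quoted from the paper); optimal solutions correspond to zeros of the possibly semi-smooth residual x − T_FBS(x).

```latex
% Standard FBS and DRS operators for  \min_x f(x) + g(x),  step size t > 0.
\begin{aligned}
T_{\mathrm{FBS}}(x) &= \operatorname{prox}_{t g}\bigl(x - t\,\nabla f(x)\bigr),\\
T_{\mathrm{DRS}}(z) &= z + \operatorname{prox}_{t g}\bigl(2\operatorname{prox}_{t f}(z) - z\bigr)
                        - \operatorname{prox}_{t f}(z).
\end{aligned}
```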

The Second Order Directional Derivative of Symmetric Matrix-valued Functions

This paper studies the second-order directional derivative of a symmetric matrix-valued function of the form $F(X)=P\mbox{diag}[f(\lambda_1(X)),\cdots,f(\lambda_n(X))]P^T$. For this purpose, we first adopt a direct approach to derive the formula of Torki \cite{Tor01} for the second-order directional derivative of any eigenvalue of a matrix; second, we establish a formula for the (parabolic) …
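For context, a commonly used definition of the (parabolic) second-order directional derivative of F at X along directions H and W is recalled below; the paper's precise normalization and assumptions may differ.

```latex
% Parabolic second-order directional derivative of F at X along (H, W).
F''(X; H, W) \;=\; \lim_{t \downarrow 0}
\frac{F\bigl(X + tH + \tfrac{1}{2}t^{2}W\bigr) - F(X) - t\,F'(X; H)}{\tfrac{1}{2}\,t^{2}}
```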