Stochastic model-based minimization of weakly convex functions

We consider an algorithm that successively samples and minimizes stochastic models of the objective function. We show that under weak-convexity and Lipschitz conditions, the algorithm drives the expected norm of the gradient of the Moreau envelope to zero at the rate $O(k^{-1/4})$. Our result yields the first complexity guarantees for the stochastic proximal point algorithm …
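
The stationarity measure referenced here is standard: for a $\rho$-weakly convex $f$ and $\lambda < 1/\rho$, the Moreau envelope and its gradient are given by (a textbook definition, not quoted from the paper)

```latex
f_\lambda(x) = \min_y \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\nabla f_\lambda(x) = \tfrac{1}{\lambda}\big(x - \mathrm{prox}_{\lambda f}(x)\big),
```

so a small $\|\nabla f_\lambda(x)\|$ certifies that $x$ lies near a point that is nearly stationary for $f$.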

Entropic proximal operators for nonnegative trigonometric polynomials

Signal processing applications of semidefinite optimization are often rooted in sum-of-squares representations of nonnegative trigonometric polynomials. Interior-point solvers for semidefinite optimization can handle constraints of this form with a per-iteration complexity that is cubic in the degree of the trigonometric polynomial. The purpose of this paper is to discuss first-order methods with a lower complexity per …
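
For context, the sum-of-squares representation in question is the classical Gram-matrix parametrization; this is standard background rather than a result of the paper. A trigonometric polynomial $R(\omega) = \sum_{k=-n}^{n} r_k e^{-jk\omega}$ with $r_{-k} = \bar{r}_k$ is nonnegative for all $\omega$ if and only if

```latex
r_k = \sum_{i - j = k} X_{ij}, \qquad k = -n, \dots, n,
\quad \text{for some Hermitian } X \succeq 0 \text{ of order } n + 1,
```

and it is this $(n+1)\times(n+1)$ semidefinite variable $X$ that drives the per-iteration cost of interior-point solvers.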

Projective Splitting with Forward Steps: Asynchronous and Block-Iterative Operator Splitting

This work is concerned with the classical problem of finding a zero of a sum of maximal monotone operators. For the projective splitting framework recently proposed by Combettes and Eckstein, we show how to replace the fundamental subproblem calculation using a backward step with one based on two forward steps. The resulting algorithms have the …
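
In splitting terminology (standard usage, not specific to this paper), a backward step for an operator $A$ evaluates its resolvent, while a forward step merely evaluates $A$ itself:

```latex
\text{backward:}\quad x = J_{\rho A}(z) = (I + \rho A)^{-1} z,
\qquad
\text{forward:}\quad x = z - \rho A z.
```

Resolvents can be expensive or unavailable in closed form, which is why trading one backward step for two forward steps (on operators that are single-valued and, typically, Lipschitz) broadens the framework's reach.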

A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization

In this work, we present a globalized stochastic semismooth Newton method for solving stochastic optimization problems involving smooth nonconvex and nonsmooth convex terms in the objective function. We assume that only noisy gradient and Hessian information of the smooth part of the objective function is available via calling stochastic first- and second-order oracles. The …
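
A common template for this problem class (our reading of the setting; the paper's precise formulation may differ) is

```latex
\min_x \; \psi(x) := f(x) + \varphi(x), \qquad f(x) = \mathbb{E}_{\xi}\!\left[ F(x,\xi) \right],
```

with $f$ smooth and nonconvex and $\varphi$ convex and nonsmooth, the semismooth Newton method being applied to the nonsmooth stationarity equation $x - \mathrm{prox}_{\varphi}\!\big(x - \nabla f(x)\big) = 0$ with sampled estimates in place of $\nabla f$ and $\nabla^2 f$.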

SOS-Convex Lyapunov Functions and Stability of Difference Inclusions

We introduce the concept of sos-convex Lyapunov functions for stability analysis of both linear and nonlinear difference inclusions (also known as discrete-time switched systems). These are polynomial Lyapunov functions that have an algebraic certificate of convexity and that can be efficiently found via semidefinite programming. We prove that sos-convex Lyapunov functions are universal (i.e., necessary …
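
The convexity certificate is the standard notion of sos-convexity: a polynomial $V$ is sos-convex if

```latex
y^\top \nabla^2 V(x)\, y \ \text{ is a sum of squares of polynomials in } (x, y),
```

a condition that implies convexity of $V$ and, unlike convexity itself, can be checked by solving a semidefinite program.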

An Improved Method of Total Variation Superiorization Applied to Reconstruction in Proton Computed Tomography

Previous work showed that total variation superiorization (TVS) improves reconstructed image quality in proton computed tomography (pCT). The structure of the TVS algorithm has evolved since then, and this work investigated whether the new algorithmic structure provides additional benefits to pCT image quality. Structural and parametric changes introduced to the original TVS algorithm included: (1) …
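
As background, a generic TV-superiorization loop interlaces small total-variation-reducing perturbations with feasibility-seeking sweeps. The Python sketch below illustrates that generic skeleton only; `feasibility_step` is a hypothetical placeholder (e.g., one ART sweep), and the sketch does not reproduce the authors' modified algorithm.

```python
import numpy as np

def tv_subgradient(x):
    """Subgradient of the anisotropic total variation of a 2-D image."""
    g = np.zeros_like(x)
    d0 = np.sign(np.diff(x, axis=0))    # vertical first differences
    d1 = np.sign(np.diff(x, axis=1))    # horizontal first differences
    g[1:, :] += d0
    g[:-1, :] -= d0
    g[:, 1:] += d1
    g[:, :-1] -= d1
    return g

def superiorize(x, feasibility_step, n_outer=20, n_perturb=5,
                beta=1.0, shrink=0.995):
    """Generic TV-superiorization skeleton (illustrative only).
    `feasibility_step` is a hypothetical single sweep of a
    feasibility-seeking projection method such as ART."""
    for _ in range(n_outer):
        for _ in range(n_perturb):       # perturbation phase
            g = tv_subgradient(x)
            norm = np.linalg.norm(g)
            if norm > 0:
                x = x - beta * g / norm  # small TV-reducing step
            beta *= shrink               # shrinking (summable) step sizes
        x = feasibility_step(x)          # feasibility-seeking phase
    return x
```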

Inexact Successive Quadratic Approximation for Regularized Optimization

Successive quadratic approximations, or second-order proximal methods, are useful for minimizing functions that are a sum of a smooth part and a convex, possibly nonsmooth part that promotes regularization. Most analyses of iteration complexity focus on the special case of the proximal gradient method, or accelerated variants thereof. There have been only a few studies of …
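
Concretely, each outer iteration of such a method solves, possibly inexactly, a quadratic-plus-regularizer subproblem (standard form; notation ours):

```latex
d_k \approx \arg\min_{d} \; \nabla f(x_k)^\top d
  + \tfrac{1}{2}\, d^\top H_k\, d + \psi(x_k + d),
\qquad x_{k+1} = x_k + d_k,
```

where $f$ is the smooth part, $\psi$ the convex regularizer, and $H_k$ a Hessian approximation; taking $H_k = \frac{1}{\alpha_k} I$ recovers the proximal gradient method.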

A Simple Nearly-Optimal Restart Scheme For Speeding-Up First-Order Methods

We present a simple scheme for restarting first-order methods for convex optimization problems. Restarts are based only on achieving specified decreases in objective values, with the specified amounts being the same for all optimization problems. Unlike existing restart schemes, ours makes no attempt to learn parameter values characterizing the structure of an optimization problem, …
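
The following Python sketch conveys the flavor of objective-decrease-driven restarting; the interface (`method_init`, `method_step`) and the schedule `deltas` are hypothetical stand-ins of ours, not the paper's actual scheme.

```python
def restarted(method_init, method_step, f, x0, deltas, max_inner=10_000):
    """Restart a first-order method whenever a prespecified,
    problem-independent decrease in objective value is achieved."""
    x = x0
    for delta in deltas:
        target = f(x) - delta      # required decrease for this stage
        state = method_init(x)     # restart: discard momentum, step sizes, ...
        for _ in range(max_inner):
            x, state = method_step(x, state)
            if f(x) <= target:     # specified decrease achieved,
                break              # so restart with the next target
    return x
```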

ADMM for Multiaffine Constrained Optimization

We propose an expansion of the scope of the alternating direction method of multipliers (ADMM). Specifically, we show that ADMM, when employed to solve problems with multiaffine constraints that satisfy certain easily verifiable assumptions, converges to the set of constrained stationary points if the penalty parameter in the augmented Lagrangian is sufficiently large. When the …
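
A representative instance (our illustration of "multiaffine", not the paper's general setting) has a constraint that is affine in each block when the others are fixed, say $B(x,y) + Az = b$ with $B$ biaffine; the augmented Lagrangian then reads

```latex
\mathcal{L}_\rho(x, y, z; \lambda) = f(x) + g(y) + h(z)
  + \langle \lambda,\, B(x,y) + Az - b \rangle
  + \tfrac{\rho}{2}\, \| B(x,y) + Az - b \|^2,
```

and ADMM minimizes $\mathcal{L}_\rho$ over $x$, $y$, and $z$ in turn (each block subproblem is convex whenever $f$, $g$, $h$ are, since the constraint is affine in the active block) before the dual update $\lambda \leftarrow \lambda + \rho\,(B(x,y) + Az - b)$.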

Algorithms and Convergence Results of Projection Methods for Inconsistent Feasibility Problems: A Review

The convex feasibility problem (CFP) is to find a feasible point in the intersection of finitely many closed convex sets. If the intersection is empty, then the CFP is inconsistent and a feasible point does not exist. Nevertheless, algorithmic research on inconsistent CFPs exists and is focused mainly on two directions. One is oriented …
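
A one-dimensional toy example shows the behavior that makes the inconsistent case interesting: alternating projections cannot find a common point, but they settle into the closest pair between the sets. This is a minimal illustration of the phenomenon the review surveys, not an algorithm taken from the paper.

```python
# Two disjoint closed convex sets on the real line:
#   A = (-inf, 0]  and  B = [1, +inf),  so the intersection is empty.
proj_A = lambda t: min(t, 0.0)
proj_B = lambda t: max(t, 1.0)

x = 5.0
for _ in range(50):
    x = proj_A(proj_B(x))   # one sweep of alternating projections

# No feasible point exists, so the iterates cannot reach one; instead
# they stabilize at the pair of points realizing the gap between A and B.
print(x, proj_B(x))         # -> 0.0 1.0
```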