A new customized proximal point algorithm for linearly constrained convex optimization

In this paper, we propose a new customized proximal point algorithm for the linearly constrained convex optimization problem, and further use it to solve the separable convex optimization problem with linear constraints. Unlike the existing customized proximal point algorithms, the proposed algorithm does not involve any relaxation step but still ensures convergence. … Read more
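
For orientation, a generic (uncustomized) proximal point sketch for this problem class, stated as standard background rather than the paper's customized scheme, is
\[
\min_{x} \; f(x) \quad \text{s.t.} \quad Ax = b, \qquad w^{k+1} = (I + c_k\, T)^{-1} w^{k}, \quad w = (x, \lambda),
\]
where $T$ is the maximal monotone KKT operator of the Lagrangian $f(x) - \lambda^{\top}(Ax - b)$; customized variants replace the identity $I$ by a problem-adapted positive definite metric.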

Accelerated first-order methods for large-scale convex minimization

This paper discusses several (sub)gradient methods attaining the optimal complexity for smooth problems with Lipschitz continuous gradients, nonsmooth problems with bounded variation of subgradients, and weakly smooth problems with Hölder continuous gradients. The proposed schemes are optimal for smooth strongly convex problems with Lipschitz continuous gradients and optimal up to a logarithmic factor for nonsmooth problems … Read more
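
As background, the classical optimal complexity benchmarks for these problem classes, stated generically rather than quoted from the truncated abstract, are
\[
O\!\Big(\sqrt{L/\epsilon}\Big), \qquad O\!\Big(1/\epsilon^{2}\Big), \qquad O\!\Big(\sqrt{L/\mu}\,\log(1/\epsilon)\Big),
\]
i.e. $O(\sqrt{L/\epsilon})$ gradient evaluations for smooth problems with $L$-Lipschitz gradients, $O(1/\epsilon^{2})$ subgradient evaluations for nonsmooth problems with bounded subgradients, and $O(\sqrt{L/\mu}\,\log(1/\epsilon))$ for $L$-smooth, $\mu$-strongly convex problems; the weakly smooth (Hölder) case interpolates between the first two.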

An optimal first order method based on optimal quadratic averaging

In a recent paper, Bubeck, Lee, and Singh introduced a new first order method for minimizing smooth strongly convex functions. Their geometric descent algorithm, largely inspired by the ellipsoid method, enjoys the optimal linear rate of convergence. Motivated by their work, we propose a close variant that iteratively maintains a quadratic global under-estimator of the … Read more
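
The quadratic under-estimator at the heart of this approach comes from strong convexity; as a standard reminder (not specific to the paper), a $\mu$-strongly convex $f$ satisfies
\[
f(y) \;\ge\; f(x) + \langle \nabla f(x),\, y - x \rangle + \frac{\mu}{2}\,\|y - x\|^{2} \qquad \text{for all } y,
\]
so every gradient evaluation yields a quadratic global lower bound on $f$; suitably combining these bounds tracks the optimal value, and the optimal linear rate for $L$-smooth, $\mu$-strongly convex problems contracts by a factor $1 - \sqrt{\mu/L}$ per iteration.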

On the Grassmann condition number

We give new insight into the Grassmann condition of the conic feasibility problem \[ x \in L \cap K \setminus\{0\}. \] Here $K\subseteq V$ is a regular convex cone and $L\subseteq V$ is a linear subspace of the finite dimensional Euclidean vector space $V$. The Grassmann condition of this problem is the reciprocal of the … Read more
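
For readers unfamiliar with the quantity, a standard way to state the definition (drawn from the general literature on condition numbers, not from the truncated abstract) is
\[
\mathcal{C}(L) \;=\; \frac{1}{d(L, \Sigma)},
\]
where $\Sigma$ denotes the set of subspaces for which the feasibility problem is ill-posed and $d$ is a metric on the Grassmann manifold of subspaces of $V$.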

Semi-Smooth Second-order Type Methods for Composite Convex Programs

The goal of this paper is to study approaches to bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: i) Many well-known operator splitting methods, such as forward-backward splitting (FBS) and Douglas-Rachford splitting (DRS), actually define a possibly semi-smooth and monotone fixed-point mapping; ii) The optimal solutions … Read more
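
To make observation i) concrete: for a composite problem $\min_x f(x) + g(x)$ with $f$ smooth, FBS is the fixed-point iteration
\[
x^{k+1} \;=\; T_{\mathrm{FBS}}(x^{k}) \;:=\; \mathrm{prox}_{t g}\big(x^{k} - t\,\nabla f(x^{k})\big),
\]
and $x^{\star}$ is optimal exactly when $x^{\star} = T_{\mathrm{FBS}}(x^{\star})$; when $\mathrm{prox}_{tg}$ is piecewise smooth (e.g. soft-thresholding for the $\ell_1$ norm), $T_{\mathrm{FBS}}$ is semi-smooth, which is what opens the door to second-order type methods. (A standard sketch; the paper's precise construction may differ.)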

Tight global linear convergence rate bounds for operator splitting methods

In this paper we establish necessary and sufficient conditions for linear convergence of operator splitting methods for a general class of convex optimization problems where the associated fixed-point operator is averaged. We also provide a tight bound on the achievable convergence rate. Most existing results establishing linear convergence in such methods require restrictive assumptions regarding … Read more
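
For context, an operator $T$ is $\alpha$-averaged when $T = (1-\alpha)\,\mathrm{Id} + \alpha R$ for some nonexpansive $R$ and $\alpha \in (0,1)$; the splitting method is then the fixed-point iteration
\[
x^{k+1} = T x^{k}, \qquad \|x^{k} - x^{\star}\| \;\le\; \rho^{k}\,\|x^{0} - x^{\star}\|,
\]
and linear convergence means the displayed geometric bound holds for some $\rho < 1$. (A standard setup, stated here for orientation.)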

Coordinate Friendly Structures, Algorithms and Applications

This paper focuses on coordinate update methods, which are useful for solving problems involving large or high-dimensional datasets. They decompose a problem into simple subproblems, each of which updates one variable, or a small block of variables, while fixing the others; a generic update of this kind is sketched below. These methods can deal with linear and nonlinear mappings, smooth and nonsmooth functions, as well as convex … Read more
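
As a generic illustration (one of several update forms such methods cover), a coordinate minimization step for $\min_x F(x_1, \ldots, x_n)$ picks an index $i$ and sets
\[
x_{i}^{k+1} \;=\; \arg\min_{\xi}\; F\big(x_{1}^{k}, \ldots, x_{i-1}^{k},\, \xi,\, x_{i+1}^{k}, \ldots, x_{n}^{k}\big), \qquad x_{j}^{k+1} = x_{j}^{k} \;\; (j \ne i);
\]
gradient and proximal variants replace the exact minimization with a cheap coordinate-wise step.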

Two-sided linear chance constraints and extensions

We examine the convexity and tractability of the two-sided linear chance constraint model under Gaussian uncertainty. We show that these constraints can be applied directly to model a larger class of nonlinear chance constraints as well as provide a reasonable approximation for a challenging class of quadratic chance constraints of direct interest for applications in … Read more
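
In generic (hypothetical) notation, a two-sided linear chance constraint under Gaussian uncertainty takes the form
\[
\mathbb{P}\big(\, l \;\le\; a(x)^{\top}\xi \;\le\; u \,\big) \;\ge\; 1 - \epsilon, \qquad \xi \sim \mathcal{N}(\mu, \Sigma),
\]
requiring a linear function of the Gaussian vector $\xi$ to land in an interval with high probability, in contrast to the standard one-sided case.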

Level-set methods for convex optimization

Convex optimization problems arising in applications often have favorable objective functions and complicated constraints, thereby precluding first-order methods from being immediately applicable. We describe an approach that exchanges the roles of the objective and constraint functions, and instead approximately solves a sequence of parametric level-set problems. A zero-finding procedure, based on inexact function evaluations and … Read more
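
A standard sketch of the level-set idea (the paper's details may differ): to solve $\min_x \{\, f(x) : g(x) \le 0 \,\}$ when $g$ is the complicated piece, exchange the roles and define the value function
\[
v(\tau) \;=\; \min_{x}\; \{\, g(x) \;:\; f(x) \le \tau \,\},
\]
then drive $\tau$ to the smallest value with $v(\tau) \le 0$ by root finding; each evaluation of $v$ is a level-set subproblem whose constraint involves only the favorable function $f$, so first-order methods apply.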

Iteration-complexity of a Rockafellar’s proximal method of multipliers for convex programming based on second-order approximations

This paper studies the iteration-complexity of a new primal-dual algorithm based on Rockafellar’s proximal method of multipliers (PMM) for solving smooth convex programming problems with inequality constraints. In each step, either a step of Rockafellar’s PMM for a second-order model of the problem is computed or a relaxed extragradient step is performed. The resulting algorithm … Read more
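
For background, the classical PMM step (Rockafellar, 1976) augments the augmented Lagrangian method with a primal proximal term; in generic notation for $\min_x \{\, f(x) : g(x) \le 0 \,\}$,
\[
x^{k+1} \,\approx\, \arg\min_{x}\, \Big\{\, \mathcal{L}_{c_k}(x, \lambda^{k}) + \tfrac{1}{2 c_k}\|x - x^{k}\|^{2} \,\Big\}, \qquad \lambda^{k+1} = \big[\lambda^{k} + c_k\, g(x^{k+1})\big]_{+},
\]
where $\mathcal{L}_{c}$ is the augmented Lagrangian; the algorithm studied here replaces the exact subproblem by a second-order model, interleaved with relaxed extragradient steps.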