Uniqueness of DRS as the 2 Operator Resolvent-Splitting and Impossibility of 3 Operator Resolvent-Splitting

Given the success of Douglas-Rachford splitting (DRS), it is natural to ask whether DRS can be generalized. Are there other 2 operator splittings? Can DRS be generalized to 3 operators? This work presents the answers: no and no. In a certain sense, DRS is the unique 2 operator resolvent-splitting, and generalizing DRS to 3 …
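
For reference, the textbook form of the DRS iteration for finding a zero of the sum of two maximal monotone operators $A$ and $B$, stated in terms of the resolvents $J_{\alpha A}=(I+\alpha A)^{-1}$ (this is the standard scheme, not a formula quoted from the paper), is
$$ x^{k} = J_{\alpha A}(z^{k}), \qquad z^{k+1} = z^{k} + J_{\alpha B}\big(2x^{k} - z^{k}\big) - x^{k}, $$
a fixed-point iteration that accesses each operator only through its resolvent; the uniqueness question concerns splittings of exactly this resolvent-only type.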

A Progressive Batching L-BFGS Method for Machine Learning

The standard L-BFGS method relies on gradient approximations that are not dominated by noise, so that search directions are descent directions, the line search is reliable, and quasi-Newton updating yields useful quadratic models of the objective function. All of this appears to call for a full batch approach, but since small batch sizes give rise …
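
As a rough illustration of the general idea (a minimal sketch only: the function `grad_i`, the doubling schedule, the fixed step length, and the batch sizes are assumptions of this example, not the paper's statistical tests or line search), one can pair the standard L-BFGS two-loop recursion with a gradient estimate whose sample size grows over the run:

```python
import numpy as np

def two_loop_recursion(grad, s_hist, y_hist):
    """Standard L-BFGS two-loop recursion: returns an approximation of
    H_k @ grad, where H_k is the implicit inverse-Hessian approximation."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):
        rho = 1.0 / y.dot(s)
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    if s_hist:  # scale by gamma_k = s^T y / y^T y (common initialization)
        s, y = s_hist[-1], y_hist[-1]
        q *= s.dot(y) / y.dot(y)
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
        rho = 1.0 / y.dot(s)
        b = rho * y.dot(q)
        q += (a - b) * s
    return q

def progressive_batching_lbfgs(grad_i, x0, n_samples, n_iters=100,
                               batch0=32, memory=10, step=1e-1):
    """Toy progressive-batching loop: grad_i(x, idx) returns the mean
    gradient over the samples indexed by idx.  Here the batch is simply
    doubled on a fixed schedule, as a stand-in for a variance-based test."""
    rng = np.random.default_rng(0)
    x, batch = x0.copy(), batch0
    s_hist, y_hist = [], []
    for k in range(n_iters):
        idx = rng.choice(n_samples, size=min(batch, n_samples), replace=False)
        g = grad_i(x, idx)
        p = -two_loop_recursion(g, s_hist, y_hist)
        x_new = x + step * p
        g_new = grad_i(x_new, idx)        # same batch for the curvature pair
        s, y = x_new - x, g_new - g
        if y.dot(s) > 1e-10:              # keep the pair only if curvature is positive
            s_hist.append(s); y_hist.append(y)
            if len(s_hist) > memory:
                s_hist.pop(0); y_hist.pop(0)
        x = x_new
        if k % 10 == 9:                   # placeholder for a batch-growth criterion
            batch *= 2
    return x
```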

Pointed Closed Convex Sets are the Intersection of All Rational Supporting Closed Halfspaces

We prove that every pointed closed convex set in $\mathbb{R}^n$ is the intersection of all the rational closed halfspaces that contain it. This generalizes a previous result by the authors for compact convex sets. Citation: arXiv:1802.03296, February 2018.
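
In symbols, the claim restated in our own notation is that for a pointed closed convex set $C \subseteq \mathbb{R}^n$,
$$ C \;=\; \bigcap \Big\{\, \{x \in \mathbb{R}^n : a^{\top}x \le b\} \;:\; a \in \mathbb{Q}^n,\ b \in \mathbb{Q},\ C \subseteq \{x : a^{\top}x \le b\} \,\Big\}, $$
i.e., halfspaces with rational data alone suffice to cut out $C$.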

Cut-Pursuit Algorithm for Regularizing Nonsmooth Functionals with Graph Total Variation

We present an extension of the cut-pursuit algorithm, introduced by Landrieu and Obozinski (2017), to the graph total-variation regularization of functions with a separable nondifferentiable part. We propose a modified algorithmic scheme as well as adapted proofs of convergence. We also present a heuristic approach for handling the cases in which the values associated to …
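
A representative instance of this problem class (written in our notation; the exact setting and edge weights are the paper's) is
$$ \min_{x\in\mathbb{R}^{V}}\; f(x) \;+\; \sum_{v\in V} g_v(x_v) \;+\; \lambda \sum_{(u,v)\in E} w_{uv}\,\lvert x_u - x_v\rvert, $$
where $f$ is smooth, the $g_v$ form the separable nondifferentiable part, and the last term is the total variation on the graph $G=(V,E)$.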

Stochastic subgradient method converges at the rate $O(k^{-1/4})$ on weakly convex functions

We prove that the projected stochastic subgradient method, applied to a weakly convex problem, drives the gradient of the Moreau envelope to zero at the rate $O(k^{-1/4})$.
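
For context, the Moreau envelope of a $\rho$-weakly convex function $\varphi$ with parameter $\lambda \in (0,\rho^{-1})$ is
$$ \varphi_{\lambda}(x) \;=\; \min_{y}\Big\{ \varphi(y) + \tfrac{1}{2\lambda}\lVert y - x\rVert^{2} \Big\}, $$
and the stated rate means, up to constants and the precise stepsize choice given in the paper, that after $K$ iterations
$$ \min_{k \le K}\, \mathbb{E}\,\lVert \nabla \varphi_{\lambda}(x_k)\rVert \;=\; O\!\big(K^{-1/4}\big). $$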

The condition of a function relative to a polytope

The condition number of a smooth convex function, namely the ratio of its smoothness to strong convexity constants, is closely tied to fundamental properties of the function. In particular, the condition number of a quadratic convex function is precisely the square of the diameter-to-width ratio of a canonical ellipsoid associated to the function. Furthermore, the …
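
As a concrete check of the quadratic case mentioned above (a standard computation, not reproduced from the paper): for $f(x) = \tfrac12 x^{\top} A x$ with $A \succ 0$, the condition number is $\kappa = \lambda_{\max}(A)/\lambda_{\min}(A)$, while the ellipsoid $E = \{x : x^{\top} A x \le 1\}$ has diameter $2/\sqrt{\lambda_{\min}(A)}$ and width $2/\sqrt{\lambda_{\max}(A)}$, so
$$ \left(\frac{\operatorname{diam}(E)}{\operatorname{width}(E)}\right)^{2} \;=\; \frac{\lambda_{\max}(A)}{\lambda_{\min}(A)} \;=\; \kappa. $$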

On self-concordant barriers for generalized power cones

In the study of interior-point methods for nonsymmetric conic optimization and their applications, Nesterov introduced the power cone, together with a 4-self-concordant barrier for it. In his PhD thesis, Chares found an improved 3-self-concordant barrier for the power cone. In addition, he introduced the generalized power cone, and conjectured a nearly optimal self-concordant barrier for …
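
For readers unfamiliar with the objects involved, the generalized power cone is usually written (this is our statement of the standard definition; the paper's parametrization may differ) as
$$ \mathcal{P}_{\alpha}^{n,m} \;=\; \Big\{ (x,z) \in \mathbb{R}^{n}_{+} \times \mathbb{R}^{m} \;:\; \prod_{i=1}^{n} x_i^{\alpha_i} \ \ge\ \lVert z \rVert_2 \Big\}, \qquad \alpha_i > 0,\ \ \textstyle\sum_{i} \alpha_i = 1, $$
which reduces to Nesterov's three-dimensional power cone when $n = 2$ and $m = 1$.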

An Alternating Minimization Method for Matrix Completion Problem

In this paper, we focus on solving the matrix completion problem arising from applications in the fields of information theory, statistics, engineering, etc. However, the matrix completion problem involves nonconvex rank constraints, which make this type of problem difficult to handle. Traditional approaches use a nuclear norm surrogate to replace the rank constraints. The relaxed model …
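
Written out (in the standard formulation, not quoted from the paper), the exact problem and its nuclear-norm relaxation are
$$ \min_{X\in\mathbb{R}^{m\times n}} \operatorname{rank}(X)\ \ \text{s.t.}\ \ X_{ij} = M_{ij},\ (i,j)\in\Omega, \qquad\text{versus}\qquad \min_{X} \lVert X\rVert_{*}\ \ \text{s.t.}\ \ X_{ij} = M_{ij},\ (i,j)\in\Omega, $$
where $\Omega$ indexes the observed entries of $M$ and the nuclear norm $\lVert\cdot\rVert_{*}$ is the usual convex surrogate for the rank.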

On Quasi-Newton Forward–Backward Splitting: Proximal Calculus and Convergence

We introduce a framework for quasi-Newton forward–backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal ± rank-r symmetric positive definite matrices. This special type of metric allows for a highly efficient evaluation of the proximal mapping. The key to this efficiency is a general proximal calculus in the new metric. By using …
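
In this setting the proximal mapping is taken in the norm induced by the metric; a minimal statement of the objects involved (our notation) is
$$ V \;=\; D \pm Q Q^{\top}, \qquad \operatorname{prox}^{V}_{g}(x) \;=\; \arg\min_{y}\Big\{ g(y) + \tfrac12 \lVert y - x\rVert_{V}^{2} \Big\}, $$
with $D$ diagonal positive definite and $Q \in \mathbb{R}^{n\times r}$; the proximal calculus referred to above provides rules for evaluating this scaled mapping efficiently for metrics of exactly this form.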

Subsampled Inexact Newton methods for minimizing large sums of convex functions

This paper deals with the minimization of a large sum of convex functions by Inexact Newton (IN) methods employing subsampled Hessian approximations. The Conjugate Gradient method is used to compute the inexact Newton step, and global convergence is enforced by a nonmonotone line search procedure. The aim is to obtain methods with affordable costs and fast …
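
Concretely, the setting is (our notation, following the abstract)
$$ \min_{x}\ f(x) = \frac{1}{N}\sum_{i=1}^{N} f_i(x), \qquad \nabla^2 f_{S_k}(x) = \frac{1}{\lvert S_k\rvert}\sum_{i\in S_k}\nabla^2 f_i(x), $$
and each inexact Newton step $p_k$ is obtained by running the Conjugate Gradient method on $\nabla^2 f_{S_k}(x_k)\, p = -\nabla f(x_k)$ (or a sampled estimate of the gradient) until a residual tolerance is met, after which the nonmonotone line search selects the step length.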