A Linearly Convergent Linear-Time First-Order Algorithm for Support Vector Classification with a Core Set Result

We present a simple, first-order approximation algorithm for the support vector classification problem. Given a pair of linearly separable data sets and $\epsilon \in (0,1)$, the proposed algorithm computes a separating hyperplane whose margin is within a factor of $(1-\epsilon)$ of that of the maximum-margin separating hyperplane. We discuss how our algorithm can be extended …
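For reference, the problem being approximated can be written in the standard max-margin form (the notation below is ours, not necessarily the paper's): for labeled points $(x_i, y_i)$ with $y_i \in \{-1,+1\}$,
$$ \gamma^* = \max_{w,b} \; \min_i \; \frac{y_i (w^T x_i + b)}{\|w\|}, $$
and a $(1-\epsilon)$-approximate solution is any hyperplane $(w,b)$ that separates the data with margin at least $(1-\epsilon)\gamma^*$.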

A Redistributed Proximal Bundle Method for Nonconvex Optimization

Proximal bundle methods have been shown to be highly successful optimization methods for unconstrained convex problems with discontinuous first derivatives. This naturally leads to the question of whether proximal variants of bundle methods can be extended to a nonconvex setting. This work proposes an approach based on generating cutting-plane models, not of the objective function …
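Reading "cutting-plane models of a local convexification" literally, one plausible sketch of such a model (the exact form and the parameter names $\eta_k$, $t_k$ are our assumptions) augments each linearization with a quadratic term before building the usual bundle model:
$$ m_k(x) = \max_{i \in B_k} \Big\{ f(x_i) + \tfrac{\eta_k}{2}\|x_i - \hat{x}_k\|^2 + \big(g_i + \eta_k (x_i - \hat{x}_k)\big)^T (x - x_i) \Big\}, $$
$$ x_{k+1} = \arg\min_x \; m_k(x) + \tfrac{1}{2 t_k}\|x - \hat{x}_k\|^2, $$
where $\hat{x}_k$ is the current stability center, $B_k$ the bundle, $g_i$ a subgradient-like vector at $x_i$, and $\eta_k$ a convexification parameter adjusted (redistributed) as the iteration proceeds.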

An accelerated proximal gradient algorithm for nuclear norm regularized least squares problems

The affine rank minimization problem, which consists of finding a matrix of minimum rank subject to linear equality constraints, has been proposed in many areas of engineering and science. A specific rank minimization problem is the matrix completion problem, in which we wish to recover a (low-rank) data matrix from incomplete samples of its entries. …
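For context, the title's problem is
$$ \min_{X \in \mathbb{R}^{m \times n}} \; \tfrac{1}{2}\|\mathcal{A}(X) - b\|_2^2 + \mu \|X\|_*, $$
where $\mathcal{A}$ is a linear map and $\|X\|_*$ the nuclear norm (sum of singular values). What makes proximal gradient methods attractive here is that the proximal map of the nuclear norm is available in closed form via singular value thresholding: if $Y = U \Sigma V^T$ with singular values $\sigma_i$, then
$$ \mathrm{prox}_{\tau \|\cdot\|_*}(Y) = U\, \mathrm{diag}\big(\max(\sigma_i - \tau, 0)\big)\, V^T. $$
These are standard facts about this model class; the notation is ours.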

Robust Linear Optimization With Recourse

We propose an approach to linear optimization with recourse that does not involve a probabilistic description of the uncertainty, and allows the decision-maker to adjust the degree of robustness of the model while preserving its linear properties. We model random variables as uncertain parameters belonging to a polyhedral uncertainty set and minimize the sum of …
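A generic way to write such a two-stage model, consistent with the description above (the symbols are illustrative, not taken from the paper), is
$$ \min_{x} \; c^T x + \max_{\zeta \in \Xi} \; \min_{y} \big\{ q^T y \;:\; T(\zeta)\, x + W y \ge h(\zeta) \big\}, $$
where $x$ is the here-and-now decision, $y$ the recourse decision taken after the uncertain parameter $\zeta$ is revealed, and $\Xi$ a polyhedral uncertainty set whose size controls the degree of robustness.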

Bundle Methods for Convex Minimization with Partially Inexact Oracles

Recently, the proximal bundle method for minimizing a convex function has been extended to an inexact oracle that delivers function and subgradient values of unknown accuracy. We adapt this method to a partially inexact oracle that becomes exact only when an objective target level for a descent step is met. In Lagrangian relaxation, such oracles …
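One plausible formalization of such a partially inexact oracle (our phrasing, offered as an illustration): queried at $x$ with a descent target level $\ell$, it returns $f_x$ and $g$ such that
$$ f_x \le f(x) \quad \text{and} \quad f(z) \ge f_x + g^T (z - x) - \varepsilon \;\; \text{for all } z, $$
for some unknown $\varepsilon \ge 0$, with the additional guarantee that the answer is exact ($f_x = f(x)$, $\varepsilon = 0$) whenever $f_x \le \ell$. This is the behavior of a subproblem solver that is allowed to stop early once a prescribed bound is achieved.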

On the complexity of the hybrid proximal extragradient method for the iterates and the ergodic mean

In this paper we analyze the iteration-complexity of the hybrid proximal extragradient (HPE) method, recently proposed by Solodov and Svaiter, for finding a zero of a maximal monotone operator. One of the key points of our analysis is the use of new termination criteria based on the $\varepsilon$-enlargement of a maximal monotone operator. The advantage …
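For concreteness, the HPE step is usually stated as follows (standard HPE notation; we write $T^{[\varepsilon]}$ for the $\varepsilon$-enlargement of $T$). To solve $0 \in T(x)$, given $x_{k-1}$, choose $\lambda_k > 0$ and find $(\tilde{x}_k, v_k, \varepsilon_k)$ with $v_k \in T^{[\varepsilon_k]}(\tilde{x}_k)$ satisfying the relative error criterion
$$ \|\lambda_k v_k + \tilde{x}_k - x_{k-1}\|^2 + 2 \lambda_k \varepsilon_k \le \sigma^2 \|\tilde{x}_k - x_{k-1}\|^2, \qquad \sigma \in [0,1), $$
then set $x_k = x_{k-1} - \lambda_k v_k$. The relative-error tolerance $\sigma$ is what allows the proximal subproblem to be solved only approximately.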

A fast TVL1-L2 minimization algorithm for signal reconstruction from partial Fourier data

Recent compressive sensing results show that it is possible to accurately reconstruct certain compressible signals from relatively few linear measurements via solving nonsmooth convex optimization problems. In this paper, we propose a simple and fast algorithm for signal reconstruction from partial Fourier data. The algorithm minimizes the sum of three terms corresponding to total variation, …
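Models of this TVL1-L2 type are commonly written as (notation ours)
$$ \min_{u} \; \sum_{i} \|D_i u\|_2 + \lambda \|\Psi^T u\|_1 + \tfrac{\mu}{2}\, \|F_p u - f_p\|_2^2, $$
where $D_i u$ is the discrete gradient of the signal $u$ at position $i$ (so the first sum is total variation), $\Psi$ is a sparsifying basis such as a wavelet transform, $F_p$ is the partial Fourier sampling operator, and $f_p$ the observed data; $\lambda$ and $\mu$ balance the three terms.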

A Fast Algorithm for Sparse Reconstruction based on Shrinkage, Subspace Optimization and Continuation

We propose a fast algorithm for solving the l1-regularized least squares problem for recovering sparse solutions to an underdetermined system of linear equations $Ax = b$. The algorithm is divided into two stages that are performed repeatedly. In the first stage a first-order iterative method called “shrinkage” yields an estimate of the subset of components …
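The “shrinkage” operation in the first stage refers to soft-thresholding. The sketch below shows a generic iterative shrinkage loop for $\min_x \mu\|x\|_1 + \tfrac{1}{2}\|Ax-b\|_2^2$; the step size, iteration count, and function names are our assumptions, and the paper's actual algorithm adds subspace optimization and continuation on top of this stage.

    import numpy as np

    def shrink(v, t):
        # Componentwise soft-thresholding: pulls each entry toward zero by t.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def shrinkage_stage(A, b, mu, iters=200):
        # Generic iterative shrinkage for  min_x  mu*||x||_1 + 0.5*||Ax - b||^2.
        # Step size 1/||A||_2^2 ensures descent for this plain iteration
        # (an illustrative choice, not necessarily the paper's).
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = shrink(x - tau * A.T @ (A @ x - b), tau * mu)
        return x  # its support estimates the active subset for stage two

The practical point, as the abstract indicates, is that the nonzero pattern of the shrinkage iterates typically stabilizes well before the values themselves converge, which is what the second-stage subspace optimization exploits.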

A quasisecant method for minimizing nonsmooth functions

In this paper a new algorithm to locally minimize nonsmooth, nonconvex functions is developed. We introduce the notions of secants and quasisecants for nonsmooth functions, and design a minimization algorithm that uses quasisecants to find descent directions of locally Lipschitz functions. We prove that this algorithm converges …
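The abstract does not spell out how quasisecants yield descent directions; the usual mechanism in methods of this family (as in $\varepsilon$-subgradient and bundle-type algorithms, and our assumption here) is to take the least-norm element of the convex hull of a collection of quasisecants $Q_h(x)$ computed at $x$:
$$ \bar{v} = \operatorname*{arg\,min}_{v \in \mathrm{conv}\, Q_h(x)} \|v\|, \qquad d = -\bar{v}/\|\bar{v}\| \;\; \text{if } \|\bar{v}\| > \delta, $$
with $x$ accepted as an approximate stationary point when $\|\bar{v}\| \le \delta$.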

A convex polynomial that is not sos-convex

A multivariate polynomial $p(x)=p(x_1,\dots,x_n)$ is sos-convex if its Hessian $H(x)$ can be factored as $H(x)= M^T(x) M(x)$ with a possibly nonsquare polynomial matrix $M(x)$. It is easy to see that sos-convexity is a sufficient condition for convexity of $p(x)$. Moreover, the problem of deciding sos-convexity of a polynomial can be cast as the feasibility of …
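The feasibility problem the last sentence refers to is a semidefinite program, by a standard argument: $H(x) = M^T(x) M(x)$ holds for some polynomial matrix $M(x)$ exactly when the scalar polynomial $y^T H(x)\, y$ is a sum of squares in the combined variables $(x, y)$, and a polynomial $q$ is a sum of squares iff
$$ q = z^T Q\, z \quad \text{for some } Q \succeq 0, $$
where $z$ is a vector of monomials; the constraint $Q \succeq 0$, together with the linear equations that match coefficients on both sides, is an SDP feasibility problem.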