An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP

The accelerated proximal gradient (APG) method, first proposed by Nesterov, later refined by Beck and Teboulle, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving some classes of large-scale structured (possibly nonsmooth) convex optimization problems, including nuclear norm minimization problems in matrix completion and $l_1$ minimization … Read more
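
Since the abstract refers to the APG scheme and its use in $l_1$ minimization, here is a minimal sketch of a generic APG (FISTA-type) iteration applied to an $l_1$-regularized least-squares problem. The data and parameter names below are illustrative placeholders, not the SDP setting of the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximity operator of tau * ||.||_1 (applied entrywise).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def apg_l1(A, b, lam, n_iter=500):
    # Accelerated proximal gradient (FISTA-type) iteration for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                     # forward (gradient) step
        x_new = soft_threshold(y - grad / L, lam / L)  # backward (proximal) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Illustrative data (not from the paper).
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 200))
b = A @ (rng.standard_normal(200) * (rng.random(200) < 0.05))
x_hat = apg_l1(A, b, lam=0.1)
```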

Weak and Strong Convergence of Algorithms for the Split Common Null Point Problem

We introduce and study the Split Common Null Point Problem (SCNPP) for set-valued maximal monotone mappings in Hilbert space. This problem generalizes our Split Variational Inequality Problem (SVIP) [Y. Censor, A. Gibali and S. Reich, Algorithms for the split variational inequality problem, Numerical Algorithms, accepted for publication, DOI 10.1007/s11075-011-9490-5]. The SCNPP with only two set-valued … Read more

A relaxed customized proximal point algorithm for separable convex programming

The alternating direction method (ADM) is a classical method for solving a linearly constrained separable convex programming problem (the primal problem), and it is well known that ADM is essentially the application of a concrete form of the proximal point algorithm (PPA) (more precisely, the Douglas-Rachford splitting method) to the corresponding dual problem. This paper shows that an … Read more
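
For readers unfamiliar with ADM, the following is a minimal sketch of the classical iteration on a simple separable instance ($f(x) = \frac{1}{2}\|Mx-b\|^2$, $g(z) = \lambda\|z\|_1$, coupling $x - z = 0$). It illustrates the scheme being reinterpreted, not the relaxed customized PPA proposed in the paper, and all data are illustrative.

```python
import numpy as np

def adm_lasso(M, b, lam, rho=1.0, n_iter=300):
    # Classical alternating direction method (ADM) for the separable problem
    #   min_{x,z} 0.5 * ||M x - b||^2 + lam * ||z||_1   s.t.  x - z = 0.
    n = M.shape[1]
    MtM, Mtb = M.T @ M, M.T @ b
    x = z = u = np.zeros(n)                 # u is the scaled dual variable
    chol = np.linalg.cholesky(MtM + rho * np.eye(n))
    for _ in range(n_iter):
        # x-step: minimize the augmented Lagrangian in x (a linear system).
        rhs = Mtb + rho * (z - u)
        x = np.linalg.solve(chol.T, np.linalg.solve(chol, rhs))
        # z-step: proximal map of (lam/rho)*||.||_1, i.e. soft thresholding.
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual update on the multiplier of the constraint x - z = 0.
        u = u + x - z
    return x, z

# Illustrative data (not from the paper).
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 80))
b = rng.standard_normal(30)
x_hat, z_hat = adm_lasso(M, b, lam=0.1)
```

The dual variable is kept in scaled form, which reduces the multiplier update to a single addition.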

Generalized Forward-Backward Splitting

This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$’s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth … Read more
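
As a point of reference for the splitting being generalized, here is the classical forward-backward iteration for the $n = 1$ case, with hypothetical oracles for $\nabla F$ and the proximity operator of $G$; the generalized scheme of the paper, which handles several $G_i$, is not reproduced here.

```python
import numpy as np

def forward_backward(grad_F, prox_G, x0, step, n_iter=200):
    # Classical forward-backward splitting for min_x F(x) + G(x):
    # a forward (explicit gradient) step on F followed by a backward
    # (proximal) step on G.  This is the n = 1 baseline that the
    # generalized scheme extends to sums of several simple G_i.
    x = x0.copy()
    for _ in range(n_iter):
        x = prox_G(x - step * grad_F(x), step)
    return x

# Illustrative instance: F(x) = 0.5*||x - c||^2, G(x) = ||x||_1.
c = np.array([1.5, -0.2, 0.05])
grad_F = lambda x: x - c
prox_G = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
x_hat = forward_backward(grad_F, prox_G, np.zeros(3), step=1.0)
```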

On smooth relaxations of obstacle sets

We present and discuss a method to relax sets described by finitely many smooth convex inequality constraints by the level set of a single smooth convex inequality constraint. Based on error bounds and Lipschitz continuity, special attention is paid to the maximal approximation error and a guaranteed safety margin. Our results allow one to safely avoid … Read more
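
As a generic illustration only (not necessarily the construction developed in the paper), finitely many smooth convex constraints $g_i(x) \le 0$ can be aggregated into a single smooth convex constraint via a log-sum-exp term, with an approximation error controlled by the smoothing parameter:

```python
import numpy as np
from scipy.special import logsumexp

# Generic smoothing of finitely many convex constraints g_i(x) <= 0 into a
# single smooth convex constraint (illustration only).  With
#   G_tau(x) = tau * log( sum_i exp(g_i(x) / tau) ),
# one has  max_i g_i(x) <= G_tau(x) <= max_i g_i(x) + tau*log(m),
# so the approximation error is controlled by tau and the number m of
# constraints, and shifting the level by tau*log(m) yields a guaranteed
# one-sided inclusion between the original set and the relaxed one.

def G_tau(g_values, tau):
    return tau * logsumexp(np.asarray(g_values) / tau)

# Numerical check of the sandwich bound at a hypothetical point x.
g_vals = np.array([-1.0, -0.3, 0.2])        # values g_i(x)
tau, m = 0.05, len(g_vals)
assert g_vals.max() <= G_tau(g_vals, tau) <= g_vals.max() + tau * np.log(m)
```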

Accelerated and Inexact forward-backward algorithms

We propose a convergence analysis of accelerated forward-backward splitting methods for minimizing composite functions, when the proximity operator is not available in closed form, and is thus computed up to a certain precision. We prove that the $1/k^2$ convergence rate for the function values can be achieved if the admissible errors are of a certain … Read more
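
A minimal sketch of the setting, under illustrative assumptions: the nonsmooth term is $\lambda\|Dx\|_1$ for a general matrix $D$, so its proximity operator has no closed form and is approximated by inner dual iterations whose count grows with the outer index $k$. The precise admissible-error conditions are the subject of the paper; the schedule below is only a placeholder.

```python
import numpy as np

def approx_prox_l1_of_D(v, D, c, n_inner):
    # Approximate prox of c*||D x||_1 at v: no closed form exists for a
    # general D, so we run n_inner projected-gradient steps on the dual.
    p = np.zeros(D.shape[0])
    tau = 1.0 / (c ** 2 * np.linalg.norm(D, 2) ** 2)
    for _ in range(n_inner):
        p = np.clip(p + tau * c * (D @ (v - c * (D.T @ p))), -1.0, 1.0)
    return v - c * (D.T @ p)

def inexact_accelerated_fb(A, b, D, lam, n_outer=100):
    # Accelerated forward-backward for min_x 0.5*||A x - b||^2 + lam*||D x||_1,
    # with the proximity operator computed only up to a precision that is
    # tightened as k grows (here: more inner iterations per outer step).
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
    for k in range(1, n_outer + 1):
        grad = A.T @ (A @ y - b)
        x_new = approx_prox_l1_of_D(y - grad / L, D, lam / L, n_inner=5 * k)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```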

Inexact and accelerated proximal point algorithms

We present inexact accelerated proximal point algorithms for minimizing a proper lower semicontinuous and convex function. We carry out a convergence analysis under different types of errors in the evaluation of the proximity operator, and we provide corresponding convergence rates for the objective function values. The proof relies on a generalization of the strategy … Read more
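
To make the setting concrete, here is a hedged sketch of an inexact proximal point iteration in which each proximal subproblem is handed to an inner solver with a tolerance that tightens with the iteration count; the objective and the error schedule are illustrative, not those analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def logistic_loss(x, A, y):
    # Smooth convex function whose proximity operator has no closed form.
    return np.sum(np.logaddexp(0.0, -y * (A @ x)))

def inexact_proximal_point(A, y, lam=1.0, n_iter=50):
    # Inexact proximal point iteration
    #   x_{k+1} ~ argmin_u f(u) + ||u - x_k||^2 / (2*lam),
    # where each subproblem is solved only approximately by an inner solver
    # whose tolerance decreases with k (one way to realize controlled errors).
    x = np.zeros(A.shape[1])
    for k in range(1, n_iter + 1):
        obj = lambda u: logistic_loss(u, A, y) + np.sum((u - x) ** 2) / (2.0 * lam)
        res = minimize(obj, x, method="L-BFGS-B", options={"gtol": 1.0 / k ** 2})
        x = res.x
    return x

# Illustrative data (not from the paper).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 5))
y = np.sign(rng.standard_normal(40))
x_hat = inexact_proximal_point(A, y)
```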

An Alternating Direction Method for Total Variation Denoising

We consider the image denoising problem using total variation (TV) regularization. This problem can be computationally challenging to solve due to the non-differentiability and non-linearity of the regularization term. We propose an alternating direction augmented Lagrangian (ADAL) method, based on a new variable splitting approach that results in subproblems that can be solved efficiently and … Read more
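
For orientation, the sketch below applies a standard alternating direction (augmented Lagrangian) scheme to 1-D TV denoising with the usual splitting $z = Dx$; it is meant only to illustrate what variable splitting buys here, not the new splitting proposed in the paper.

```python
import numpy as np

def tv_denoise_admm(b, lam, rho=1.0, n_iter=300):
    # Alternating direction (augmented Lagrangian) method for 1-D TV denoising
    #   min_x 0.5*||x - b||^2 + lam*||D x||_1,
    # using the standard splitting z = D x (illustration only).
    n = len(b)
    D = np.diff(np.eye(n), axis=0)            # forward-difference operator
    x = b.copy(); z = D @ x; u = np.zeros(n - 1)
    M = np.eye(n) + rho * (D.T @ D)           # x-subproblem normal matrix
    for _ in range(n_iter):
        x = np.linalg.solve(M, b + rho * (D.T @ (z - u)))        # quadratic x-step
        v = D @ x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # shrinkage z-step
        u = u + D @ x - z                                        # multiplier update
    return x

# Illustrative noisy piecewise-constant signal.
rng = np.random.default_rng(2)
signal = np.repeat([0.0, 1.0, 0.3], 50)
x_hat = tv_denoise_admm(signal + 0.1 * rng.standard_normal(150), lam=0.5)
```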

Distributed Basis Pursuit

We propose a distributed algorithm for solving the optimization problem Basis Pursuit (BP). BP finds the least L1-norm solution of the underdetermined linear system Ax = b and is used, for example, in compressed sensing for reconstruction. Our algorithm solves BP on a distributed platform such as a sensor network, and is designed to minimize … Read more
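
To illustrate what BP computes (independently of the distributed platform, which is the paper's contribution), here is a small centralized reference solver based on the standard LP reformulation; the data are synthetic.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    # Centralized reference solver for Basis Pursuit (BP):
    #   min ||x||_1  s.t.  A x = b,
    # via the standard LP reformulation x = u - v with u, v >= 0:
    #   min 1^T (u + v)  s.t.  A u - A v = b.
    m, n = A.shape
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

# Illustrative underdetermined system with a sparse solution.
rng = np.random.default_rng(3)
A = rng.standard_normal((20, 80))
x_true = np.zeros(80); x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
x_hat = basis_pursuit(A, A @ x_true)
```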

A proximal point algorithm for sequential feature extraction applications

We propose a proximal point algorithm to solve the LAROS problem, that is, the problem of finding a “large approximately rank-one submatrix”. The LAROS problem is used to sequentially extract features from data. We also develop a new stopping criterion for the proximal point algorithm, which is based on the duality conditions of $\epsilon$-optimal solutions of … Read more