Generalized Forward-Backward Splitting

This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$’s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth … Read more
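
As a concrete illustration (not the paper's own pseudocode), a minimal sketch of one such splitting loop, assuming a gradient oracle grad_F and a list proxes of proximity operators, both names ours:

```python
import numpy as np

def generalized_forward_backward(grad_F, proxes, x0, gamma, n_iter=500):
    """Hedged sketch of a generalized forward-backward loop.

    grad_F : gradient of the smooth term F (L-Lipschitz; take gamma < 2/L)
    proxes : callables p(v, step) returning prox_{step * G_i}(v)
    """
    n = len(proxes)
    w = np.full(n, 1.0 / n)            # uniform weights summing to one
    z = [x0.copy() for _ in range(n)]  # one auxiliary variable per G_i
    x = x0.copy()
    for _ in range(n_iter):
        g = grad_F(x)
        for i, prox in enumerate(proxes):
            # update the i-th auxiliary variable against the consensus point
            z[i] = z[i] + prox(2 * x - z[i] - gamma * g, gamma / w[i]) - x
        x = sum(wi * zi for wi, zi in zip(w, z))  # weighted consensus step
    return x
```

With $n = 1$ the loop collapses to the classical forward-backward iteration, which is the regime the quoted limitation refers to.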

Accelerated and Inexact forward-backward algorithms

We propose a convergence analysis of accelerated forward-backward splitting methods for minimizing composite functions, when the proximity operator is not available in closed form, and is thus computed up to a certain precision. We prove that the $1/k^2$ convergence rate for the function values can be achieved if the admissible errors are of a certain … Read more
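
For concreteness, a hedged sketch of such an accelerated loop, where approx_prox stands for any inner routine that returns the prox up to a tolerance; the error schedule below is purely illustrative, since characterizing the admissible errors is exactly the paper's subject:

```python
import numpy as np

def inexact_fista(grad_f, approx_prox, x0, step, n_iter=300):
    """FISTA-style iteration with an inexactly computed prox.

    approx_prox(v, step, tol): any routine returning prox_{step*g}(v)
    up to precision tol (an assumed interface, not the paper's).
    """
    x_old = x0.copy()
    y, t = x0.copy(), 1.0
    for k in range(1, n_iter + 1):
        tol = 1.0 / k**4                            # illustrative error decay
        x = approx_prox(y - step * grad_f(y), step, tol)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2    # Nesterov momentum sequence
        y = x + ((t - 1) / t_new) * (x - x_old)     # extrapolation step
        x_old, t = x, t_new
    return x_old
```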

Inexact and accelerated proximal point algorithms

We present inexact accelerated proximal point algorithms for minimizing a proper lower semicontinuous and convex function. We carry out a convergence analysis under different types of errors in the evaluation of the proximity operator, and we provide corresponding convergence rates for the objective function values. The proof relies on a generalization of the strategy … Read more
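
One standard way to formalize such errors (the notation here is ours, not necessarily the paper's) is through the $\varepsilon$-subdifferential: an inexact accelerated step can be written as
$$\frac{y_k - x_{k+1}}{\lambda_k} \in \partial_{\varepsilon_k} f(x_{k+1}), \qquad y_{k+1} = x_{k+1} + \frac{t_k - 1}{t_{k+1}}\,(x_{k+1} - x_k),$$
where the error sequence $(\varepsilon_k)$ must decay fast enough for the accelerated rate on the objective values to survive.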

A proximal point algorithm for sequential feature extraction applications

We propose a proximal point algorithm to solve the LAROS problem, that is, the problem of finding a “large approximately rank-one submatrix”. The LAROS problem is used to sequentially extract features from data. We also develop a new stopping criterion for the proximal point algorithm, which is based on the duality conditions of $\epsilon$-optimal solutions of … Read more

Solving Basis Pursuit: Heuristic Optimality Check and Solver Comparison

The problem of finding a minimum $\ell^1$-norm solution to an underdetermined linear system is an important problem in compressed sensing, where it is also known as basis pursuit. We propose a heuristic optimality check as a general tool for $\ell^1$-minimization, which often allows for early termination by “guessing” a primal-dual optimal pair based on an … Read more
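
A hedged sketch of such a check, with thresholds of our own choosing: guess the support from the large entries of an approximate iterate $x$, fit a dual candidate $y$ by least squares, and test dual feasibility together with the duality gap (the dual of basis pursuit is $\max\, b^\top y$ subject to $\|A^\top y\|_\infty \le 1$):

```python
import numpy as np

def heuristic_optimality_check(A, b, x, tol=1e-6):
    """Guess a primal-dual optimal pair for min ||x||_1 s.t. Ax = b."""
    S = np.abs(x) > tol * np.max(np.abs(x))            # guessed support
    y, *_ = np.linalg.lstsq(A[:, S].T, np.sign(x[S]), rcond=None)
    dual_feasible = np.max(np.abs(A.T @ y)) <= 1 + tol
    gap = abs(np.sum(np.abs(x)) - b @ y)               # | ||x||_1 - b^T y |
    return dual_feasible and gap <= tol * max(1.0, np.sum(np.abs(x)))
```

If the check succeeds, the solver can terminate early with $x$ certified (heuristically) as optimal; if it fails, nothing is lost but the cost of one least-squares solve.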

Approximation of rank function and its application to the nearest low-rank correlation matrix

The rank function $\mathrm{rank}(\cdot)$ is neither continuous nor convex, which makes rank minimization problems difficult to solve. In this paper, we provide a unified framework for constructing approximation functions of $\mathrm{rank}(\cdot)$ and study their favorable properties. In particular, with two families of approximation functions, we propose a convex relaxation method for the … Read more
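
For intuition, one textbook member of such a family (not necessarily one of the paper's two) replaces the 0/1 count of nonzero singular values by the concave surrogate $t/(t+\varepsilon)$, which recovers $\mathrm{rank}(X)$ as $\varepsilon \to 0$:

```python
import numpy as np

def approx_rank(X, eps=1e-2):
    """Smooth surrogate for rank(X): sum of s / (s + eps) over the
    singular values s; tends to the exact rank as eps -> 0."""
    s = np.linalg.svd(X, compute_uv=False)
    return float(np.sum(s / (s + eps)))
```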

The mesh adaptive direct search algorithm with treed Gaussian process surrogates

This work introduces the use of the treed Gaussian process (TGP) as a surrogate model within the mesh adaptive direct search (MADS) framework for constrained blackbox optimization. It extends the surrogate management framework (SMF) to nonsmooth optimization under general constraints. MADS uses TGP in two ways: one, as a surrogate for blackbox evaluations; and two, … Read more
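
The "surrogate orders the poll" half of that idea can be sketched as follows, with scikit-learn's ordinary GaussianProcessRegressor standing in for the treed GP, and a plain coordinate poll set standing in for MADS's dense direction sets; both substitutions are ours, for illustration only:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def surrogate_ordered_poll(f, x, f_x, delta, X_hist, y_hist):
    """Evaluate poll candidates in the order the surrogate prefers.

    X_hist, y_hist : Python lists of past evaluation points and values
    (an assumed bookkeeping convention, not the paper's).
    """
    model = GaussianProcessRegressor().fit(np.asarray(X_hist), np.asarray(y_hist))
    dirs = np.vstack([np.eye(len(x)), -np.eye(len(x))])  # toy poll directions
    cands = x + delta * dirs
    for c in cands[np.argsort(model.predict(cands))]:    # most promising first
        fc = f(c)                                        # expensive blackbox call
        X_hist.append(c); y_hist.append(fc)
        if fc < f_x:                                     # opportunistic success
            return c, fc
    return x, f_x                                        # poll failed; refine mesh
```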

Dependence of bilevel programming on irrelevant data

In 1997, Macal and Hurter found that adding a constraint to the lower-level problem which is not active at the computed global optimal solution can destroy global optimality. In this paper, this property is reconsidered, and it is shown that this solution remains locally optimal under inner semicontinuity of the original solution set … Read more

Structured Sparsity via Alternating Direction Methods

We consider a class of sparse learning problems in high dimensional feature space regularized by a structured sparsity-inducing norm which incorporates prior knowledge of the group structure of the features. Such problems often pose a considerable challenge to optimization algorithms due to the non-smoothness and non-separability of the regularization term. In this paper, we focus … Read more
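
For the simpler non-overlapping case, a minimal ADMM sketch (our own, using the splitting x = z) shows the mechanics; the overlapping groups the paper targets would replace the identity coupling by a duplication operator:

```python
import numpy as np

def admm_group_lasso(A, b, groups, lam, rho=1.0, n_iter=200):
    """min 0.5*||Ax - b||^2 + lam * sum_g ||x_g||_2, groups disjoint."""
    n = A.shape[1]
    z, u = np.zeros(n), np.zeros(n)
    M, Atb = A.T @ A + rho * np.eye(n), A.T @ b        # formed once
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))    # smooth x-update
        v = x + u
        for g in groups:                               # blockwise shrinkage
            ng = np.linalg.norm(v[g])
            z[g] = max(0.0, 1 - lam / (rho * ng)) * v[g] if ng > 0 else 0.0
        u += x - z                                     # dual update
    return z
```

The non-separability mentioned above shows up exactly here: with overlapping groups the z-update no longer decomposes into independent blocks unless the variables are first duplicated.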

An Infeasible-Point Subgradient Method Using Adaptive Approximate Projections

We propose a new subgradient method for the minimization of convex functions over a convex set. Common subgradient algorithms require an exact projection onto the feasible region in every iteration, which can be efficient only for problems that admit a fast projection. In our method we use inexact adaptive projections, which only require moving within a … Read more
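
A rough sketch of the idea, with stepsize and accuracy schedules that are illustrative only: take a subgradient step, then call an approximate projection routine approx_proj(x, eps), an assumed interface, whose accuracy is tightened as the stepsizes shrink so the iterates become asymptotically feasible:

```python
import numpy as np

def inexact_projected_subgradient(subgrad, approx_proj, x0, n_iter=1000):
    """Subgradient steps with adaptively tightened approximate projections."""
    x = x0.copy()
    for k in range(1, n_iter + 1):
        g = subgrad(x)
        step = 1.0 / k                  # divergent-series stepsize
        eps = step / k                  # projection accuracy, o(step)
        x = approx_proj(x - step * g / max(np.linalg.norm(g), 1e-12), eps)
    return x
```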