Convergence rate analysis of the forward-Douglas-Rachford splitting scheme

Operator splitting schemes are a powerful class of algorithms for solving complicated monotone inclusion and convex optimization problems that are built from many simpler pieces. They give rise to algorithms in which each simple piece of the decomposition is processed individually. This leads to easily implementable and highly parallelizable or distributed algorithms, which often obtain … Read more
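
For orientation, here is a sketch of a closely related three-operator form, written for $\min_x f(x) + g(x) + h(x)$ with $h$ smooth; the forward-Douglas-Rachford scheme corresponds, roughly, to the case where one nonsmooth term is the indicator of a closed subspace, and the exact scheme and parameter conditions analyzed in the paper may differ. With stepsize $\gamma > 0$ and relaxation parameters $\lambda_k$:

$$
x_g^k = \operatorname{prox}_{\gamma g}(z^k), \qquad
x_f^k = \operatorname{prox}_{\gamma f}\!\big(2x_g^k - z^k - \gamma \nabla h(x_g^k)\big), \qquad
z^{k+1} = z^k + \lambda_k\,(x_f^k - x_g^k).
$$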

Interior-point solver for convex separable block-angular problems

Constraint matrices with block-angular structure are pervasive in optimization. Interior-point methods have proven competitive for these structured problems by exploiting the linear algebra. One such approach solves the normal equations using sparse Cholesky factorizations for the block constraints and a preconditioned conjugate gradient (PCG) method for the linking constraints. The preconditioner is based … Read more
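
As a reference point, the sketch below is a minimal, generic PCG routine in Python with a simple Jacobi (diagonal) preconditioner for illustration; it is not the specialized preconditioner or the block Cholesky machinery described in the paper.

```python
import numpy as np

def pcg(A, b, M_inv, x0=None, tol=1e-8, max_iter=500):
    """Generic preconditioned conjugate gradient for A x = b.

    A     : symmetric positive definite matrix (ndarray)
    M_inv : callable applying the preconditioner inverse, r -> M^{-1} r
    """
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                  # residual
    z = M_inv(r)                   # preconditioned residual
    p = z.copy()                   # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # new conjugate direction
        rz = rz_new
    return x

# Toy usage with a diagonal (Jacobi) preconditioner -- illustrative only.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = B @ B.T + 50 * np.eye(50)      # SPD matrix
b = rng.standard_normal(50)
x = pcg(A, b, M_inv=lambda r: r / np.diag(A))
print(np.linalg.norm(A @ x - b))   # residual norm of the computed solution
```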

A Parallel Line Search Subspace Correction Method for Composite Convex Optimization

In this paper, we investigate a parallel subspace correction framework for composite convex optimization. The variables are first divided into a few blocks based on certain rules. At each iteration, the algorithms solve a suitable subproblem on each block simultaneously, construct a search direction by combining their solutions on all blocks, and then identify a new … Read more
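
A hedged sketch of the generic template (my notation, not necessarily the paper's): split $x = (x_1,\dots,x_m)$ into blocks, let each block solve its own proximally regularized subproblem in parallel, combine the block updates into a direction, and step along it with a line search on the composite objective $F$:

$$
x_i^{k+} \in \arg\min_{x_i}\; F(x_1^k,\dots,x_i,\dots,x_m^k) + \tfrac{1}{2\tau}\|x_i - x_i^k\|^2 \quad (i = 1,\dots,m), \qquad
d^k = \sum_{i=1}^m U_i\,(x_i^{k+} - x_i^k), \qquad
x^{k+1} = x^k + \alpha_k d^k,
$$

where $U_i$ embeds block $i$ into the full space and $\alpha_k$ is chosen by the line search.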

An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions

We propose a forward-backward proximal-type algorithm with inertial/memory effects for minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting. The sequence of iterates generated by the algorithm converges to a critical point of the objective function provided an appropriate regularization of the objective satisfies the Kurdyka-Lojasiewicz inequality, which is … Read more
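
One common way to write such an iteration for $\min_x f(x) + h(x)$, with $h$ smooth and $f$ nonsmooth but proximable, is sketched below; the paper's exact placement of the inertial term, its stepsize conditions, and the Kurdyka-Lojasiewicz machinery are what make it work in the nonconvex setting, and they may differ from this generic form:

$$
y^k = x^k + \beta_k\,(x^k - x^{k-1}), \qquad
x^{k+1} \in \operatorname{prox}_{\gamma_k f}\!\big(y^k - \gamma_k \nabla h(y^k)\big).
$$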

Weak and Strong Superiorization: Between Feasibility-Seeking and Minimization

We review the superiorization methodology, which can be thought of, in some cases, as lying between feasibility-seeking and constrained minimization. It is not quite trying to solve the full-fledged constrained minimization problem; rather, the task is to find a feasible point which is superior (with respect to an objective function value) to one returned … Read more
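
A minimal illustration of the idea (not any of the specific algorithms reviewed in the paper): take a feasibility-seeking iteration, here sequential projections, and interleave bounded, objective-reducing perturbations with summable step sizes.

```python
import numpy as np

def superiorized_projections(proj_sets, grad_phi, x0, n_iter=200, a=0.9):
    """Sketch of a superiorized feasibility-seeking loop (illustrative only).

    proj_sets : list of projection operators onto the constraint sets
    grad_phi  : (sub)gradient of the objective to be reduced
    a         : kernel of the summable step sizes a**k (bounded perturbations)
    """
    x = x0.copy()
    for k in range(n_iter):
        # Objective-reducing perturbation with a summable step size.
        g = grad_phi(x)
        if np.linalg.norm(g) > 0:
            x = x - (a ** k) * g / np.linalg.norm(g)
        # Basic feasibility-seeking step: sequential projections.
        for P in proj_sets:
            x = P(x)
    return x

# Toy usage: reduce ||x||^2 over the intersection of a halfspace and a box.
P_half = lambda x: x if x[0] + x[1] >= 1 else x + (1 - x[0] - x[1]) / 2 * np.ones(2)
P_box = lambda x: np.clip(x, -2.0, 2.0)
x = superiorized_projections([P_half, P_box], grad_phi=lambda x: 2 * x,
                             x0=np.array([2.0, 2.0]))
print(x)  # should be close to the least-norm feasible point (0.5, 0.5)
```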

Douglas-Rachford splitting for nonconvex feasibility problems

We adapt the Douglas-Rachford (DR) splitting method to solve nonconvex feasibility problems by studying this method for a class of nonconvex optimization problems. While the convergence properties of the method for convex problems have been well studied, far less is known in the nonconvex setting. In this paper, for the direct adaptation of the method … Read more
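
A small toy sketch of the direct adaptation in the feasibility setting: run the DR iteration with projections onto the two sets, here a (nonconvex) sphere and a line. This only illustrates the iteration; it is not the analysis in the paper.

```python
import numpy as np

def proj_sphere(x, radius=1.0):
    """Projection onto the (nonconvex) sphere of given radius."""
    n = np.linalg.norm(x)
    return radius * x / n if n > 0 else np.array([radius, 0.0])

def proj_line(x, a, b):
    """Projection onto the line {x : a^T x = b} (a is a unit vector)."""
    return x - (a @ x - b) * a

# Douglas-Rachford iteration: z <- z + P_B(2 P_A(z) - z) - P_A(z)
a = np.array([1.0, 1.0]) / np.sqrt(2.0)
b = 0.5
z = np.array([2.0, -1.0])
for _ in range(100):
    x = proj_sphere(z)
    y = proj_line(2 * x - z, a, b)
    z = z + y - x

print(proj_sphere(z))          # candidate point in (approximately) both sets
print(a @ proj_sphere(z) - b)  # residual for the line constraint
```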

Randomized First-order Methods for Saddle Point Optimization

In this paper, we present novel randomized algorithms for solving saddle point problems whose dual feasible region is a direct product of many convex sets. Our algorithms can achieve ${\cal O}(1/N)$ rate of convergence by solving only one dual subproblem at each iteration. Our algorithms can also achieve ${\cal O}(1/N^2)$ rate of convergence if a … Read more
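
The abstract only hints at the mechanism; as an assumption-laden sketch (my notation, not the paper's updates), a generic randomized template for $\min_x \max_{y \in Y_1\times\cdots\times Y_m} f(x) + \langle Kx, y\rangle - g(y)$ picks one dual block $i_k$ at random, solves only that dual subproblem, and then takes a primal step; the paper's extrapolation and stepsize choices may differ:

$$
i_k \sim \mathrm{Unif}\{1,\dots,m\}, \qquad
y_{i_k}^{k+1} = \arg\max_{y_{i_k} \in Y_{i_k}} \Big\{ \langle K_{i_k} x^k, y_{i_k}\rangle - g_{i_k}(y_{i_k}) - \tfrac{1}{2\sigma}\|y_{i_k} - y_{i_k}^k\|^2 \Big\},
\qquad y_j^{k+1} = y_j^k \;\;(j \neq i_k),
$$

$$
x^{k+1} = \arg\min_x \Big\{ f(x) + \langle K^\top y^{k+1}, x\rangle + \tfrac{1}{2\tau}\|x - x^k\|^2 \Big\}.
$$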

A Proximal Multiplier Method for Convex Separable Symmetric Cone Optimization

This work is devoted to the study of a proximal decomposition algorithm for solving convex symmetric cone optimization with separable structures. The algorithm considered is based on the decomposition method proposed by Chen and Teboulle (1994), and the proximal generalized distance defined by Auslender and Teboulle (2006). Under suitable assumptions, first a class of proximal … Read more
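
For context, the Chen-Teboulle (1994) decomposition in its Euclidean form, applied to $\min\{f(x) + g(z) : Ax + Bz = b\}$, alternates a predictor multiplier step, parallel proximal primal steps, and a corrector multiplier step; the algorithm studied here replaces the quadratic proximal terms with proximal generalized distances and works over symmetric cones, so the sketch below is only the classical template:

$$
p^{k+1} = y^k + \lambda\,(Ax^k + Bz^k - b),
$$

$$
x^{k+1} = \arg\min_x \Big\{ f(x) + \langle p^{k+1}, Ax\rangle + \tfrac{1}{2\lambda}\|x - x^k\|^2 \Big\}, \qquad
z^{k+1} = \arg\min_z \Big\{ g(z) + \langle p^{k+1}, Bz\rangle + \tfrac{1}{2\lambda}\|z - z^k\|^2 \Big\},
$$

$$
y^{k+1} = y^k + \lambda\,(Ax^{k+1} + Bz^{k+1} - b).
$$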

Relative Entropy Relaxations for Signomial Optimization

Signomial programs (SPs) are optimization problems specified in terms of signomials, which are weighted sums of exponentials composed with linear functionals of a decision variable. SPs are nonconvex optimization problems in general, and families of NP-hard problems can be reduced to SPs. In this paper we describe a hierarchy of convex relaxations to obtain … Read more
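
For concreteness, a signomial in this sense is $f(x) = \sum_j c_j \exp(a_j^\top x)$ with real (possibly negative) weights $c_j$; a minimal evaluation in Python is shown below (illustrative only; the relative entropy relaxations themselves are not reproduced here).

```python
import numpy as np

def signomial(c, A):
    """Return f(x) = sum_j c[j] * exp(A[j] @ x): a weighted sum of exponentials
    of linear functionals. Negative weights make f nonconvex in general."""
    return lambda x: float(c @ np.exp(A @ np.asarray(x)))

# Example: f(x) = exp(x1) + exp(x2) - 1.5 * exp(0.5*x1 + 0.5*x2)
c = np.array([1.0, 1.0, -1.5])
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
f = signomial(c, A)
print(f([0.0, 0.0]))  # 1 + 1 - 1.5 = 0.5
```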

A Gentle, Geometric Introduction to Copositive Optimization

This paper illustrates the fundamental connection between nonconvex quadratic optimization and copositive optimization—a connection that allows the reformulation of nonconvex quadratic problems as convex ones in a unified way. We intend the paper for readers new to the area, and hence the exposition is largely self-contained. We focus on examples having just a few variables … Read more
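
One classical instance of this connection, stated here for orientation (it is standard in the literature rather than taken from the truncated abstract): the standard quadratic program over the simplex admits an exact convex reformulation over the completely positive cone, whose dual is the copositive cone:

$$
\min_{x \ge 0,\; e^\top x = 1} x^\top Q x
\;=\;
\min\big\{ \langle Q, X\rangle : \langle E, X\rangle = 1,\; X \in \mathcal{C}^* \big\},
$$

where $E = ee^\top$, $\mathcal{C}^* = \operatorname{conv}\{xx^\top : x \ge 0\}$ is the completely positive cone, and its dual $\mathcal{C} = \{S = S^\top : x^\top S x \ge 0 \text{ for all } x \ge 0\}$ is the copositive cone.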