An asymptotic inclusion speed for the Douglas-Rachford splitting method in Hilbert spaces

In this paper, we consider the Douglas-Rachford splitting method for monotone inclusions in Hilbert spaces. It can be implemented as follows: from the current iterate, first use a forward-backward step to obtain an intermediate point, and then a second step to obtain the new iterate. Generally speaking, the sum operator involved in the Douglas-Rachford splitting takes the value of every …
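As a concrete illustration, here is a minimal sketch of the classical two-operator Douglas-Rachford iteration in the proximal setting, applied to min_x ||x||_1 + 0.5||x - b||^2; the stepsize t and the test problem are illustrative assumptions, not the paper's setting or its asymptotic analysis:

```python
import numpy as np

def prox_l1(v, t):
    # Proximal map of t*||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_quad(v, t, b):
    # Proximal map of t*(1/2)*||x - b||^2.
    return (v + t * b) / (1.0 + t)

def douglas_rachford(b, t=1.0, iters=200):
    # From the current point z, one resolvent step gives the intermediate
    # point y, a second resolvent step at the reflected point gives w, and
    # the governing sequence is updated by z <- z + (w - y).
    z = np.zeros_like(b)
    for _ in range(iters):
        y = prox_l1(z, t)                # intermediate point
        w = prox_quad(2 * y - z, t, b)   # step on the second term
        z = z + (w - y)                  # new iterate
    return prox_l1(z, t)                 # "shadow" point approaches a minimizer

print(douglas_rachford(np.array([3.0, 0.5, -2.0])))  # approx. [2., 0., -1.]
```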

Coordinate descent algorithms

Coordinate descent algorithms solve optimization problems by successively performing approximate minimization along coordinate directions or coordinate hyperplanes. They have been used in applications for many years, and their popularity continues to grow because of their usefulness in data analysis, machine learning, and other areas of current interest. This paper describes the fundamentals of the coordinate …
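To make the coordinate-wise idea concrete, here is a hedged sketch of cyclic coordinate descent for a least-squares objective; each inner step is an exact one-dimensional minimization along a coordinate direction, with cheap incremental residual updates. The problem instance and sweep count are illustrative assumptions:

```python
import numpy as np

def coordinate_descent_lsq(A, b, sweeps=100):
    # Cyclic coordinate descent for min_x (1/2)*||Ax - b||^2:
    # each step exactly minimizes the objective along one coordinate axis.
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                        # running residual
    col_sq = (A ** 2).sum(axis=0)        # squared column norms ||a_i||^2
    for _ in range(sweeps):
        for i in range(n):
            step = A[:, i] @ r / col_sq[i]  # exact minimizer along e_i
            x[i] += step
            r -= step * A[:, i]             # incremental residual update
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 5)), rng.standard_normal(50)
err = coordinate_descent_lsq(A, b) - np.linalg.lstsq(A, b, rcond=None)[0]
print(np.max(np.abs(err)))               # small for this well-conditioned A
```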

Interior-point algorithms for convex optimization based on primal-dual metrics

We propose and analyse primal-dual interior-point algorithms for convex optimization problems in conic form. The families of algorithms we analyse are so-called short-step algorithms, and they match the current best iteration complexity bounds of the primal-dual symmetric interior-point algorithms of Nesterov and Todd for symmetric cone programming problems with given self-scaled barriers. Our results apply to …

The direct extension of ADMM for three-block separable convex minimization models is convergent when one function is strongly convex

The alternating direction method of multipliers (ADMM) is a benchmark for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. Meanwhile, it is known that the convergence is not guaranteed if the ADMM is directly extended to a multiple-block convex minimization model whose objective …
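For readers who want the two-block baseline in code, here is a minimal scaled-form ADMM sketch for min ||x||_1 + 0.5||Dz - b||^2 subject to x - z = 0; the penalty rho, iteration count, and the lasso-type instance are assumptions for illustration, not the three-block model the paper analyzes:

```python
import numpy as np

def admm_lasso(D, b, rho=1.0, iters=300):
    # Two-block ADMM in scaled form: a soft-threshold step for the first
    # block, a linear solve for the second block, then a multiplier update
    # for the coupling constraint x - z = 0.
    n = D.shape[1]
    x = z = u = np.zeros(n)
    M = D.T @ D + rho * np.eye(n)       # factor once in serious code
    Dtb = D.T @ b
    for _ in range(iters):
        v = z - u
        x = np.sign(v) * np.maximum(np.abs(v) - 1.0 / rho, 0.0)  # prox of ||.||_1
        z = np.linalg.solve(M, Dtb + rho * (x + u))              # smooth block
        u = u + x - z                                            # dual update
    return x

rng = np.random.default_rng(1)
D, b = rng.standard_normal((40, 8)), rng.standard_normal(40)
print(admm_lasso(D, b))
```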

Stochastic Compositional Gradient Descent: Algorithms for Minimizing Compositions of Expected-Value Functions

Classical stochastic gradient methods are well suited for minimizing expected-value objective functions. However, they do not apply to the minimization of a nonlinear function involving expected values or a composition of two expected-value functions, i.e., problems of the form $\min_x \mathbb{E}_v\big[f_v\big(\mathbb{E}_w[g_w(x)]\big)\big]$. In order to solve this stochastic composition problem, we propose a class …
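A minimal two-timescale sketch of the compositional idea follows: a fast auxiliary sequence y tracks the inner expectation $\mathbb{E}_w[g_w(x)]$, and the slow sequence x follows a chain-rule gradient evaluated at the tracked value rather than at a single noisy sample. The inner map g_w(x) = x + w, outer function f(y) = ||y||^2, and stepsize exponents are illustrative assumptions, chosen so the solution is x = 0:

```python
import numpy as np

def scgd(x0, steps=20000, seed=0):
    # Two-timescale stochastic compositional gradient sketch for
    # min_x f(E_w[g_w(x)]) with f(y) = ||y||^2 and g_w(x) = x + w, E[w] = 0.
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    y = np.zeros_like(x)
    for k in range(1, steps + 1):
        alpha = 0.5 / k ** 0.75                  # slow stepsize for x
        beta = 1.0 / k ** 0.5                    # fast stepsize for tracker y
        w = rng.standard_normal(x.shape)
        y = (1 - beta) * y + beta * (x + w)      # track E_w[g_w(x)] = x
        grad_f = 2 * y                           # gradient of f(y) = ||y||^2
        jac_g = np.eye(x.size)                   # Jacobian of g_w(x) = x + w
        x = x - alpha * jac_g.T @ grad_f         # chain-rule update
    return x

print(scgd([2.0, -1.0]))   # should be close to [0, 0]
```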

On the Information-Adaptive Variants of the ADMM: an Iteration Complexity Perspective

Designing algorithms for an optimization model often amounts to maintaining a balance between the degree of information to request from the model on the one hand, and the computational speed to expect on the other hand. Naturally, the more information is available, the faster one can expect the algorithm to converge. The popular algorithm of …

Interior-point solver for convex separable block-angular problems

Constraint matrices with block-angular structure are pervasive in optimization. Interior-point methods have been shown to be competitive for these structured problems by exploiting the linear algebra. One of these approaches solved the normal equations using sparse Cholesky factorizations for the block constraints, and a preconditioned conjugate gradient (PCG) for the linking constraints. The preconditioner is based …
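As a small illustration of this hybrid direct/iterative linear algebra, here is a generic PCG routine applied to a toy block-structured SPD system, preconditioned by exact Cholesky solves on the diagonal blocks while the (weak) linking coupling is left to the iteration. The toy system and preconditioner are assumptions, not the paper's construction:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def pcg(A, b, precond, tol=1e-10, maxit=500):
    # Preconditioned conjugate gradient for A x = b with SPD A.
    x = np.zeros_like(b)
    r = b - A @ x
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy "block-angular" normal equations: two SPD diagonal blocks plus a
# weak dense linking term; the preconditioner solves the blocks exactly.
rng = np.random.default_rng(2)
B1 = rng.standard_normal((6, 6)); B1 = B1 @ B1.T + 6 * np.eye(6)
B2 = rng.standard_normal((6, 6)); B2 = B2 @ B2.T + 6 * np.eye(6)
C = 0.1 * rng.standard_normal((12, 12)); C = C @ C.T
A = np.block([[B1, np.zeros((6, 6))], [np.zeros((6, 6)), B2]]) + C
c1, c2 = cho_factor(B1), cho_factor(B2)

def block_precond(r):
    # Exact Cholesky solves on the blocks; the linking part is ignored.
    return np.concatenate([cho_solve(c1, r[:6]), cho_solve(c2, r[6:])])

b = rng.standard_normal(12)
x = pcg(A, b, block_precond)
print(np.linalg.norm(A @ x - b))   # small residual
```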

Weak and Strong Superiorization: Between Feasibility-Seeking and Minimization

We review the superiorization methodology, which can be thought of, in some cases, as lying between feasibility-seeking and constrained minimization. It is not quite trying to solve the full-fledged constrained minimization problem; rather, the task is to find a feasible point which is superior (with respect to an objective function value) to one returned …
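In code, the idea can be sketched as a feasibility-seeking projection sweep interlaced with summable objective-reducing perturbations, so the perturbations vanish and feasibility-seeking dominates. The half-space feasible set, the objective ||x||^2, and the geometric perturbation sizes 0.5^k are illustrative assumptions, not a specific scheme from the review:

```python
import numpy as np

def project_halfspace(x, a, beta):
    # Projection onto the half-space {x : a.x <= beta}.
    viol = a @ x - beta
    return x - max(viol, 0.0) * a / (a @ a)

def superiorize(x, A, beta, obj_grad, sweeps=200):
    # Before each projection sweep, take a small objective-reducing step;
    # the step sizes 0.5**k are summable, so the feasibility-seeking
    # projections still drive the iterates toward the feasible set.
    for k in range(sweeps):
        g = obj_grad(x)
        x = x - 0.5 ** k * g / max(np.linalg.norm(g), 1e-12)
        for a, b in zip(A, beta):          # one sweep of projections
            x = project_halfspace(x, a, b)
    return x

# Feasible set: x + y <= 4, x >= 1, y >= 1; objective: ||x||^2.
A = [np.array([1.0, 1.0]), np.array([-1.0, 0.0]), np.array([0.0, -1.0])]
beta = [4.0, -1.0, -1.0]
x = superiorize(np.array([5.0, 5.0]), A, beta, obj_grad=lambda x: 2 * x)
print(x)   # a feasible point with much smaller objective than the start
```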

A Gentle, Geometric Introduction to Copositive Optimization

This paper illustrates the fundamental connection between nonconvex quadratic optimization and copositive optimization, a connection that allows the reformulation of nonconvex quadratic problems as convex ones in a unified way. We intend the paper for readers new to the area, and hence the exposition is largely self-contained. We focus on examples having just a few variables …
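A tiny numeric instance of the connection: the standard quadratic program over the simplex has a nonnegative optimal value exactly when the matrix is copositive, even when the matrix is not positive semidefinite. The 3x3 example and the brute-force grid search below are assumptions for illustration, not taken from the paper:

```python
import numpy as np
from itertools import product

def min_stqp_on_simplex(Q, grid=60):
    # Standard quadratic program: min { x^T Q x : x >= 0, sum(x) = 1 },
    # approximated by a grid search over the 3-simplex; its optimal value
    # is nonnegative exactly when Q is copositive.
    best = np.inf
    for i, j in product(range(grid + 1), repeat=2):
        k = grid - i - j
        if k >= 0:
            x = np.array([i, j, k]) / grid
            best = min(best, x @ Q @ x)
    return best

# Copositive but not positive semidefinite (its determinant is -4, so it
# has a negative eigenvalue); convexity alone would not certify it.
Q = np.array([[1.0, -1.0, 1.0],
              [-1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
print(min_stqp_on_simplex(Q))   # 0.0, attained at x = (1/2, 1/2, 0)
```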

Block-wise Alternating Direction Method of Multipliers with Gaussian Back Substitution for Multiple-block Convex Programming

We consider the linearly constrained convex minimization model with a separable objective function which is the sum of m functions without coupled variables, and discuss how to design an efficient algorithm based on the fundamental technique of splitting the augmented Lagrangian method (ALM). Our focus is the specific big-data scenario where m is huge. A …
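The Gaussian back substitution in the title refers to a backward corrective sweep that solves an upper-triangular (block) system after the forward ADMM-style pass. As a hedged illustration of just that device, not the paper's full block-wise correction step, here is plain back substitution:

```python
import numpy as np

def back_substitute(U, c):
    # Solve U y = c for upper-triangular U by Gaussian back substitution:
    # the last unknown is read off directly, then earlier unknowns follow
    # in reverse order, the same backward sweep a correction step performs
    # block by block.
    n = len(c)
    y = np.zeros(n)
    for i in range(n - 1, -1, -1):
        y[i] = (c[i] - U[i, i + 1:] @ y[i + 1:]) / U[i, i]
    return y

U = np.triu(np.arange(1.0, 17.0).reshape(4, 4) + 4 * np.eye(4))
c = np.array([1.0, 2.0, 3.0, 4.0])
print(np.allclose(back_substitute(U, c), np.linalg.solve(U, c)))  # True
```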