Parallel Block Coordinate Minimization with Application to Group Regularized Regression

This paper proposes a method for parallel block coordinate-wise minimization of convex functions. Each iteration involves a first phase in which n independent minimizations are performed over the n variable blocks, followed by a second phase in which the results of the first phase are coordinated to obtain the update of the whole variable. Convergence of the method to the …
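To make the two-phase structure concrete, here is a minimal Python sketch on a toy least-squares problem. The block minimizations of phase one are independent and run in parallel; for phase two, a simple coordination rule is assumed here (moving along the stacked block directions with a backtracking step), since the abstract does not specify the paper's actual coordination scheme. All names and parameters below are illustrative, not taken from the paper.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Hypothetical example problem: f(x) = 0.5 * ||A x - b||^2,
# with the variable x split into contiguous blocks.
rng = np.random.default_rng(0)
m, d, n_blocks = 60, 12, 4
A = rng.standard_normal((m, d))
b = rng.standard_normal(m)
blocks = np.array_split(np.arange(d), n_blocks)

f = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2

def minimize_block(x, idx):
    """Phase 1: exact minimization over one block, the other blocks held fixed."""
    A_i = A[:, idx]
    r = b - A @ x + A_i @ x[idx]   # target residual with block idx removed
    x_i, *_ = np.linalg.lstsq(A_i, r, rcond=None)
    return x_i

x = np.zeros(d)
for it in range(50):
    # Phase 1: the n block minimizations are independent, hence parallelizable.
    with ThreadPoolExecutor() as pool:
        updates = list(pool.map(lambda idx: minimize_block(x, idx), blocks))

    # Phase 2 (assumed coordination rule): combine the block results into one
    # search direction and take a backtracking step on f along it.
    direction = np.zeros(d)
    for idx, x_i in zip(blocks, updates):
        direction[idx] = x_i - x[idx]
    grad = A.T @ (A @ x - b)
    t, gd = 1.0, grad @ direction
    while f(x + t * direction) > f(x) + 1e-4 * t * gd and t > 1e-8:
        t *= 0.5
    x = x + t * direction

print("final objective:", f(x))
```

The stacked direction is a descent direction for convex f, so the backtracking step always makes progress; richer coordination phases (for example, solving a small master problem over the block updates) fit the same two-phase template.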

A Parallel Inertial Proximal Optimization Method

The Douglas-Rachford algorithm is a popular iterative method for finding a zero of a sum of two maximal monotone operators defined on a Hilbert space. In this paper, we propose an extension of this algorithm that includes inertia parameters, and we develop parallel versions to deal with the case of a sum of an arbitrary number of …
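The sketch below illustrates the classical Douglas-Rachford iteration for 0 ∈ A(x) + B(x), specialized to A = ∂f and B = ∂g with f(x) = ||x||_1 and g(x) = 0.5·||x − c||², so both resolvents are available in closed form. The inertial term shown (extrapolating the governing sequence) is one common way of adding inertia and is an assumption here; the paper's exact inertial parameters and its parallel multi-operator variant may differ. The step size, relaxation, and inertia values are illustrative.

```python
import numpy as np

def prox_l1(v, gamma):
    """Resolvent of gamma * d(||.||_1): soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - gamma, 0.0)

def prox_quad(v, gamma, c):
    """Resolvent of gamma * d(0.5 * ||. - c||^2)."""
    return (v + gamma * c) / (1.0 + gamma)

c = np.array([3.0, -0.2, 0.7, -5.0])
gamma, lam, alpha = 1.0, 1.0, 0.2   # step size, relaxation, inertia (assumed values)

y = np.zeros_like(c)
y_prev = y.copy()
for k in range(200):
    # Inertial extrapolation of the governing sequence (assumed scheme).
    w = y + alpha * (y - y_prev)
    x = prox_l1(w, gamma)                    # resolvent of A at w
    z = prox_quad(2.0 * x - w, gamma, c)     # resolvent of B at the reflected point
    y_prev, y = y, w + lam * (z - x)         # Douglas-Rachford update

x_star = prox_l1(y, gamma)
print(x_star)            # approximate minimizer of ||x||_1 + 0.5*||x - c||^2
print(prox_l1(c, 1.0))   # closed-form solution of this toy problem, for comparison
```

Without inertia (alpha = 0), this reduces to the standard Douglas-Rachford iteration; inertia parameters are typically kept small to retain convergence guarantees.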