This paper proposes a method for parallel block coordinate-wise minimization of convex functions. Each iteration involves a first phase in which n independent minimizations are performed over the n variable blocks, followed by a second phase in which the results of the first phase are coordinated to obtain the update of the whole variable. Convergence of the method to the global optimum is proved for functions composed of a smooth part plus a possibly non-smooth but separable term. The method is also proved to have a linear rate of convergence for functions that are smooth and strongly convex. The proposed algorithm can give a computational advantage over the more standard serial block coordinate-wise minimization methods when run on a parallel, multi-worker computing architecture. The method is suitable for regularized regression problems, such as the group Lasso, group Ridge regression, and group Elastic Net. Numerical tests are run on such regression problems to illustrate the performance of the proposed parallel method in comparison with the serial one.
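To make the two-phase iteration concrete, below is a minimal Python sketch for the group Ridge case, where each block subproblem admits a closed-form solution. The damped 1/n averaging used in the coordination phase, as well as the function and variable names (parallel_block_ridge, blocks, lam), are illustrative assumptions; the abstract does not specify the paper's actual coordination rule.

```python
import numpy as np

def parallel_block_ridge(A, b, blocks, lam=0.1, iters=200):
    """Hypothetical two-phase iteration for group Ridge regression:
        min_x 0.5*||A @ x - b||^2 + 0.5*lam*||x||^2,
    with the columns of A partitioned into the index sets in `blocks`."""
    m, d = A.shape
    n = len(blocks)
    x = np.zeros(d)
    for _ in range(iters):
        r = b - A @ x                      # residual at current iterate
        x_new = x.copy()
        # Phase 1: n independent block minimizations (parallelizable;
        # each uses only the current iterate x, not the other updates).
        for idx in blocks:
            Ai = A[:, idx]
            ri = r + Ai @ x[idx]           # residual with block idx removed
            # Exact minimizer over block idx (closed form for ridge).
            xi = np.linalg.solve(Ai.T @ Ai + lam * np.eye(len(idx)),
                                 Ai.T @ ri)
            # Phase 2 (assumed rule): damped 1/n step toward the candidate.
            x_new[idx] = x[idx] + (xi - x[idx]) / n
        x = x_new
    return x

# Usage on a small synthetic problem with three blocks of four variables.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 12))
b = rng.standard_normal(50)
blocks = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
x_hat = parallel_block_ridge(A, b, blocks)
```

With the assumed 1/n damping, the combined update is a convex combination of the per-block minimizers (each embedded in the current iterate), so for a convex objective the function value cannot increase at any iteration.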
Citation
DAUIN, Politecnico di Torino internal report, March 2015
Parallel Block Coordinate Minimization with Application to Group Regularized Regression