Asynchronous Parallel Algorithms for Nonconvex Big-Data Optimization. Part I: Model and Convergence

We propose a novel asynchronous parallel algorithmic framework for the minimization of the sum of a smooth nonconvex function and a convex nonsmooth regularizer, subject to both convex and nonconvex constraints. The framework hinges on successive convex approximation techniques and on a novel probabilistic model that captures key elements of modern computational architectures and asynchronous implementations more faithfully than current state-of-the-art models. Key features of the framework are: i) it accommodates inconsistent reads, meaning that components of the variable vector may be written by some cores while being simultaneously read by others; ii) it covers in a unified way several different specific solution methods; and iii) it accommodates a variety of possible parallel computing architectures. Almost sure convergence to stationary solutions is proved. Numerical results on both convex and nonconvex problems, reported in the companion paper, show that our method consistently outperforms existing asynchronous parallel algorithms.
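To make the setting concrete, below is a minimal sketch of one simple instance of the class of methods the framework covers: an asynchronous block-coordinate proximal-gradient scheme, i.e., a successive convex approximation in which the smooth term is linearized at the (possibly stale) read point and the nonsmooth regularizer is kept. This is not the paper's algorithm; the problem data (A, b, the l1 weight lam, the step size gamma), the coordinate-wise blocks, and the worker structure are all illustrative assumptions. Unsynchronized reads of the shared iterate model the inconsistent-read behavior described above.

```python
import threading
import numpy as np

# Illustrative problem data (assumptions, not from the paper):
# minimize f(x) + g(x) with g(x) = lam * ||x||_1 (convex, nonsmooth).
# For brevity f is a convex least-squares loss standing in for the
# smooth (possibly nonconvex) term; the update scheme is unchanged.
rng = np.random.default_rng(0)
m, n = 200, 50
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.1
gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = ||A||_2^2

x = np.zeros(n)  # shared iterate, read and written without locks


def soft_threshold(v, t):
    """Proximal operator of t * |.|: the prox step for the l1 regularizer."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def worker(seed, num_updates):
    local_rng = np.random.default_rng(seed)  # per-thread RNG
    for _ in range(num_updates):
        i = local_rng.integers(n)  # pick a random block (here: one coordinate)
        # Unsynchronized snapshot: other workers may overwrite components of x
        # while we copy, so x_read can be an "inconsistent read".
        x_read = x.copy()
        # Convex surrogate of f at x_read, restricted to block i:
        # linearize f, keep g (a proximal-gradient SCA instance).
        grad_i = A[:, i] @ (A @ x_read - b)
        x[i] = soft_threshold(x_read[i] - gamma * grad_i, gamma * lam)


threads = [threading.Thread(target=worker, args=(k, 2000)) for k in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```

In the framework's terminology, each worker updates its block using a possibly out-of-date, inconsistently assembled copy of the shared variable; establishing almost sure convergence of such schemes under a realistic probabilistic model of this asynchrony is the subject of Part I.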
