Under the strong convexity assumption, several recent works have studied the global linear convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions and a nonsmooth convex function. In this paper, under the quadratic growth condition (a strictly weaker condition than strong convexity), we derive a new convergence result, which implies that the PIAG method attains global linear convergence rates in both the function value error and the iterate point error. Moreover, by using relative smoothness (recently proposed to weaken the traditional gradient Lipschitz continuity assumption) and by defining a Bregman distance growth condition (which generalizes the quadratic growth condition), we further analyze the PIAG method with general distance functions. Finally, we propose a new variant of the PIAG method with improved linear convergence rates. Our theory not only recovers many very recent results under strictly weaker assumptions, but also provides new results for both PIAG methods and the proximal gradient method. In addition, if the strong convexity assumption does hold, then our theory shows that the corresponding rates derived under the quadratic growth condition can be improved. The key idea behind our theory is to construct certain Lyapunov functions.
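To make the iteration concrete, the following is a minimal sketch (not from the paper) of a PIAG-style update for minimizing sum_i f_i(x) + h(x): a table of component gradients is kept, one entry is refreshed per iteration so the aggregated gradient may use stale information, and a proximal step is applied. The quadratic components f_i(x) = 0.5*||A_i x - b_i||^2, the l1 regularizer h (whose prox is soft thresholding), the cyclic index rule, and all names and parameters below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag(A_blocks, b_blocks, lam, gamma, iters):
    # Illustrative PIAG sketch: f_i(x) = 0.5*||A_i x - b_i||^2, h = lam*||.||_1.
    n = A_blocks[0].shape[1]
    x = np.zeros(n)
    # Gradient table: stale component gradients, refreshed one index at a time.
    grads = [A.T @ (A @ x - b) for A, b in zip(A_blocks, b_blocks)]
    agg = sum(grads)  # aggregated (possibly delayed) gradient
    N = len(A_blocks)
    for k in range(iters):
        i = k % N  # cyclic component selection (one possible rule)
        new_g = A_blocks[i].T @ (A_blocks[i] @ x - b_blocks[i])
        agg += new_g - grads[i]  # update the aggregate in O(n)
        grads[i] = new_g
        # Proximal step on the aggregated gradient direction.
        x = soft_threshold(x - gamma * agg, gamma * lam)
    return x

# Example usage on random data.
rng = np.random.default_rng(0)
A_blocks = [rng.standard_normal((5, 10)) for _ in range(4)]
b_blocks = [rng.standard_normal(5) for _ in range(4)]
x_hat = piag(A_blocks, b_blocks, lam=0.1, gamma=1e-3, iters=2000)
```

Under the paper's assumptions (e.g., quadratic growth of the objective), iterates produced by such a scheme converge linearly despite the delayed gradient information; the step size gamma here is a conservative placeholder rather than the rate-optimal choice analyzed in the paper.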