Stochastic variance reduced methods have recently surged into prominence for solving large-scale optimization problems in machine learning. Tan et al. first combined the stochastic variance reduced gradient (SVRG) method with the Barzilai-Borwein (BB) method to compute step sizes automatically, and the resulting SVRG-BB performs well in practice. Building on this work, we propose a class of stochastic variance reduced methods with an adaptive stepsize based on a local estimate of the Lipschitz constant. Specifically, we apply this stepsize to SVRG and to the stochastic recursive gradient algorithm (SARAH), which yields two algorithms: SVRG-AS and SARAH-AS. We prove that both SVRG-AS and SARAH-AS converge linearly for strongly convex objective functions. Numerical experiments on standard datasets show that our algorithms are effective and robust: SVRG-AS outperforms SVRG-BB, and SARAH-AS is comparable to SARAH with the best-tuned stepsize. Moreover, the proposed stepsize can also be applied to other stochastic variance reduced methods.
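The abstract does not state the exact adaptive-stepsize rule of SVRG-AS, so the Python sketch below only illustrates the general idea: an SVRG inner loop whose stepsize is set from a locally estimated Lipschitz constant. The secant-style estimator, the ridge least-squares problem, and all function names are assumptions for illustration, not the authors' formula.

```python
# Minimal sketch: SVRG with a stepsize driven by a local Lipschitz estimate.
# The estimator eta = 1 / L_hat with L_hat from successive iterates is an
# assumed, illustrative rule, not the SVRG-AS rule from the paper.
import numpy as np


def svrg_adaptive_step(A, b, lam=1e-2, n_epochs=20, inner_m=None, seed=0):
    """SVRG on ridge-regularized least squares (strongly convex) with an
    adaptive stepsize eta = 1 / L_hat (assumed rule for illustration)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = inner_m or 2 * n
    w_tilde = np.zeros(d)

    def grad_i(w, i):
        # Gradient of the i-th component: (a_i^T w - b_i) a_i + lam * w.
        return (A[i] @ w - b[i]) * A[i] + lam * w

    def full_grad(w):
        return A.T @ (A @ w - b) / n + lam * w

    # Crude global initial estimate of the Lipschitz constant.
    L_hat = np.linalg.norm(A, ord=2) ** 2 / n + lam
    for _ in range(n_epochs):
        mu = full_grad(w_tilde)          # full gradient at the snapshot
        w = w_tilde.copy()
        w_prev = w.copy()
        for _ in range(m):
            i = rng.integers(n)
            g_cur = grad_i(w, i)
            # Standard SVRG variance-reduced gradient.
            v = g_cur - grad_i(w_tilde, i) + mu
            # Local Lipschitz estimate from successive iterates (assumption).
            diff = np.linalg.norm(w - w_prev)
            if diff > 1e-12:
                L_local = np.linalg.norm(g_cur - grad_i(w_prev, i)) / diff
                L_hat = max(L_local, 1e-8)
            eta = 1.0 / L_hat
            w_prev = w.copy()
            w -= eta * v
        w_tilde = w
    return w_tilde


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 20))
    b = rng.standard_normal(200)
    w = svrg_adaptive_step(A, b)
    print("final full-gradient norm:",
          np.linalg.norm(A.T @ (A @ w - b) / 200 + 1e-2 * w))
```

The same adaptive-stepsize idea would carry over to a SARAH inner loop by replacing the variance-reduced gradient `v` with SARAH's recursive estimator.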
Article. University of Chinese Academy of Sciences, 12/2018.