In this paper, we study new stochastic approximation (SA) type algorithms, namely the accelerated SA (AC-SA), for solving strongly convex stochastic composite optimization (SCO) problems. Specifically, by introducing a domain shrinking procedure, we significantly improve the large-deviation results associated with the convergence rate of a nearly optimal AC-SA algorithm presented earlier by the authors. Moreover, we introduce a multi-stage AC-SA algorithm that possesses an optimal rate of convergence for solving strongly convex SCO problems, in terms of its dependence not only on the target accuracy but also on a number of problem parameters and the selection of initial points. To the best of our knowledge, this is the first time such an optimal method has been presented in the literature. Our computational results show that these AC-SA algorithms can substantially outperform the classic SA and some other SA-type algorithms for solving certain classes of strongly convex SCO problems.
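To make the multi-stage restarting idea concrete, the following is a minimal, hypothetical Python sketch of a generic multi-stage scheme for a strongly convex problem with noisy gradients; it uses plain averaged SGD as the inner solver rather than the paper's AC-SA, and all names and parameter choices (noisy_grad, sgd_stage, the geometric stage budgets and stepsizes) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def noisy_grad(x, A, b, rng, sigma=0.1):
    # Stochastic gradient of f(x) = 0.5 x^T A x - b^T x with additive Gaussian noise.
    return A @ x - b + sigma * rng.standard_normal(x.shape)

def sgd_stage(x0, A, b, n_iters, step, rng):
    # One stage: averaged SGD with a constant stepsize (a stand-in for an AC-SA stage).
    x, x_avg = x0.copy(), np.zeros_like(x0)
    for _ in range(n_iters):
        x = x - step * noisy_grad(x, A, b, rng)
        x_avg += x / n_iters
    return x_avg

def multistage(x0, A, b, mu, n_stages=6, n0=100, rng=None):
    # Multi-stage scheme: run a stage, then restart from its output with a doubled
    # iteration budget and a geometrically smaller stepsize, so the expected error
    # bound is roughly halved from one stage to the next (illustrative choice only).
    rng = rng or np.random.default_rng(0)
    x = x0
    for k in range(n_stages):
        n_k = n0 * 2 ** k               # per-stage iteration budget
        step_k = 1.0 / (mu * n0 * 2 ** k)  # per-stage constant stepsize
        x = sgd_stage(x, A, b, n_k, step_k, rng)
    return x

# Example on a toy strongly convex quadratic with strong convexity parameter mu = 1.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_hat = multistage(np.zeros(2), A, b, mu=1.0)
```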
Citation
Technical Report, Department of Industrial and Systems Engineering, University of Florida.