The Fastest Known Globally Convergent First-Order Method for Minimizing Strongly Convex Functions

We design and analyze a novel gradient-based algorithm for unconstrained convex optimization. When the objective function is $m$-strongly convex and its gradient is $L$-Lipschitz continuous, the iterates and function values converge linearly to the optimum at rates $\rho$ and $\rho^2$, respectively, where $\rho = 1-\sqrt{m/L}$. These are the fastest known guaranteed linear convergence rates for globally convergent first-order methods.
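For concreteness, here is a minimal Python sketch of the kind of update the paper proposes (the triple momentum method). The rate $\rho = 1-\sqrt{m/L}$ is taken from the abstract above; the specific three-term recursion and the closed-form parameters $(\alpha, \beta, \gamma, \delta)$ are reproduced from the published method as an illustrative assumption and should be checked against the paper, and names such as `tmm` are hypothetical.

```python
import numpy as np

def tmm(grad, x0, m, L, iters=500):
    """Triple-momentum-style iteration (sketch, not the authoritative statement).

    rho = 1 - sqrt(m/L) is the linear rate quoted in the abstract; the
    recursion and the (alpha, beta, gamma, delta) formulas below follow
    the published triple momentum method, reproduced here as an assumption.
    """
    kappa = L / m
    rho = 1.0 - 1.0 / np.sqrt(kappa)                 # rate from the abstract
    alpha = (1.0 + rho) / L                          # step size
    beta = rho**2 / (2.0 - rho)                      # momentum weight
    gamma = rho**2 / ((1.0 + rho) * (2.0 - rho))     # gradient-point weight
    delta = rho**2 / (1.0 - rho**2)                  # output-sequence weight
    xi_prev = xi = np.asarray(x0, dtype=float)       # initialize xi_{-1} = xi_0
    for _ in range(iters):
        y = (1.0 + gamma) * xi - gamma * xi_prev     # point where the gradient is evaluated
        xi_prev, xi = xi, (1.0 + beta) * xi - beta * xi_prev - alpha * grad(y)
    return (1.0 + delta) * xi - delta * xi_prev      # output iterate x_k

# Toy check on a strongly convex quadratic f(x) = 0.5 * x' A x with minimizer x* = 0
rng = np.random.default_rng(0)
m, L = 1.0, 100.0
A = np.diag(np.linspace(m, L, 20))                   # eigenvalues span [m, L]
x = tmm(lambda y: A @ y, rng.standard_normal(20), m, L)
print(np.linalg.norm(x))                             # distance to the optimum, ~0
```

On this toy problem with $\kappa = L/m = 100$, the iterates contract by $\rho = 0.9$ per step, whereas plain gradient descent with step $1/L$ contracts the distance to the optimum by only $1 - m/L = 0.99$; that gap is what the title's "fastest known" claim refers to.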