We design and analyze a novel gradient-based algorithm for unconstrained convex optimization. When the objective function is $m$-strongly convex and its gradient is $L$-Lipschitz continuous, the iterates and function values converge linearly to the optimum at rates $\rho$ and $\rho^2$, respectively, where $\rho = 1-\sqrt{m/L}$. These are the fastest known guaranteed linear convergence rates for globally convergent first-order methods, and for high desired accuracies the corresponding iteration complexity is within a factor of two of the theoretical lower bound. We use a simple graphical design procedure based on integral quadratic constraints to derive closed-form expressions for the algorithm parameters. The new algorithm, which we call the triple momentum method, can be seen as an extension of methods such as gradient descent, Nesterov's accelerated gradient descent, and the heavy-ball method.
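To make the method concrete, below is a minimal NumPy sketch of the triple momentum iteration. The update equations and closed-form parameters $\alpha = (1+\rho)/L$, $\beta = \rho^2/(2-\rho)$, $\gamma = \rho^2/((1+\rho)(2-\rho))$, $\delta = \rho^2/(1-\rho^2)$ with $\rho = 1-\sqrt{m/L}$ are those reported in the paper; the quadratic test problem, dimensions, and iteration count are illustrative choices, not from the source.

```python
import numpy as np

def triple_momentum(grad, x0, m, L, num_iters=200):
    """Triple momentum method for an m-strongly convex objective
    with L-Lipschitz gradient. `grad` returns the gradient at a point."""
    kappa = L / m
    rho = 1.0 - 1.0 / np.sqrt(kappa)                  # convergence rate 1 - sqrt(m/L)
    alpha = (1.0 + rho) / L                           # step size
    beta = rho**2 / (2.0 - rho)                       # momentum on the internal state
    gamma = rho**2 / ((1.0 + rho) * (2.0 - rho))      # momentum for the gradient point
    delta = rho**2 / (1.0 - rho**2)                   # momentum for the output point

    xi_prev = xi = np.asarray(x0, dtype=float)        # initialize xi_{-1} = xi_0 = x0
    for _ in range(num_iters):
        y = (1.0 + gamma) * xi - gamma * xi_prev      # point where the gradient is evaluated
        xi_next = (1.0 + beta) * xi - beta * xi_prev - alpha * grad(y)
        xi_prev, xi = xi, xi_next
    return (1.0 + delta) * xi - delta * xi_prev       # output iterate x_k

# Illustrative test: random quadratic f(x) = 0.5 x^T A x - b^T x
# with eigenvalues of A spanning [m, L], so the constants are exact.
rng = np.random.default_rng(0)
n, m_, L_ = 20, 1.0, 100.0
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(m_, L_, n)) @ Q.T
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)                        # exact minimizer

x = triple_momentum(lambda x: A @ x - b, np.zeros(n), m_, L_)
print("distance to optimum:", np.linalg.norm(x - x_star))
```

With $L/m = 100$ the rate is $\rho = 0.9$, so 200 iterations shrink the distance to the optimum by roughly $\rho^{200} \approx 7 \times 10^{-10}$, which the printed error should reflect.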
Citation
@ARTICLE{7967721,
  author   = {B. Van Scoy and R. A. Freeman and K. M. Lynch},
  journal  = {IEEE Control Systems Letters},
  title    = {The Fastest Known Globally Convergent First-Order Method for Minimizing Strongly Convex Functions},
  year     = {2017},
  volume   = {PP},
  number   = {99},
  pages    = {1-1},
  keywords = {Acceleration; Algorithm design and analysis; Complexity theory; Convergence; Linear programming; Optimization; Transfer functions; optimization algorithms; robust control},
  doi      = {10.1109/LCSYS.2017.2722406},
  ISSN     = {2475-1456},
}