In a recent paper, Bubeck, Lee, and Singh introduced a new first-order method for minimizing smooth, strongly convex functions. Their geometric descent algorithm, largely inspired by the ellipsoid method, enjoys the optimal linear rate of convergence. Motivated by their work, we propose a close variant that iteratively maintains a quadratic global under-estimator of the objective function, whose minimal value converges to the true minimum at the optimal linear rate. The resulting intuitive scheme comes equipped with a natural stopping criterion and can be numerically accelerated by using accumulated information.
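As a minimal sketch of the construction (with $f$ a $\mu$-strongly convex function, $x_k$ the current iterate, and $f^\ast$ the minimum value, all generic notation rather than the paper's), each gradient evaluation furnishes a quadratic global under-estimator:
\[
f(x) \;\ge\; f(x_k) + \langle \nabla f(x_k),\, x - x_k\rangle + \frac{\mu}{2}\|x - x_k\|^2
\;=\; \left(f(x_k) - \frac{\|\nabla f(x_k)\|^2}{2\mu}\right) + \frac{\mu}{2}\Big\|x - \Big(x_k - \tfrac{1}{\mu}\nabla f(x_k)\Big)\Big\|^2 .
\]
Any convex combination of two such quadratics is again a quadratic with Hessian $\mu I$, so the scheme can maintain a single lower model $Q_k(x) = v_k + \frac{\mu}{2}\|x - c_k\|^2$ whose minimal value $v_k$ satisfies $v_k \le f^\ast \le f(x_k)$; the gap $f(x_k) - v_k$ then serves as the natural stopping criterion mentioned above.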