Second-order methods, which use gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth and convex functions, we derive (to the best of our knowledge) the first algorithm-independent lower bounds for such methods. These bounds shed light on the limits of attainable performance with second-order methods, and on the cases in which they can or cannot require fewer iterations than gradient-based methods, whose oracle complexity is much better understood.
Weizmann Institute, May 2017