An example is presented in which Newton's method for unconstrained minimization, applied to find an $\epsilon$-approximate first-order critical point of a smooth function, takes a multiple of $\epsilon^{-2}$ iterations and function evaluations to terminate, which is as many as the steepest-descent method requires in the worst case. The novel feature of the proposed example is that the objective function has a globally Lipschitz-continuous Hessian; a previous example published by the same authors ensured this critical property only along the path of iterates, a condition that is impossible to verify a priori.
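For reference, the termination criterion and the complexity bound described above can be summarized as follows, using standard notation for an objective $f$ with iterates $x_k$ (this sketch is not taken verbatim from the report):

```latex
% An iterate x_k is an \epsilon-approximate first-order critical point
% of f when the gradient norm falls below the tolerance:
%   \|\nabla f(x_k)\| \le \epsilon.
% The example exhibits a smooth f, with globally Lipschitz-continuous
% Hessian, on which Newton's method first satisfies this criterion only
% after a number of iterations proportional to \epsilon^{-2}:
\[
  \|\nabla f(x_k)\| \le \epsilon
  \quad\text{first holds for some}\quad
  k \geq \kappa\,\epsilon^{-2},
\]
% for a problem-dependent constant \kappa > 0, matching the known
% worst-case iteration bound for the steepest-descent method.
```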
Citation
@techreport{CartGoulToin13b,
  author      = {C. Cartis and N. I. M. Gould and Ph. L. Toint},
  title       = {An example of slow convergence for {N}ewton's method on a
                 function with globally {L}ipschitz continuous {H}essian},
  institution = NAXYS,
  address     = {University of Namur, Namur, Belgium},
  number      = {naXys-03-2013},
  year        = 2013
}