We provide new tools for the worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and of Newton's method for self-concordant functions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451--482, 2014], and extends recent performance estimation results for the method of Cauchy by the authors [Optimization Letters, to appear]. To illustrate the applicability of the tools, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [arXiv 1507.02528, 2015]. The algorithm of Abernethy and Hazan has attracted considerable recent interest, since it demonstrates a formal equivalence between an interior point method and a simulated annealing algorithm for convex optimization, but several details of the worst-case analysis are omitted in their paper.
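The performance estimation idea of Drori and Teboulle can be sketched as follows; the notation below (step size $\gamma$, horizon $N$, radius $R$) is illustrative and not taken from the abstract. For fixed-step gradient descent on the class $\mathcal{F}_{\mu,L}$ of $L$-smooth, $\mu$-strongly convex functions, the worst-case value of $f(x_N)-f(x_\star)$ is itself the optimal value of an optimization problem over functions and iterates:

\begin{equation*}
\begin{aligned}
\max_{f,\,x_0,\dots,x_N,\,x_\star}\quad & f(x_N) - f(x_\star)\\
\text{s.t.}\quad & f \in \mathcal{F}_{\mu,L},\qquad \nabla f(x_\star) = 0,\\
& x_{i+1} = x_i - \gamma\,\nabla f(x_i), \qquad i = 0,\dots,N-1,\\
& \|x_0 - x_\star\| \le R.
\end{aligned}
\end{equation*}

Replacing the infinite-dimensional constraint $f \in \mathcal{F}_{\mu,L}$ by finitely many interpolation conditions on the data $\{(x_i, \nabla f(x_i), f(x_i))\}$, and collecting inner products of iterates and gradients into a Gram matrix, turns this into a finite-dimensional semidefinite program whose optimal value upper-bounds (and, with exact interpolation conditions, equals) the true worst case.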
Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation