On the iterate convergence of descent methods for convex optimization

We study the iterate convergence of strong descent algorithms applied to convex functions. We assume that the function satisfies a very simple growth condition around its minimizers, and then show that the trajectory described by the iterates generated by any such method has finite length, which proves that the sequence of iterates converges.
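To make the statement concrete, one common form of a growth condition around the solution set $X^*$ (an illustrative assumption; the exact condition used in the paper may differ) is

\[
  f(x) - f^* \;\ge\; \tau\, \operatorname{dist}(x, X^*)^p
  \qquad \text{for all } x \text{ near } X^*,
\]

with constants $\tau > 0$ and $p \ge 1$, where $f^*$ denotes the minimum value of $f$. Finite length of the trajectory means

\[
  \sum_{k=0}^{\infty} \| x^{k+1} - x^k \| \;<\; \infty,
\]

so the sequence of iterates $(x^k)$ is Cauchy and therefore converges to a point.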

Citation

Federal University of Santa Catarina, Brazil, May 2014. ccgonzaga1@gmail.com
