Convergence rates for an inertial algorithm of gradient type associated to a smooth nonconvex minimization

We investigate an inertial algorithm of gradient type in connection with the minimization of a nonconvex differentiable function. The algorithm is formulated in the spirit of Nesterov's accelerated convex gradient method. We show that the generated sequences converge to a critical point of the objective function, provided a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we provide convergence rates for the generated sequences and the function values, formulated in terms of the Łojasiewicz exponent.
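The abstract does not reproduce the paper's iteration. As a rough illustration only, below is a minimal Python sketch of a generic Nesterov-style inertial gradient scheme, assuming the usual extrapolation step x_n + beta_n (x_n - x_{n-1}) followed by a gradient step; the inertial coefficients, step size, and stopping rule here are hypothetical choices, not the paper's.

```python
import numpy as np

def inertial_gradient_descent(grad, x0, step=1e-3, beta=0.9,
                              max_iter=100_000, tol=1e-8):
    """Sketch of a Nesterov-type inertial gradient iteration:
        y_n     = x_n + beta_n (x_n - x_{n-1})   (extrapolation)
        x_{n+1} = y_n - step * grad(y_n)         (gradient step)
    All parameters are illustrative, not the scheme analyzed in the paper."""
    x_prev = x0.copy()
    x = x0.copy()
    for n in range(max_iter):
        beta_n = beta * n / (n + 3)           # hypothetical inertial coefficients
        y = x + beta_n * (x - x_prev)         # extrapolated (inertial) point
        x_prev, x = x, y - step * grad(y)     # gradient step at the extrapolated point
        if np.linalg.norm(x - x_prev) < tol:  # stop once the iterates stabilize
            break
    return x

# Example: a nonconvex test problem (Rosenbrock function)
rosenbrock_grad = lambda z: np.array([
    -2.0 * (1.0 - z[0]) - 400.0 * z[0] * (z[1] - z[0] ** 2),
    200.0 * (z[1] - z[0] ** 2),
])
x_star = inertial_gradient_descent(rosenbrock_grad, np.array([-1.0, 1.0]))
```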

Citation

S. László (2018), Technical University of Cluj-Napoca, Department of Mathematics, Str. Memorandumului nr. 28, 400114 Cluj-Napoca, Romania. E-mail: laszlosziszi@yahoo.com
