A Novel Stepsize for Gradient Descent Method

In this paper, we propose a novel stepsize for the classical gradient descent scheme for solving unconstrained nonlinear optimization problems. We consider convex, smooth objectives whose gradients need not be globally Lipschitz continuous. Our new method requires only a locally Lipschitz gradient, yet still achieves the rate $O(\frac{1}{k})$ for $f(x^k)-f_*$. Using the idea behind the new stepsize, we propose a second algorithm, based on the projected gradient method, for solving a class of nonconvex optimization problems over a closed convex set. At each iteration, the new stepsize is computed by an explicit formula, without estimating any constant such as the Lipschitz constant and without a backtracking procedure. Moreover, we prove that the sequence of stepsizes is non-decreasing after some fixed iteration. Numerical experiments show that the new method is significantly more efficient than recent alternatives.
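
The abstract does not state the stepsize formula itself. As a rough illustration of the general template it describes (an explicit, Lipschitz-constant-free stepsize computed from quantities already available at each iteration), the sketch below uses the adaptive stepsize of Malitsky and Mishchenko (2020) as a stand-in; it is not the stepsize proposed in this paper, and the function `adaptive_gd`, the test problem, and all parameter choices are illustrative assumptions.

```python
import numpy as np

def adaptive_gd(grad, x0, n_iters=200, lam0=1e-3):
    """Gradient descent with an adaptive, Lipschitz-constant-free stepsize
    (Malitsky & Mishchenko, 2020) -- a stand-in illustration, not the
    stepsize proposed in the paper."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    lam_prev, theta = lam0, 0.0
    x = x_prev - lam_prev * g_prev          # one small initial step
    for _ in range(n_iters):
        g = grad(x)
        norm_dg = np.linalg.norm(g - g_prev)
        # Local curvature estimate replaces a global Lipschitz constant;
        # the sqrt factor limits how fast the stepsize may grow.
        local = (np.linalg.norm(x - x_prev) / (2.0 * norm_dg)
                 if norm_dg > 0 else np.inf)
        lam = min(np.sqrt(1.0 + theta) * lam_prev, local)
        theta = lam / lam_prev
        x_prev, g_prev, lam_prev = x, g, lam
        x = x - lam * g
    return x

# Minimize f(x) = 0.5 * ||A x - b||^2 through its gradient A^T (A x - b).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
print(adaptive_gd(lambda x: A.T @ (A @ x - b), np.zeros(2)))
```

The point of the sketch is only structural: each stepsize comes from a closed-form expression built from the previous iterate and gradient, so no Lipschitz constant has to be estimated and no backtracking loop is needed.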
