Complexity of Adagrad and other first-order methods for nonconvex optimization problems with bound constraints

A parametric class of trust-region algorithms for constrained nonconvex optimization is analyzed, where the objective function is never computed. By defining appropriate first-order stationarity criteria, we are able to extend the Adagrad method to the newly considered problem and retrieve the standard complexity rate of the projected gradient method that uses both the gradient and …
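
To give a flavour of this setting (a minimal sketch only, not the algorithm analyzed in the paper), a projected Adagrad-style step for box constraints lower ≤ x ≤ upper can be written as below; note that only gradient information is used and the objective value never appears. All names (`projected_adagrad_step`, `accum`, `eta`) are illustrative choices, not notation from the paper.

```python
import numpy as np

def projected_adagrad_step(x, grad, accum, lower, upper, eta=1.0, eps=1e-8):
    """One illustrative Adagrad-style step projected onto box constraints.

    Only the gradient is used; the objective value f(x) is never evaluated.
    `accum` holds the running sum of squared gradient components.
    """
    accum = accum + grad**2                   # accumulate squared gradients
    step = eta * grad / np.sqrt(accum + eps)  # componentwise Adagrad scaling
    x_new = np.clip(x - step, lower, upper)   # projection onto [lower, upper]
    return x_new, accum
```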

An optimally fast objective-function-free minimization algorithm using random subspaces


OFFO minimization algorithms for second-order optimality and their complexity

An Adagrad-inspired class of algorithms for smooth unconstrained optimization is presented in which the objective function is never evaluated and yet the gradient norms decrease at least as fast as O(1/\sqrt{k+1}) while second-order optimality measures converge to zero at least as fast as O(1/(k+1)^{1/3}). This latter rate of convergence is shown to be essentially sharp …
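
For readers unfamiliar with how such rates are usually stated, one common reading (our interpretation, not a quote from the paper) is as a bound on the best point found among the first k+1 iterates:

\min_{0 \le j \le k} \|\nabla f(x_j)\| = O(1/\sqrt{k+1}), \qquad \min_{0 \le j \le k} \mu_j = O(1/(k+1)^{1/3}),

where \mu_j denotes a second-order optimality measure at iterate x_j (for instance \max(0, -\lambda_{\min}(\nabla^2 f(x_j))); the symbols \mu_j and x_j are our notation).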

First-Order Objective-Function-Free Optimization Algorithms and Their Complexity

A class of algorithms for unconstrained nonconvex optimization is considered where the value of the objective function is never computed. The class contains a deterministic version of the first-order Adagrad method typically used for the minimization of noisy functions, but also allows the use of second-order information when available. The rate of convergence of methods in …
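
As a point of reference, the textbook (deterministic, componentwise) Adagrad recursion that this class contains can be sketched as follows, assuming the notation g_k = \nabla f(x_k) and a small constant \varsigma > 0 (our notation, not the paper's):

x_{k+1} = x_k - g_k / w_k, \qquad (w_k)_i = \sqrt{\varsigma + \sum_{j=0}^{k} (g_j)_i^2},

so that the step size in each coordinate shrinks with the accumulated squared gradients, and no objective-function value is ever required.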