Negative Curvature Methods with High-Probability Complexity Guarantees for Stochastic Nonconvex Optimization

This paper develops negative curvature methods for continuous nonlinear unconstrained optimization in stochastic settings, in which function, gradient, and Hessian information is available only through probabilistic oracles, i.e., oracles that return approximations of a certain accuracy and reliability. We introduce conditions on these oracles and design a two-step framework that systematically combines gradient and negative …

Exploiting Negative Curvature in Conjunction with Adaptive Sampling: Theoretical Results and a Practical Algorithm

In this paper, we propose algorithms that exploit negative curvature for solving noisy nonlinear nonconvex unconstrained optimization problems. We consider both deterministic and stochastic inexact settings, and develop two-step algorithms that combine directions of negative curvature and descent directions to update the iterates. Under reasonable assumptions, we prove second-order convergence results and derive complexity guarantees …
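To illustrate the kind of two-step update both abstracts describe, here is a minimal sketch: a descent (gradient) step followed by a step along a direction of negative curvature when the Hessian has a negative eigenvalue. This is a simplified illustration, not the papers' actual algorithms; the function `two_step_update`, the fixed step sizes `alpha` and `beta`, and the eigendecomposition-based curvature test are all assumptions made for the example.

```python
import numpy as np

def two_step_update(x, grad, hess, alpha=0.1, beta=0.1, tol=1e-8):
    """Illustrative two-step iterate update: a gradient step, then (if the
    Hessian has a sufficiently negative eigenvalue) a step along a direction
    of negative curvature. Fixed step sizes are used here for simplicity;
    the papers develop more refined step-size and sampling strategies."""
    # Step 1: descent step along the negative gradient.
    x = x - alpha * grad
    # Step 2: exploit negative curvature if present.
    eigvals, eigvecs = np.linalg.eigh(hess)
    if eigvals[0] < -tol:          # leftmost eigenvalue is negative
        d = eigvecs[:, 0]          # corresponding eigenvector
        # Orient d so it does not ascend along the gradient.
        if grad @ d > 0:
            d = -d
        x = x + beta * abs(eigvals[0]) * d
    return x

# Example on the saddle function f(x) = x0^2 - x1^2, near the saddle at 0,
# where a pure gradient method would stall along the x1 axis.
f = lambda x: x[0]**2 - x[1]**2
x = np.array([1.0, 0.1])
g = np.array([2 * x[0], -2 * x[1]])   # exact gradient of f at x
H = np.diag([2.0, -2.0])              # exact (constant) Hessian of f
x_new = two_step_update(x, g, H)
print(f(x_new) < f(x))                # the combined step decreases f
```

The negative-curvature step is what lets such methods escape saddle points and target second-order stationarity, which first-order descent alone cannot guarantee.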