Exploiting Negative Curvature in Conjunction with Adaptive Sampling: Theoretical Results and a Practical Algorithm

In this paper, we propose algorithms that exploit negative curvature for solving noisy nonlinear nonconvex unconstrained optimization problems. We consider both deterministic and stochastic inexact settings, and develop two-step algorithms that combine directions of negative curvature and descent directions to update the iterates. Under reasonable assumptions, we prove second-order convergence results and derive complexity guarantees …
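
The two-step idea can be illustrated with a minimal sketch, assuming exact gradients and Hessians and illustrative fixed step sizes alpha and beta (the paper's actual step-size rules and inexactness handling are omitted):

```python
import numpy as np

def two_step_iteration(grad, hess, x, alpha=0.1, beta=0.1, tol=1e-8):
    # Step 1: descent step along the negative gradient.
    x = x - alpha * grad(x)
    # Step 2: negative-curvature step. eigh returns eigenvalues in
    # ascending order, so lam[0] is the smallest one.
    lam, V = np.linalg.eigh(hess(x))
    if lam[0] < -tol:
        d = V[:, 0]
        # Orient d so it is a non-ascent direction at the new point.
        if grad(x) @ d > 0:
            d = -d
        x = x + beta * abs(lam[0]) * d
    return x
```

At the saddle point of f(x, y) = x^2 - y^2 at the origin, for instance, the gradient step vanishes, but the curvature step still makes progress along the eigenvector of the negative eigenvalue.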

Yet another fast variant of Newton’s method for nonconvex optimization

A second-order algorithm is proposed for minimizing smooth nonconvex functions that alternates between regularized Newton and negative curvature steps. In most cases, the Hessian matrix is regularized with the square root of the current gradient norm and an additional term taking moderate negative curvature into account, with a negative curvature step taken only exceptionally. As a …
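
A hedged sketch of this style of gradient-regularized Newton step, assuming dense exact derivatives; the shift rule below (the square root of the gradient norm plus a term offsetting negative curvature) is a plausible reading of the description above, not the paper's exact update:

```python
import numpy as np

def regularized_newton_step(grad, hess, x, eps=1e-12):
    g = grad(x)
    H = hess(x)
    # eigvalsh returns eigenvalues in ascending order.
    lam_min = np.linalg.eigvalsh(H)[0]
    # Shift by sqrt(||g||), plus enough to offset any negative
    # curvature, so the shifted matrix is positive definite.
    sigma = np.sqrt(np.linalg.norm(g)) + max(0.0, -lam_min) + eps
    d = np.linalg.solve(H + sigma * np.eye(len(x)), -g)
    return x + d
```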

Detecting negative eigenvalues of exact and approximate Hessian matrices in optimization

Nonconvex minimization algorithms often benefit from the use of second-order information as represented by the Hessian matrix. When the Hessian at a critical point possesses negative eigenvalues, the corresponding eigenvectors can be used to search for further improvement in the objective function value. Computing such eigenpairs can be computationally challenging, particularly if the Hessian matrix …
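
For large problems where forming the Hessian explicitly is impractical, one standard matrix-free approach is Lanczos iteration on Hessian-vector products. The SciPy-based sketch below (an illustration, not the paper's method) asks for the smallest algebraic eigenvalue:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

def smallest_eigpair(hessvec, n, tol=1e-6):
    # Wrap the Hessian-vector product as a linear operator and run
    # Lanczos; which='SA' requests the smallest algebraic eigenvalue.
    H = LinearOperator((n, n), matvec=hessvec)
    lam, v = eigsh(H, k=1, which='SA', tol=tol)
    return lam[0], v[:, 0]

# Example: H = diag(-1, 2, 3), available only through products H @ v.
hv = lambda v: np.array([-1.0, 2.0, 3.0]) * v
lam, v = smallest_eigpair(hv, 3)
print(lam < 0)  # True: a direction of negative curvature exists
```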

Trust-Region Newton-CG with Strong Second-Order Complexity Guarantees for Nonconvex Optimization

Worst-case complexity guarantees for nonconvex optimization algorithms have been a topic of growing interest. Multiple frameworks that achieve the best known complexity bounds among a broad class of first- and second-order strategies have been proposed. These methods have often been designed primarily with complexity guarantees in mind and, as a result, represent a departure from …
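
The classical building block behind trust-region Newton-CG methods is the Steihaug-Toint truncated CG solve, which stops early when it detects negative curvature or hits the trust-region boundary. The sketch below shows that classical scheme, not the strengthened variant with second-order complexity guarantees proposed in the paper:

```python
import numpy as np

def boundary_step(p, d, delta):
    # Largest tau >= 0 with ||p + tau * d|| = delta.
    a = d @ d
    b = 2 * (p @ d)
    c = p @ p - delta**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return tau * d

def steihaug_cg(H, g, delta, tol=1e-8, max_iter=100):
    # Approximately minimize g @ p + 0.5 * p @ H @ p over ||p|| <= delta.
    p = np.zeros_like(g)
    r = g.copy()
    d = -r
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter):
        Hd = H @ d
        dHd = d @ Hd
        if dHd <= 0:
            # Negative curvature: follow d to the boundary.
            return p + boundary_step(p, d, delta)
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            return p + boundary_step(p, d, delta)
        r_next = r + alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        p, r = p_next, r_next
        d = -r + beta * d
    return p
```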

A Subsampling Line-Search Method with Second-Order Results

In many contemporary optimization problems such as those arising in machine learning, it can be computationally challenging or even infeasible to evaluate an entire function or its derivatives. This motivates the use of stochastic algorithms that sample problem data, which can jeopardize the guarantees obtained through classical globalization techniques in optimization such as a trust …
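
A minimal sketch of the subsampling ingredient, assuming a finite-sum objective f(x) = (1/N) sum_i f_i(x); the helper names grad_i and f_sample and the fixed batch size are illustrative, not the paper's notation:

```python
import numpy as np

def subsampled_grad(grad_i, x, N, batch, rng):
    # Average the gradient over a random subset of the N data terms.
    idx = rng.choice(N, size=batch, replace=False)
    return sum(grad_i(i, x) for i in idx) / batch

def armijo_backtrack(f_sample, x, g, d, alpha=1.0, rho=0.5,
                     c=1e-4, max_iter=30):
    # Armijo backtracking on the sampled objective; with subsampling
    # the sufficient-decrease condition holds only up to sampling noise.
    fx = f_sample(x)
    for _ in range(max_iter):
        if f_sample(x + alpha * d) <= fx + c * alpha * (g @ d):
            break
        alpha *= rho
    return x + alpha * d
```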

Exploiting Negative Curvature in Deterministic and Stochastic Optimization

This paper addresses the question of whether it can be beneficial for an optimization algorithm to follow directions of negative curvature. Although some prior work has established convergence results for algorithms that integrate both descent and negative curvature directions, there has not yet been numerical evidence showing that such methods offer significant performance improvements. In …

A hybrid algorithm for nonlinear equality constrained optimization problems: global and local convergence theory

In this paper, we combine trust-region and linesearch globalization strategies in a globally convergent hybrid algorithm to solve a continuously differentiable nonlinear equality constrained minimization problem. First, the trust-region approach is used to determine a descent direction of the augmented Lagrangian, chosen as the merit function, and second, linesearch techniques are used to obtain …
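
For concreteness, a minimal sketch of the augmented Lagrangian merit function phi(x) = f(x) - lam^T c(x) + (mu/2) ||c(x)||^2 (one common sign convention) and a simple backtracking linesearch along a direction d supplied by a trust-region step; both are illustrations, not the paper's algorithm:

```python
import numpy as np

def augmented_lagrangian(f, c, lam, mu):
    # Merit function for min f(x) subject to c(x) = 0.
    def phi(x):
        cx = np.atleast_1d(c(x))
        return f(x) - lam @ cx + 0.5 * mu * (cx @ cx)
    return phi

def linesearch_on_merit(phi, x, d, alpha=1.0, rho=0.5,
                        eps=1e-8, max_iter=30):
    # Backtrack along the trust-region direction d until the merit
    # function decreases sufficiently.
    for _ in range(max_iter):
        if phi(x + alpha * d) < phi(x) - eps * alpha:
            return x + alpha * d
        alpha *= rho
    return x
```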