A Newton-CG Algorithm with Complexity Guarantees for Smooth Unconstrained Optimization

We consider minimization of a smooth nonconvex objective function using an iterative algorithm based on Newton's method and linear conjugate gradient, with explicit detection and use of negative curvature directions for the Hessian of the objective function. The algorithm closely follows the Newton-conjugate gradient procedures developed in the 1980s, but includes enhancements that allow worst-case complexity …
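To illustrate the core mechanism, here is a minimal sketch of linear conjugate gradient applied to the Newton system that halts and returns a direction of nonpositive curvature whenever one is encountered. The function name `cg_with_curvature_check` and its parameters are illustrative; the paper's capped variant includes further safeguards not shown here.

```python
import numpy as np

def cg_with_curvature_check(H, g, tol=1e-8, max_iters=100):
    """Linear CG on the Newton system H d = -g that stops as soon as a
    direction of nonpositive curvature is found and returns it, so the
    outer iteration can use it as a descent direction."""
    d = np.zeros_like(g)
    r = g.copy()              # residual of H d + g (d = 0 initially)
    p = -r                    # first CG search direction
    for _ in range(max_iters):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0:         # negative (or zero) curvature detected
            return p, "negative_curvature"
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= tol:
            return d, "newton_step"
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d, "iteration_limit"
```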

Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization

There has been much recent interest in finding unconstrained local minima of smooth functions, due in part to the prevalence of such problems in machine learning and robust statistics. A particular focus is algorithms with good complexity guarantees. Second-order Newton-type methods that make use of regularization and trust regions have been analyzed from such a …
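As a rough picture of the algorithm class being analyzed, the sketch below takes one step of a generic second-order line-search scheme: it follows a negative curvature eigenvector when one exists and a regularized Newton direction otherwise, with Armijo-style backtracking. All names and thresholds here are assumptions for illustration; the methods covered by the analysis differ in their exact step selection and acceptance rules.

```python
import numpy as np

def second_order_linesearch_step(f, grad, hess, x, eps=1e-6, beta=0.5, c=1e-4):
    """One step of a generic second-order line-search scheme (sketch)."""
    g, H = grad(x), hess(x)
    lam, V = np.linalg.eigh(H)
    if lam[0] < -eps:
        d = V[:, 0]                      # direction of most negative curvature
        if g @ d > 0:
            d = -d                       # orient it downhill
    else:
        # shift the Hessian just enough to make it safely positive definite
        shift = max(0.0, eps - lam[0])
        d = np.linalg.solve(H + shift * np.eye(len(x)), -g)
    t = 1.0
    for _ in range(50):                  # simple backtracking line search
        if f(x + t * d) <= f(x) + c * t * min(g @ d, 0.0):
            break
        t *= beta
    return x + t * d
```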

A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds

In order to be provably convergent towards a second-order stationary point, optimization methods applied to nonconvex problems must necessarily exploit both first- and second-order information. However, as revealed by recent complexity analyses of some of these methods, the overall effort required to reach second-order points is significantly larger than that of approaching first-order …
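The decoupling idea can be sketched as follows: first- and second-order steps are selected by separate stationarity tests rather than combined in a single step computation. This is a hedged illustration under assumed tolerances `eps_g` and `eps_H`, not the paper's exact step rules.

```python
import numpy as np

def decoupled_step(grad, hess, x, eps_g=1e-5, eps_H=1e-5, alpha=1.0):
    """Take a first-order step while the gradient is large, and switch to
    a second-order (negative curvature) step only near first-order
    stationarity; the two step types are governed by separate tests."""
    g = grad(x)
    if np.linalg.norm(g) > eps_g:
        return x - alpha * g             # first-order (gradient) step
    lam, V = np.linalg.eigh(hess(x))
    if lam[0] < -eps_H:
        d = V[:, 0]                      # negative curvature direction
        if g @ d > 0:
            d = -d
        return x + alpha * d             # second-order step
    return x                             # approximate second-order point
```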

Complexity and global rates of trust-region methods based on probabilistic models

Trust-region algorithms have been proved to converge globally with probability one when the accuracy of the trust-region models is imposed with a certain probability, conditioned on the iteration history. In this paper, we study their complexity, providing global rates and worst-case complexity bounds on the number of iterations (with overwhelmingly high probability), for both …
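For context, the sketch below shows the standard deterministic skeleton of one trust-region iteration using a Cauchy step. In the probabilistic setting studied here, `model_g` and `model_H` would come from a random model that is only required to be accurate with a certain probability; the acceptance test and radius update are unchanged. Parameter names and values are illustrative.

```python
import numpy as np

def trust_region_step(f, x, model_g, model_H, delta,
                      eta=0.1, gamma_dec=0.5, gamma_inc=2.0):
    """One basic trust-region iteration using the Cauchy step (sketch)."""
    gnorm = np.linalg.norm(model_g)
    if gnorm == 0:
        return x, delta
    # Minimize the quadratic model along -model_g inside the ball of radius delta
    curv = model_g @ (model_H @ model_g)
    t = delta / gnorm if curv <= 0 else min(gnorm**2 / curv, delta / gnorm)
    s = -t * model_g
    pred = -(model_g @ s + 0.5 * s @ (model_H @ s))   # predicted decrease
    ared = f(x) - f(x + s)                            # actual decrease
    if pred > 0 and ared / pred >= eta:
        return x + s, gamma_inc * delta               # accept step, expand radius
    return x, gamma_dec * delta                       # reject step, shrink radius
```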

Direct search based on probabilistic feasible descent for bound and linearly constrained problems

Direct search is a methodology for derivative-free optimization whose iterations are characterized by evaluating the objective function using a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must somehow conform to the geometry of the feasible region and typically consist of positive generators of approximate tangent cones (which then …
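In the bound-constrained case, the coordinate directions positively generate the tangent cones of the feasible box, so a polling step can be sketched as below. The function name, the sufficient-decrease function `rho`, and the step-size updates are assumptions for illustration.

```python
import numpy as np

def poll_step_bounds(f, x, lb, ub, alpha, rho=lambda a: a**2):
    """One polling iteration of direct search on the box [lb, ub]:
    try the +/- coordinate directions, keeping only feasible trial
    points, and require sufficient decrease rho(alpha) to accept."""
    n = len(x)
    D = np.vstack([np.eye(n), -np.eye(n)])    # +/- coordinate directions
    for d in D:
        trial = x + alpha * d
        if np.all(trial >= lb) and np.all(trial <= ub) \
           and f(trial) < f(x) - rho(alpha):  # feasible + sufficient decrease
            return trial, 2 * alpha           # success: expand step size
    return x, alpha / 2                       # failure: shrink step size
```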

A second-order globally convergent direct-search method and its worst-case complexity

Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate the steepest …
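One ingredient that makes second-order reasoning possible without derivatives is that a symmetric polling set (containing both d and -d) yields curvature information for free: the three function values already computed give a central-difference estimate of the curvature along d. A minimal sketch of that estimate, under assumed names:

```python
def curvature_along(f, x, d, alpha):
    """Central-difference estimate of the curvature d' H d at x.
    When a polling set contains both d and -d, the values f(x + alpha d)
    and f(x - alpha d) are already available from polling."""
    return (f(x + alpha * d) - 2.0 * f(x) + f(x - alpha * d)) / alpha**2
```

A negative estimate flags a direction of likely negative curvature, which a second-order direct-search method can then exploit before declaring the iteration unsuccessful.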

Direct search based on probabilistic descent

Direct-search methods are a class of popular derivative-free algorithms characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets which in turn must have at least n+1 vectors in an n-dimensional …
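The probabilistic alternative replaces the positive spanning set with a small number of random directions that are descent only with sufficiently high probability, which is what the analysis exploits. A hedged sketch, with the number of directions `m`, the decrease threshold, and the step-size updates all chosen for illustration:

```python
import numpy as np

def random_poll_step(f, x, alpha, m=2, rng=None):
    """Poll along m random unit directions and their negatives instead
    of a full positive spanning set of n+1 or more vectors."""
    rng = rng or np.random.default_rng()
    n = len(x)
    for _ in range(m):
        d = rng.standard_normal(n)
        d /= np.linalg.norm(d)               # uniform direction on the sphere
        for s in (d, -d):
            if f(x + alpha * s) < f(x) - alpha**2:   # sufficient decrease
                return x + alpha * s, 2 * alpha      # success: expand
    return x, alpha / 2                              # failure: shrink
```

The appeal of this scheme is that the per-iteration cost no longer grows like n+1 function evaluations; the price is that descent holds only probabilistically, which is where the probabilistic analysis enters.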