A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds

In order to be provably convergent towards a second-order stationary point, optimization methods applied to nonconvex problems must necessarily exploit both first and second-order information. However, as revealed by recent complexity analyses of some of these methods, the overall effort to reach second-order points is significantly larger than that required to approach first-order …
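
As a purely illustrative Python sketch (not the paper's algorithm), the decoupling idea can be pictured as follows: take a gradient step while the gradient is large, and switch to a negative-curvature step of the Hessian otherwise. The function names, step size alpha, and tolerances eps_g and eps_H below are hypothetical.

import numpy as np

def decoupled_step(grad, hess, x, alpha=1e-2, eps_g=1e-6, eps_H=1e-6):
    # First-order step: follow the negative gradient while it is large.
    g = grad(x)
    if np.linalg.norm(g) > eps_g:
        return x - alpha * g
    # Second-order step: exploit a direction of sufficient negative curvature.
    lam, V = np.linalg.eigh(hess(x))   # eigenvalues in ascending order
    if lam[0] < -eps_H:
        d = V[:, 0]
        if g @ d > 0:
            d = -d                     # orient d as a descent direction
        return x + alpha * d
    return x                           # approximately second-order stationary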

Complexity and global rates of trust-region methods based on probabilistic models

Trust-region algorithms have been proved to converge globally with probability one when the accuracy of the trust-region models is imposed with a certain probability, conditioned on the iteration history. In this paper, we study their complexity, providing global rates and worst-case complexity bounds on the number of iterations (with overwhelmingly high probability), for both …
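
For intuition only, here is a minimal deterministic trust-region iteration in Python using a Cauchy (steepest-descent) step on the quadratic model; in the setting of the paper, the model gradient g and Hessian B would only be accurate with a certain probability. All names and constants are illustrative.

import numpy as np

def trust_region_step(f, g, B, x, delta, eta=0.1):
    # Cauchy step: minimize the model f(x) + g.s + 0.5 s'Bs along -g
    # subject to ||s|| <= delta (assumes g is nonzero).
    gn = np.linalg.norm(g)
    curv = g @ B @ g
    t = delta / gn if curv <= 0 else min(gn**2 / curv, delta / gn)
    s = -t * g
    pred = -(g @ s + 0.5 * s @ B @ s)  # predicted model decrease (> 0)
    rho = (f(x) - f(x + s)) / pred     # actual-over-predicted decrease
    if rho >= eta:
        return x + s, 2.0 * delta      # successful: accept and expand radius
    return x, 0.5 * delta              # unsuccessful: reject and shrink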

Direct search based on probabilistic feasible descent for bound and linearly constrained problems

Direct search is a methodology for derivative-free optimization whose iterations are characterized by evaluating the objective function using a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must somehow conform to the geometry of the feasible region and typically consist of positive generators of approximate tangent cones (which then …
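
In the simplest special case, bound constraints, the coordinate directions and their negatives positively generate every tangent cone, so a feasible poll step can be sketched as below (a toy Python illustration with a sufficient-decrease forcing function alpha^2; the paper's probabilistic machinery and general linear constraints are not reflected here).

import numpy as np

def feasible_poll(f, x, alpha, lo, hi):
    fx = f(x)
    # Poll along +/- coordinate directions, keeping iterates in [lo, hi].
    for d in np.vstack([np.eye(x.size), -np.eye(x.size)]):
        y = x + alpha * d
        if np.all(y >= lo) and np.all(y <= hi) and f(y) < fx - alpha**2:
            return y, alpha            # successful feasible poll step
    return x, alpha / 2                # unsuccessful: shrink the step size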

A second-order globally convergent direct-search method and its worst-case complexity

Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate the steepest …
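
One way to picture how a direct search can see second-order information without derivatives: polling symmetrically along each direction yields a centered difference that estimates curvature, flagging approximate negative curvature. The Python sketch below is a hypothetical illustration of this idea, not the method analyzed in the paper.

import numpy as np

def curvature_probe(f, x, alpha):
    fx = f(x)
    best_y, best_fy = None, fx
    for d in np.eye(x.size):
        fp, fm = f(x + alpha * d), f(x - alpha * d)
        # Centered difference ~ alpha^2 * d'H(x)d: negative flags curvature.
        if fp + fm - 2.0 * fx < 0:
            y, fy = (x + alpha * d, fp) if fp < fm else (x - alpha * d, fm)
            if fy < best_fy:
                best_y, best_fy = y, fy
    return best_y if best_y is not None else x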

Trust-region methods without using derivatives: Worst case complexity and the non-smooth case

Trust-region methods are a broad class of methods for continuous optimization that have found application in a variety of problems and contexts. In particular, they have been studied and applied to problems without using derivatives. The analysis of trust-region derivative-free methods has focused on global convergence, and they have been proved to generate a sequence of …
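
The basic ingredient such methods use in place of derivatives is a model built from function samples. A minimal sketch, assuming a simple forward-difference simplex gradient (the resulting linear model would then be trusted on a ball of radius delta and minimized as in a standard trust-region step):

import numpy as np

def linear_model(f, x, delta):
    # Fit m(s) = f(x) + g.s by sampling f at x and at x + delta*e_i.
    fx = f(x)
    g = np.array([(f(x + delta * e) - fx) / delta for e in np.eye(x.size)])
    return fx, g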

A Parallel Evolution Strategy for an Earth Imaging Problem in Geophysics

In this paper we propose a new way to compute a warm starting point for a challenging global optimization problem related to Earth imaging in geophysics. The warm start consists of a velocity model that approximately solves a full-waveform inverse problem at low frequency. Our motivation arises from the availability of massively parallel computing platforms …

On the optimal order of worst case complexity of direct search

The worst case complexity of direct-search methods has been recently analyzed when they use positive spanning sets and impose a sufficient decrease condition to accept new iterates. Assuming that the objective function is smooth, it is now known that such methods require at most O(n^2 epsilon^{-2}) function evaluations to compute a gradient of norm below …
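
A back-of-envelope accounting of where the two factors of n come from (a sketch consistent with the standard analysis, not the paper's argument): with the positive spanning set D = [I, -I], whose cosine measure is 1/sqrt(n), the iteration bound scales with cm(D)^{-2} epsilon^{-2}, and each iteration costs at most |D| = 2n evaluations:

\[
\underbrace{\mathcal{O}\big(\mathrm{cm}(D)^{-2}\,\epsilon^{-2}\big)}_{\text{iterations}}
\times \underbrace{|D|}_{\text{evaluations per iteration}}
= \mathcal{O}(n\,\epsilon^{-2}) \times 2n
= \mathcal{O}(n^{2}\,\epsilon^{-2}).
\]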

Globally Convergent Evolution Strategies for Constrained Optimization

In this work we propose, analyze, and test algorithms for linearly constrained optimization when no use of derivatives of the objective function is made. The proposed methodology is built upon the globally convergent evolution strategies previously introduced by the authors for unconstrained optimization. Two approaches are considered to handle the constraints. In a first approach, …
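
As a toy Python illustration of one natural way to keep evolution-strategy iterates feasible (an extreme barrier over simple bounds, shown here in place of general linear constraints; all names and parameters are hypothetical, and the step-size safeguards needed for global convergence are omitted):

import numpy as np

def es_iteration(f, x, sigma, lo, hi, lam=10, mu=3, rng=None):
    rng = rng or np.random.default_rng()
    # Sample lambda offspring around the current point.
    pop = x + sigma * rng.standard_normal((lam, x.size))
    # Extreme barrier: infeasible offspring get +inf and are never selected.
    vals = [f(y) if np.all((y >= lo) & (y <= hi)) else np.inf for y in pop]
    elite = pop[np.argsort(vals)[:mu]]
    return elite.mean(axis=0)          # recombination of the mu best offspring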

Levenberg-Marquardt methods based on probabilistic gradient models and inexact subproblem solution, with application to data assimilation

The Levenberg-Marquardt algorithm is one of the most popular algorithms for the solution of nonlinear least-squares problems. Motivated by the problem structure in data assimilation, we consider in this paper the extension of the classical Levenberg-Marquardt algorithm to scenarios where the linearized least-squares subproblems are solved inexactly and/or the gradient model is …
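
For reference, the classical (exact) Levenberg-Marquardt step for min 0.5*||r(x)||^2 solves the regularized normal equations; in the inexact setting the paper considers, this linear solve would be truncated (e.g. a few conjugate-gradient iterations) rather than direct. A minimal Python sketch:

import numpy as np

def lm_step(r, J, x, mu):
    # Solve (J'J + mu*I) s = -J'r for the step s; mu > 0 regularizes.
    Jx, rx = J(x), r(x)
    s = np.linalg.solve(Jx.T @ Jx + mu * np.eye(x.size), -Jx.T @ rx)
    return x + s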

Direct search based on probabilistic descent

Direct-search methods are a class of popular derivative-free algorithms characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets which in turn must have at least n+1 vectors in an n-dimensional …
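
Schematically (a hypothetical Python sketch, not the paper's exact method), probabilistic descent replaces the full positive spanning set with a few random directions per iteration, so that descent along a poll direction holds only with a certain probability:

import numpy as np

def random_poll(f, x, alpha, m=2, rng=None):
    rng = rng or np.random.default_rng()
    fx = f(x)
    for _ in range(m):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)             # random unit poll direction
        for s in (d, -d):
            if f(x + alpha * s) < fx - alpha**2:   # sufficient decrease
                return x + alpha * s, alpha        # success: keep step size
    return x, alpha / 2                    # failure: shrink the step size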