A Parallel Evolution Strategy for an Earth Imaging Problem in Geophysics

In this paper we propose a new way to compute a warm starting point for a challenging global optimization problem related to Earth imaging in geophysics. The warm start consists of a velocity model that approximately solves a full-waveform inverse problem at low frequency. Our motivation arises from the availability of massively parallel computing platforms …

Globally Convergent Evolution Strategies for Constrained Optimization

In this work we propose, analyze, and test algorithms for linearly constrained optimization when no use of derivatives of the objective function is made. The proposed methodology is built upon the globally convergent evolution strategies previously introduced by the authors for unconstrained optimization. Two approaches are considered to handle the constraints. In a first approach, …

Levenberg-Marquardt methods based on probabilistic gradient models and inexact subproblem solution, with application to data assimilation

The Levenberg-Marquardt algorithm is one of the most popular algorithms for the solution of nonlinear least squares problems. Motivated by the problem structure in data assimilation, we consider in this paper the extension of the classical Levenberg-Marquardt algorithm to the scenarios where the linearized least squares subproblems are solved inexactly and/or the gradient model is …
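
For readers who want a concrete picture of the baseline method, the following is a minimal Levenberg-Marquardt sketch in Python for a generic nonlinear least squares problem. The exponential-fit test problem, the damping update rule, and the tolerances are illustrative assumptions; the sketch does not reflect the data-assimilation structure, the inexact subproblem solves, or the probabilistic gradient models studied in the paper.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, max_iter=100, tol=1e-8):
    """Minimize 0.5 * ||residual(x)||^2 with a damped Gauss-Newton (LM) iteration."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        g = J.T @ r                                    # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < tol:
            break
        # Damped normal equations: (J^T J + lam I) step = -g
        step = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -g)
        r_new = residual(x + step)
        if r_new @ r_new < r @ r:
            x, lam = x + step, 0.5 * lam               # accept step, relax damping
        else:
            lam = 2.0 * lam                            # reject step, increase damping
    return x

# Illustrative test: fit y = a * exp(b * t) to noiseless data generated with a = 2, b = -1
t = np.linspace(0.0, 2.0, 5)
y = 2.0 * np.exp(-t)
residual = lambda p: p[0] * np.exp(p[1] * t) - y
jacobian = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
print(levenberg_marquardt(residual, jacobian, [1.0, 0.0]))   # approx [2, -1]
```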

Direct search based on probabilistic descent

Direct-search methods are a class of popular derivative-free algorithms characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets which in turn must have at least n+1 vectors in an n-dimensional …
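
As a point of reference for the deterministic setting described above, here is a minimal directional direct-search sketch in Python that polls the 2n coordinate directions (a maximal positive spanning set). The test function, the expansion/contraction factors, and the stopping tolerance are illustrative assumptions; the probabilistic-descent variant analyzed in the paper, which uses randomly generated polling directions, is not implemented here.

```python
import numpy as np

def direct_search(f, x0, alpha=1.0, max_iter=500, alpha_min=1e-8):
    """Coordinate-direction polling with simple decrease and step-size updates."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    D = np.vstack([np.eye(n), -np.eye(n)])   # maximal positive basis: +/- e_i, 2n directions
    fx = f(x)
    for _ in range(max_iter):
        if alpha < alpha_min:
            break
        improved = False
        for d in D:                          # poll step
            trial = x + alpha * d
            f_trial = f(trial)
            if f_trial < fx:                 # simple decrease: accept and stop polling
                x, fx, improved = trial, f_trial, True
                break
        alpha = 2.0 * alpha if improved else 0.5 * alpha   # expand or contract the step size
    return x, fx

x, fx = direct_search(lambda z: (z[0] - 1.0) ** 2 + 10.0 * (z[1] + 2.0) ** 2, [0.0, 0.0])
print(x, fx)   # x approx [1, -2]
```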

Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given. Citation: Report naXys-09-2013, Namur Centre for Complex Systems, University of Namur, Namur (Belgium).

Adaptive Observations And Multilevel Optimization In Data Assimilation

We propose to use a decomposition of large-scale incremental four-dimensional variational (4D-Var) data assimilation problems in order to make their numerical solution more efficient. This decomposition is based on exploiting an adaptive hierarchy of the observations. Starting with a low-cardinality set and the solution of its corresponding optimization problem, observations are adaptively added based on …

A merit function approach for direct search

In this paper we propose to equip direct-search methods with a general procedure for minimizing an objective function, possibly non-smooth, without using derivatives and subject to constraints on the variables. The aim is to handle constraints, possibly nonlinear or non-smooth, for which the derivatives of the corresponding functions are also unavailable. The novelty of …

Linearizing the Method of Conjugate Gradients

The method of conjugate gradients (CG) is widely used for the iterative solution of large sparse systems of equations $Ax=b$, where $A\in\Re^{n\times n}$ is symmetric positive definite. Let $x_k$ denote the $k$-th iterate of CG. In this paper we obtain an expression for $J_k$, the Jacobian matrix of $x_k$ with respect to $b$. We use …
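
For reference, a minimal conjugate-gradients sketch in Python for a symmetric positive definite system $Ax=b$ is given below. The small test matrix, the tolerance, and the iteration cap are illustrative assumptions; the Jacobian $J_k$ of $x_k$ with respect to $b$ derived in the paper is not computed here.

```python
import numpy as np

def conjugate_gradients(A, b, tol=1e-10, max_iter=None):
    """Solve Ax = b for symmetric positive definite A, starting from x = 0."""
    n = b.shape[0]
    max_iter = max_iter if max_iter is not None else n
    x = np.zeros(n)
    r = b - A @ x                          # residual
    p = r.copy()                           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)          # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p      # A-conjugate direction update
        rs_old = rs_new
    return x

# Small illustrative SPD example
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradients(A, b))           # approx [0.0909, 0.6364]
```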

Conjugate-gradients versus multigrid solvers for diffusion-based correlation models in data assimilation

This paper provides a theoretical and experimental comparison between conjugate-gradients and multigrid, two iterative schemes for solving linear systems, in the context of applying diffusion-based correlation models in data assimilation. In this context, a large number of such systems have to be (approximately) solved if the implicit mode is chosen for integrating the involved diffusion …

Globally Convergent Evolution Strategies and CMA-ES

In this paper we show how to modify a large class of evolution strategies (ES) to rigorously achieve a form of global convergence, meaning convergence to stationary points independently of the starting point. The type of ES under consideration recombines the parents by means of a weighted sum, around which the offspring are computed by …
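
To illustrate the type of ES under consideration, the following Python sketch implements a basic (mu/mu_w, lambda) evolution strategy with weighted-sum recombination: offspring are sampled around the weighted mean of the selected parents. The recombination weights, population sizes, step-size decay, and test function are illustrative assumptions; the modifications that yield the global convergence guarantees of the paper are not included.

```python
import numpy as np

def weighted_es(f, x0, sigma=0.5, lam=12, mu=4, max_iter=300, seed=0):
    """(mu/mu_w, lambda)-ES: sample offspring around a weighted mean of the best parents."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))   # decreasing positive weights
    w /= w.sum()                                          # normalize to sum to one
    mean = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        offspring = mean + sigma * rng.standard_normal((lam, n))   # sample around the mean
        values = np.array([f(z) for z in offspring])
        best = offspring[np.argsort(values)[:mu]]                  # select the mu best offspring
        mean = w @ best                                            # weighted-sum recombination
        sigma *= 0.97                                              # crude step-size decay
    return mean, f(mean)

mean, value = weighted_es(lambda z: np.sum((z - 1.0) ** 2), np.zeros(3))
print(mean, value)   # mean approaches [1, 1, 1]
```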