Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given. Report naXys-09-2013, Namur Centre for Complex Systems, University of Namur, Namur (Belgium).
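
The Sylvester structure is easy to illustrate numerically. Below is a minimal sketch, assuming only a generic correction equation of the form $AE + EB = C$ in the correction matrix $E$; the matrices are random placeholders rather than the ones produced by the paper's weighted secant conditions.

```python
# Illustrative only: solve a generic Sylvester equation A E + E B = C
# for the correction matrix E, the kind of linear-algebra step obtained
# after deriving such an equation from weighted secant conditions.
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)   # placeholder
B = rng.standard_normal((n, n)); B = B @ B.T + n * np.eye(n)   # placeholder
C = rng.standard_normal((n, n))                                # placeholder

E = solve_sylvester(A, B, C)                      # solves A E + E B = C
print(np.linalg.norm(A @ E + E @ B - C))          # ~1e-13: equation satisfied
```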

Adaptive Observations and Multilevel Optimization in Data Assimilation

We propose to use a decomposition of large-scale incremental four-dimensional variational (4D-Var) data assimilation problems in order to make their numerical solution more efficient. The decomposition exploits an adaptive hierarchy of the observations. Starting with a low-cardinality set and the solution of its corresponding optimization problem, observations are adaptively added based on … Read more
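
The adaptive idea can be sketched on a toy linear analysis problem: solve on a low-cardinality observation subset, then enlarge the subset and warm-start the next solve from the previous solution. Everything below (the selection rule, the operators, the solver) is an illustrative placeholder, not the algorithm of the paper.

```python
# Toy sketch of adaptively enlarging the observation set in a linear
# least-squares analysis, warm-starting each level from the previous one.
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(1)
n, m = 50, 400                        # state size, total number of observations
H = rng.standard_normal((m, n))       # linear observation operator (placeholder)
y = H @ rng.standard_normal(n) + 0.01 * rng.standard_normal(m)

def analyse(idx, x0):
    """Minimize 0.5*||x||^2 + 0.5*||H[idx] x - y[idx]||^2, warm-started at x0."""
    Hs, ys = H[idx], y[idx]
    x, _ = cg(np.eye(n) + Hs.T @ Hs, Hs.T @ ys, x0=x0)
    return x

x = np.zeros(n)
subset = np.arange(0, m, 8)           # low-cardinality starting set
for level in range(4):
    x = analyse(subset, x)
    print(f"level {level}: {subset.size} obs, misfit {np.linalg.norm(H @ x - y):.3f}")
    resid = np.abs(H @ x - y)         # crude adaptive rule: add the worst-fit observations
    subset = np.union1d(subset, np.argsort(resid)[-50:])
```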

A merit function approach for direct search

In this paper we propose to equip direct-search methods with a general procedure to minimize an objective function, possibly non-smooth, without using derivatives and subject to constraints on the variables. The aim is to handle constraints, most likely nonlinear or non-smooth, for which the derivatives of the corresponding functions are also unavailable. The novelty of … Read more
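
A minimal sketch of the setting, assuming an $\ell_1$-type merit function (objective plus penalized constraint violation) and a plain coordinate search with a sufficient-decrease test; the penalty form and all constants are illustrative, not the merit function proposed in the paper.

```python
# Derivative-free coordinate search driven by a simple penalty merit
# function; schematic illustration only.
import numpy as np

def f(x):                         # non-smooth objective, derivatives unavailable
    return abs(x[0] - 1.0) + (x[1] + 0.5) ** 2

def c(x):                         # constraint c(x) <= 0, derivatives unavailable
    return x[0] + x[1] - 1.0

def merit(x, mu=10.0):            # assumed l1-type merit function
    return f(x) + mu * max(0.0, c(x))

x, alpha = np.zeros(2), 1.0
D = [np.array(d, float) for d in ([1, 0], [-1, 0], [0, 1], [0, -1])]
for _ in range(200):
    improved = False
    for d in D:
        trial = x + alpha * d
        if merit(trial) < merit(x) - 1e-4 * alpha ** 2:   # sufficient decrease
            x, improved = trial, True
            break
    alpha = 2.0 * alpha if improved else 0.5 * alpha      # expand or contract the step
    if alpha < 1e-8:
        break
print("x =", x, "merit =", merit(x))
```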

Linearizing the Method of Conjugate Gradients

The method of conjugate gradients (CG) is widely used for the iterative solution of large sparse systems of equations $Ax=b$, where $A\in\Re^{n\times n}$ is symmetric positive definite. Let $x_k$ denote the $k$-th iterate of CG. In this paper we obtain an expression for $J_k$, the Jacobian matrix of $x_k$ with respect to $b$. We use … Read more
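
The object of study is easy to probe numerically: treat $k$ CG iterations as a map $b \mapsto x_k$ and estimate its Jacobian by finite differences; for $k = n$ (exact arithmetic) one recovers $A^{-1}$. The sketch below is a standard check, not the closed-form expression derived in the paper.

```python
# Finite-difference estimate of J_k, the Jacobian of the k-th CG iterate
# with respect to b; for k = n it should be close to inv(A).
import numpy as np

def cg_iterate(A, b, k):
    """Return the k-th conjugate-gradient iterate for A x = b, x0 = 0."""
    x, r = np.zeros_like(b), b.copy()
    p = r.copy()
    for _ in range(k):
        if r @ r < 1e-30:                 # already converged
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

rng = np.random.default_rng(2)
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)               # symmetric positive definite
b = rng.standard_normal(n)

k, eps = n, 1e-6
Jk = np.empty((n, n))
for j in range(n):                        # central differences, column by column
    e = np.zeros(n); e[j] = eps
    Jk[:, j] = (cg_iterate(A, b + e, k) - cg_iterate(A, b - e, k)) / (2 * eps)

print(np.linalg.norm(Jk - np.linalg.inv(A)))   # small: J_n approximates A^{-1}
```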

Conjugate-gradients versus multigrid solvers for diffusion-based correlation models in data assimilation

This paper provides a theoretical and experimental comparison between conjugate-gradients and multigrid, two iterative schemes for solving linear systems, in the context of applying diffusion-based correlation models in data assimilation. In this context, a large number of such systems have to be (approximately) solved if the implicit mode is chosen for integrating the involved diffusion … Read more
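
The systems in question arise from implicit time-stepping of a diffusion operator. Below is a minimal sketch (1D Laplacian, Dirichlet boundaries, arbitrary parameters) of one such solve with conjugate gradients; a multigrid solver would simply replace the cg call, for instance via pyamg if that package is available.

```python
# One implicit diffusion step, (I - dt*kappa*L) z = r, solved with CG;
# the discretization and parameters are illustrative only.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

n, dt, kappa = 500, 1.0, 0.5
L = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n), format="csr")  # 1D Laplacian
A = sp.eye(n, format="csr") - dt * kappa * L       # symmetric positive definite

r = np.exp(-0.01 * (np.arange(n) - n / 2) ** 2)    # field to be smoothed
z, info = cg(A, r)                                  # info == 0 means CG converged
print("converged:", info == 0, "residual:", np.linalg.norm(A @ z - r))
# Multigrid alternative (assuming pyamg is installed):
#   z = pyamg.ruge_stuben_solver(A).solve(r)
```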

Globally Convergent Evolution Strategies and CMA-ES

In this paper we show how to modify a large class of evolution strategies (ES) to rigorously achieve a form of global convergence, meaning convergence to stationary points independently of the starting point. The type of ES under consideration recombines the parents by means of a weighted sum, around which the offspring are computed by … Read more
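
The class of methods referred to can be sketched as follows: sample $\lambda$ offspring around the current point, recombine the best $\mu$ of them with a weighted sum, and control the step size. The sufficient-decrease test used below is a schematic stand-in for the globalization mechanism analyzed in the paper, not the paper's actual condition.

```python
# Schematic (mu/mu_w, lambda)-ES with weighted-sum recombination and a
# crude sufficient-decrease control of the step size; illustrative only.
import numpy as np

def f(x):
    return np.sum(x ** 2) + 0.1 * np.sum(np.abs(x))   # toy objective

rng = np.random.default_rng(3)
n, lam, mu = 10, 20, 5
w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))   # positive, decreasing weights
w /= w.sum()

x, sigma = rng.standard_normal(n), 1.0
fx = f(x)
for _ in range(300):
    offspring = x + sigma * rng.standard_normal((lam, n))
    vals = np.array([f(z) for z in offspring])
    best = offspring[np.argsort(vals)[:mu]]
    cand = w @ best                                    # weighted-sum recombination
    if f(cand) <= fx - 1e-4 * sigma ** 2:              # accept: sufficient decrease
        x, fx, sigma = cand, f(cand), 1.2 * sigma
    else:                                              # reject: shrink the step size
        sigma *= 0.5
print("f(x) =", fx)
```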

A surrogate management framework using rigorous trust-region steps

Surrogate models and heuristics are frequently used in the engineering optimization community as convenient approaches to deal with functions for which evaluations are expensive or noisy, or lack convexity. These methodologies do not typically guarantee any type of convergence under reasonable assumptions and frequently exhibit slow convergence. In this paper we will show how to … Read more
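
The flavour of such a framework can be sketched in a few lines: fit a cheap surrogate to sampled function values, minimize it inside a trust region, and accept or reject the step with the usual ratio test. The linear surrogate and sampling rule below are placeholders, not the framework developed in the paper.

```python
# Schematic surrogate-management loop with a trust-region acceptance test;
# illustrative placeholders, not the paper's framework.
import numpy as np

def f(x):                              # stands in for an expensive or noisy black box
    return (x[0] - 2.0) ** 2 + 4.0 * (x[1] + 1.0) ** 2

rng = np.random.default_rng(4)
x, delta = np.zeros(2), 1.0
for _ in range(40):
    S = delta * rng.standard_normal((6, 2))            # sample points around x
    F = np.array([f(x + s) for s in S])
    g, *_ = np.linalg.lstsq(S, F - f(x), rcond=None)   # linear surrogate gradient
    s = -delta * g / (np.linalg.norm(g) + 1e-12)       # surrogate step on the ball
    pred = -g @ s                                      # predicted decrease
    ared = f(x) - f(x + s)                             # actual decrease
    if ared / max(pred, 1e-12) > 0.1:                  # accept and enlarge the region
        x, delta = x + s, min(2.0 * delta, 10.0)
    else:                                              # reject and shrink the region
        delta *= 0.5
print("x =", x, "f(x) =", f(x))
```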

Preconditioning and Globalizing Conjugate Gradients in Dual Space for Quadratically Penalized Nonlinear Least-Squares Problems

When solving nonlinear least-squares problems, it is often useful to regularize the problem using a quadratic term, a practice which is especially common in applications arising in inverse calculations. A solution method derived from a trust-region Gauss-Newton algorithm is analyzed for such applications, where, contrary to the standard algorithm, the least-squares subproblem solved at each … Read more
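
The dual-space idea referred to in the title is easy to verify on a linear model: with an identity observation-error covariance and penalty covariance $B$, the step solving $(B^{-1} + J^TJ)\,\delta x = J^Td$ in the $n$-dimensional primal space equals $BJ^T\lambda$ with $(I + JBJ^T)\lambda = d$ solved in the $m$-dimensional dual space. The sketch below checks this identity with dense solves; the paper's preconditioned and globalized CG in dual space is not reproduced here.

```python
# Primal vs. dual (observation-space) solve of a quadratically penalized
# linear least-squares subproblem; sketch under the assumptions R = I,
# linear model, dense algebra.
import numpy as np

rng = np.random.default_rng(5)
n, m = 200, 30                        # state dimension much larger than observations
J = rng.standard_normal((m, n))       # Jacobian of the residual (Gauss-Newton model)
d = rng.standard_normal(m)            # innovation vector
Bh = rng.standard_normal((n, n))
B = Bh @ Bh.T + np.eye(n)             # penalty (background) covariance

# Primal: (B^{-1} + J^T J) dx = J^T d, an n x n system
dx_primal = np.linalg.solve(np.linalg.inv(B) + J.T @ J, J.T @ d)

# Dual: (I + J B J^T) lam = d, an m x m system, then dx = B J^T lam
lam = np.linalg.solve(np.eye(m) + J @ B @ J.T, d)
dx_dual = B @ J.T @ lam

print(np.linalg.norm(dx_primal - dx_dual))    # agree up to rounding error
```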

Using approximate secant equations in limited memory methods for multilevel unconstrained optimization

The properties of multilevel optimization problems defined on a hierarchy of discretization grids can be used to define approximate secant equations, which describe the second-order behaviour of the objective function. Following earlier work by Gratton and Toint (2009), we introduce a quasi-Newton method (with a linesearch) and a nonlinear conjugate gradient method that both take … Read more
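
The limited-memory machinery itself does not care where the secant pairs come from: the standard two-loop recursion below accepts any list of pairs $(s_i, y_i)$, whether measured on the fine grid or approximated from coarser levels. The synthetic pairs are placeholders; the specific construction from the grid hierarchy is not reproduced here.

```python
# Standard L-BFGS two-loop recursion applied to a list of secant pairs;
# the pairs below are synthetic stand-ins for exact or approximate ones.
import numpy as np

def two_loop(grad, pairs):
    """Return H @ grad, with H the limited-memory inverse Hessian built from pairs."""
    q, alphas = grad.copy(), []
    for s, y in reversed(pairs):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((rho, a))
    s, y = pairs[-1]
    q *= (s @ y) / (y @ y)                        # initial scaling gamma_k * I
    for (s, y), (rho, a) in zip(pairs, reversed(alphas)):
        q += (a - rho * (y @ q)) * s
    return q

rng = np.random.default_rng(6)
n = 30
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                       # model Hessian

pairs = []                                        # pairs y = A s (could come from any level)
for _ in range(8):
    s = rng.standard_normal(n)
    pairs.append((s, A @ s))

# The most recent secant equation is satisfied exactly by the recursion:
s_last, y_last = pairs[-1]
print(np.linalg.norm(two_loop(y_last, pairs) - s_last))   # ~1e-15
```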