Sensitivity of trust-region algorithms on their parameters

In this paper, we examine the sensitivity of trust-region algorithms to the parameters governing step acceptance and the update of the trust region. We show, in the context of unconstrained programming, that the numerical efficiency of these algorithms can easily be improved by choosing appropriate parameters. Recommended ranges of values for these parameters are …
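The parameters in question can be seen in a minimal one-dimensional sketch of the standard trust-region loop (illustrative only; the names eta1, eta2, gamma1, gamma2 are the conventional ones and the values below are common textbook defaults, not the paper's recommendations):

```python
# Minimal 1-D trust-region sketch. The quadratic model
# m(s) = f + g*s + h*s^2/2 is minimized on |s| <= delta, the step is
# accepted if the ratio rho of actual to predicted reduction exceeds
# eta1, and the radius is expanded/shrunk via gamma2/gamma1.

def trust_region_minimize(f, g, h, x, delta=1.0, iters=50,
                          eta1=0.01, eta2=0.9, gamma1=0.5, gamma2=2.0):
    for _ in range(iters):
        gx, hx = g(x), h(x)
        if hx > 0 and abs(gx / hx) <= delta:
            s = -gx / hx                      # interior Newton step
        else:
            s = -delta if gx > 0 else delta   # step to the boundary
        pred = -(gx * s + 0.5 * hx * s * s)   # predicted reduction
        ared = f(x) - f(x + s)                # actual reduction
        rho = ared / pred if pred > 0 else -1.0
        if rho >= eta1:                       # successful: accept the step
            x = x + s
        if rho >= eta2:                       # very successful: expand radius
            delta *= gamma2
        elif rho < eta1:                      # unsuccessful: shrink radius
            delta *= gamma1
    return x

# f(x) = x^4 has its minimizer at 0; the iterates contract toward it.
x_star = trust_region_minimize(lambda x: x**4,
                               lambda x: 4 * x**3,
                               lambda x: 12 * x**2, x=1.0)
```

Changing eta1, eta2, gamma1, or gamma2 changes how eagerly steps are accepted and how fast the radius adapts, which is exactly the sensitivity the paper studies.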

Recursive Trust-Region Methods for Multilevel Nonlinear Optimization (Part I): Global Convergence and Complexity

A class of trust-region methods is presented for solving unconstrained nonlinear and possibly nonconvex discretized optimization problems, like those arising in systems governed by partial differential equations. The algorithms in this class make use of the discretization level as a means of speeding up the computation of the step. This use is recursive, leading to …

Additional properties of shifted variable metric methods

Some supplements to shifted variable metric or quasi-Newton methods for unconstrained minimization are given, including new limited-memory methods. Global convergence of these methods can be established for convex sufficiently smooth functions. Some encouraging numerical experience is reported. Citation: Report No. V899-03, Institute of Computer Science, Czech Academy of Sciences, Prague, December 2003 (revised May 2004). …

On the Relationship Between Convergence Rates of Discrete and Continuous Dynamical Systems

Considering iterative sequences that arise when the approximate solution $x_k$ to a numerical problem is updated by $x_{k+1} = x_k + v(x_k)$, where $v$ is a vector field, we derive necessary and sufficient conditions for such discrete methods to converge to a stationary point of $v$ at different Q-rates in terms of the differential properties of $v$ …
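A toy instance of such an update (not the paper's setting, purely illustrative): with $v(x) = -x/2$ the stationary point of $v$ is $x^* = 0$, and the iterates $x_{k+1} = x_k + v(x_k) = x_k/2$ converge Q-linearly with rate $1/2$:

```python
# Discrete update x_{k+1} = x_k + v(x_k) for the vector field
# v(x) = -x/2, whose stationary point is x* = 0. The Q-factor
# |x_{k+1} - x*| / |x_k - x*| is exactly 1/2 at every iteration,
# i.e. Q-linear convergence with rate 1/2.

def v(x):
    return -0.5 * x

x = 1.0
ratios = []
for _ in range(20):
    x_next = x + v(x)
    ratios.append(abs(x_next) / abs(x))  # Q-factor at this iteration
    x = x_next
```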

Semidefinite Approximations for Global Unconstrained Polynomial Optimization

We consider here the problem of minimizing a polynomial function on $\mathbb{R}^n$. The problem is known to be hard even for degree $4$. Therefore approximation algorithms are of interest. Lasserre \cite{lasserre:2001} and Parrilo \cite{Pa02a} have proposed approximating the minimum of the original problem using a hierarchy of lower bounds obtained via semidefinite programming relaxations. We …
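The building block of such hierarchies is the sum-of-squares lower bound (a standard formulation, not specific to this paper):

```latex
f^{*} \;=\; \min_{x \in \mathbb{R}^n} f(x)
\;\;\geq\;\;
f_{\mathrm{sos}} \;:=\; \max \left\{ \lambda \in \mathbb{R} \;:\; f - \lambda \ \text{is a sum of squares} \right\}.
```

The bound $f_{\mathrm{sos}}$ is computable by semidefinite programming, because $f - \lambda$ is a sum of squares if and only if $f(x) - \lambda = z(x)^{\mathsf{T}} Q\, z(x)$ for some positive semidefinite matrix $Q$, where $z(x)$ collects the monomials of degree at most $\deg(f)/2$.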

A filter-trust-region method for unconstrained optimization

A new filter-trust-region algorithm for solving unconstrained nonlinear optimization problems is introduced. Based on the filter technique introduced by Fletcher and Leyffer, it extends an existing technique of Gould, Leyffer and Toint (SIAM J. Optim., to appear 2004) for nonlinear equations and nonlinear least-squares to the fully general unconstrained optimization problem. The new algorithm is …
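The filter idea can be sketched generically (a schematic illustration of dominance-based acceptance, not the exact mechanism or the exact filtered quantities of this algorithm): the filter stores mutually incomparable entries, and a trial point is acceptable only if no stored entry dominates it.

```python
# Schematic two-criterion filter: an entry (a, b) dominates (a', b')
# if a <= a' and b <= b'. A trial entry is acceptable iff no filter
# entry dominates it; on acceptance, entries it dominates are removed.

def acceptable(entry, filt):
    return not any(a <= entry[0] and b <= entry[1] for (a, b) in filt)

def add_to_filter(entry, filt):
    filt = [(a, b) for (a, b) in filt
            if not (entry[0] <= a and entry[1] <= b)]
    filt.append(entry)
    return filt

filt = []
for e in [(3.0, 1.0), (1.0, 3.0), (2.0, 2.0), (2.5, 2.5)]:
    if acceptable(e, filt):
        filt = add_to_filter(e, filt)
# (2.5, 2.5) is dominated by (2.0, 2.0), so it is rejected.
```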

Optimal Direct Determination of Sparse Jacobian Matrices

It is well known that a sparse Jacobian matrix can be determined with fewer function evaluations or automatic differentiation \emph{passes} than the number of independent variables of the underlying function. In this paper we show that by grouping together rows into blocks one can reduce this number further. We propose a graph coloring technique for …
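The baseline the abstract improves on is classic column compression in the Curtis–Powell–Reid style (a sketch of that standard technique, not of the paper's row-blocking method): structurally orthogonal columns — columns with no common nonzero row — share one seed vector, so the Jacobian is recovered from fewer products than it has columns.

```python
import numpy as np

def greedy_groups(pattern):
    """Greedily group columns whose sparsity patterns do not overlap."""
    groups = []
    for j in range(pattern.shape[1]):
        col = set(np.nonzero(pattern[:, j])[0])
        for g in groups:
            if not (col & g["rows"]):       # structurally orthogonal: merge
                g["cols"].append(j)
                g["rows"] |= col
                break
        else:
            groups.append({"cols": [j], "rows": set(col)})
    return [g["cols"] for g in groups]

# Example Jacobian with a sparse pattern (values are arbitrary).
J = np.array([[1., 0., 2.],
              [0., 3., 0.],
              [4., 0., 5.]])
pattern = J != 0
groups = greedy_groups(pattern)          # columns 0 and 1 share a group

# One "evaluation" (here a matrix-vector product standing in for a
# finite-difference or AD pass) per group suffices to recover J,
# since each row sees at most one column of each group.
recovered = np.zeros_like(J)
for cols in groups:
    d = np.zeros(J.shape[1])
    d[cols] = 1.0
    compressed = J @ d
    for j in cols:
        rows = np.nonzero(pattern[:, j])[0]
        recovered[rows, j] = compressed[rows]
```

Here 2 products replace 3 columns; the paper's contribution is to push below such column-grouping counts by additionally blocking rows.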

Intermediate Report on the development of an optimization code for smooth, high computing load, continuous objective functions when derivatives are not available

In industry one very often finds simulators of huge chemical reactors, simulators of huge turbo-compressors, simulators of the path of a satellite in low orbit around the Earth, … These simulators were written to allow the design engineer to correctly estimate the consequences of adjusting one (or many) design variables (or parameters of …

The continuous Newton-Raphson method can look ahead

This paper is about an intriguing property of the continuous Newton-Raphson method for the minimization of a continuous objective function f: if x is a point in the domain of attraction of a strict local minimizer x* then the flux line of the Newton-Raphson flow that starts in x approaches x* from a direction that …
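The flow in question is $\dot{x}(t) = -\nabla^2 f(x)^{-1} \nabla f(x)$; a toy numerical integration (illustrative only, on a quadratic chosen here, not an example from the paper) shows its characteristic behavior — for a quadratic the Newton field is simply $-x$, so flux lines are straight rays into the minimizer:

```python
import numpy as np

# Euler integration of the continuous Newton flow
#   x'(t) = -H(x)^{-1} g(x)
# for f(x, y) = x^2 + 10*y^2, whose minimizer is the origin.
# The Newton field here equals -p, so the iterate stays on its
# initial ray while contracting toward the minimizer.

def grad(p):
    return np.array([2.0 * p[0], 20.0 * p[1]])

def hess(p):
    return np.diag([2.0, 20.0])

p = np.array([2.0, 1.0])
for _ in range(200):
    p = p + 0.05 * (-np.linalg.solve(hess(p), grad(p)))  # Euler step
# p approaches the origin along the fixed direction (2, 1).
```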

A Wide Interval for Efficient Self-Scaling Quasi-Newton Algorithms

This paper uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class, but increase their chances …