A modified limited-memory BNS method for unconstrained minimization based on the conjugate directions idea

A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed, which consists of corrections (derived from the idea of conjugate directions) to the difference vectors used, so that previous quasi-Newton conditions are satisfied more accurately. In comparison with [16], where a similar approach is used, correction vectors from more previous iterations …
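
A minimal sketch of the flavour of such a correction, assuming the usual difference vectors s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (the paper's exact formulas are not reproduced here): since y_i approximates B s_i for the implicit Hessian approximation B, the product y_new' s_i approximates the conjugacy term s_new' B s_i, and a Gram-Schmidt-like step can remove that component from older pairs.

```python
import numpy as np

def conjugacy_correction(S, Y):
    """Illustrative correction of stored difference vectors
    (not the paper's exact formulas).

    Columns of S and Y hold s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i.
    Since y_i ~ B s_i, removing the component of older pairs along the
    newest pair mimics a conjugacy requirement s_new' B s_i = 0.
    """
    s_new, y_new = S[:, -1], Y[:, -1]
    denom = y_new @ s_new
    Sc, Yc = S.copy(), Y.copy()
    for i in range(S.shape[1] - 1):
        coef = (y_new @ S[:, i]) / denom
        Sc[:, i] -= coef * s_new
        Yc[:, i] -= coef * y_new
    return Sc, Yc
```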

A globally convergent trust-region algorithm for unconstrained derivative-free optimization

In this work we present a derivative-free trust-region algorithm for unconstrained optimization based on the algorithm proposed by Powell (Computational Optimization and Applications 53: 527-555, 2012). The objective function is approximated by quadratic models obtained by polynomial interpolation. The number of points in the interpolation set is fixed. In each iteration only one interpolation point …
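
As a rough illustration of the model-building step, the sketch below fits a 2-D quadratic model through a fixed interpolation set and takes a clipped Newton step of the model. The function names are ours; Powell-style methods manage the poisedness of the interpolation set and solve the trust-region subproblem far more carefully.

```python
import numpy as np

def quad_model_2d(points, fvals):
    """Fit an interpolation model m(x) = c + g.x + 0.5 x'Hx in 2-D
    by solving for the 6 coefficients of a quadratic (exact
    interpolation when 6 well-poised points are given)."""
    A = np.array([[1.0, p[0], p[1], 0.5*p[0]**2, p[0]*p[1], 0.5*p[1]**2]
                  for p in points])
    c0, gx, gy, hxx, hxy, hyy = np.linalg.lstsq(A, fvals, rcond=None)[0]
    return c0, np.array([gx, gy]), np.array([[hxx, hxy], [hxy, hyy]])

def tr_step(g, H, delta):
    """Crude trust-region step: Newton step of the model, clipped to the
    ball of radius delta (real solvers treat indefinite H properly)."""
    d = np.linalg.solve(H + 1e-8 * np.eye(2), -g)
    nd = np.linalg.norm(d)
    return d if nd <= delta else d * (delta / nd)
```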

A Stochastic Quasi-Newton Method for Large-Scale Optimization

The question of how to incorporate curvature information in stochastic approximation methods is challenging. The direct application of classical quasi-Newton updating techniques for deterministic optimization leads to noisy curvature estimates that have harmful effects on the robustness of the iteration. In this paper, we propose a stochastic quasi-Newton method that is efficient, robust …
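
A hedged sketch of the decoupling idea usually associated with this line of work: instead of differencing noisy stochastic gradients, the displacement between averaged iterates is pushed through a subsampled Hessian-vector product to form the (s, y) curvature pair. The callable `subsampled_hess_vec` is an assumption of this sketch.

```python
import numpy as np

def sqn_curvature_pair(xbar_new, xbar_old, subsampled_hess_vec):
    """Low-noise curvature pair for an L-BFGS-style update.

    `subsampled_hess_vec` (assumed here) computes v -> H_S v, a
    Hessian-vector product on a small sample S.  Applying it to the
    displacement between averaged iterates avoids differencing noisy
    stochastic gradients.
    """
    s = xbar_new - xbar_old
    y = subsampled_hess_vec(s)
    if s @ y <= 1e-12 * (s @ s):   # skip non-positive curvature
        return None
    return s, y
```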

On Efficiently Combining Limited Memory and Trust-Region Techniques

Limited memory quasi-Newton methods and trust-region methods represent two efficient approaches used for solving unconstrained optimization problems. A straightforward combination of them deteriorates the efficiency of the former approach, especially in the case of large-scale problems. For this reason, the limited memory methods are usually combined with a line search. We show how to efficiently …
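
One reason the combination can be made efficient is the compact representation B = gamma*I + Psi M Psi' of a limited-memory matrix: all nontrivial eigenvalues of B live in the low-dimensional range of Psi, so the trust-region subproblem reduces to small dense linear algebra. A sketch of that reduction, in our notation (the paper's construction may differ):

```python
import numpy as np

def lbfgs_trust_region_spectrum(Psi, M, gamma):
    """Spectral decomposition of B = gamma*I + Psi @ M @ Psi.T.

    Psi is n x 2m (m the memory), M is 2m x 2m symmetric.  A thin QR of
    Psi reduces the eigenproblem to size 2m, so the work is O(n m^2):
    eigenvalues of B are gamma (on the complement of range(Psi)) plus
    gamma + eig(R M R') on the small subspace.
    """
    Q, R = np.linalg.qr(Psi)               # thin QR, O(n m^2)
    lam, U = np.linalg.eigh(R @ M @ R.T)   # small (2m x 2m) eigenproblem
    return gamma + lam, Q @ U              # nontrivial eigenpairs of B
```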

An efficient gradient method using the Yuan steplength

We propose a new gradient method for quadratic programming, named SDC, which alternates some steepest descent (SD) iterates with some gradient iterates that use a constant steplength computed through the Yuan formula. The SDC method exploits the asymptotic spectral behaviour of the Yuan steplength to foster a selective elimination of the components of the gradient along the …
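
A sketch of an SDC-type iteration for a strictly convex quadratic, using the form of the Yuan steplength commonly quoted in this literature; the paper should be consulted for the exact variant and its safeguards.

```python
import numpy as np

def sdc(A, b, x, h=4, s=3, maxit=500, tol=1e-8):
    """SDC-type scheme for f(x) = 0.5 x'Ax - b'x with A symmetric
    positive definite: h exact steepest-descent (SD) steps, then s
    gradient steps reusing a constant Yuan-type steplength."""
    assert h >= 2
    g = A @ x - b
    k = 0
    while k < maxit and np.linalg.norm(g) > tol:
        alphas, gnorms = [], []
        for _ in range(h):                          # SD sweep
            gnorms.append(np.linalg.norm(g))
            alphas.append((g @ g) / (g @ (A @ g)))  # exact SD steplength
            x = x - alphas[-1] * g
            g = A @ x - b
            k += 1
        a1, a2 = alphas[-2], alphas[-1]             # last two SD steps
        t = 1.0 / a1 - 1.0 / a2
        a_y = 2.0 / (np.sqrt(t**2 + 4.0 * gnorms[-1]**2
                             / (a1 * gnorms[-2])**2) + 1.0/a1 + 1.0/a2)
        for _ in range(s):                          # constant Yuan steps
            x = x - a_y * g
            g = A @ x - b
            k += 1
    return x
```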

Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given. Citation: Report naXys-09-2013, Namur Centre for Complex Systems, University of Namur, Namur (Belgium).
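
Since the abstract only states that the derivation leads to a Sylvester equation A E + E B = Q in the correction matrix E, the snippet below merely shows such an equation being solved with SciPy; the matrices are placeholders, not the paper's definitions.

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Placeholder matrices; the paper defines them in terms of the current
# Hessian approximation, the secant pairs and the weights.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n)); B = B @ B.T + n * np.eye(n)
Q = rng.standard_normal((n, n))
E = solve_sylvester(A, B, Q)               # solves A E + E B = Q
print(np.linalg.norm(A @ E + E @ B - Q))   # ~ machine precision
```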

The divergence of the BFGS and Gauss-Newton methods

We present examples of divergence for the BFGS and Gauss-Newton methods. These examples have objective functions with bounded level sets and share other properties with examples published recently in this journal, such as unit steps and convexity along the search lines. As in those other examples, the iterates, function values and gradients in the new examples …

The Euclidean distance degree of an algebraic variety

The nearest point map of a real algebraic variety with respect to Euclidean distance is an algebraic function. For instance, for varieties of low rank matrices, the Eckart-Young Theorem states that this map is given by the singular value decomposition. This article develops a theory of such nearest point maps from the perspective of computational …
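
The cited Eckart-Young statement is easy to check numerically: truncating the singular value decomposition gives the nearest matrix of rank at most r in the Frobenius norm.

```python
import numpy as np

M = np.array([[3.0, 1.0], [1.0, 3.0], [0.5, 0.5]])
U, s, Vt = np.linalg.svd(M, full_matrices=False)
r = 1
M_r = (U[:, :r] * s[:r]) @ Vt[:r, :]   # nearest rank-1 matrix
# Frobenius distance equals the dropped singular value:
print(np.linalg.norm(M - M_r), s[1])
```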

Algebraic rules for quadratic regularization of Newton’s method

In this work we propose a class of quasi-Newton methods to minimize a twice differentiable function with Lipschitz continuous Hessian. These methods are based on the quadratic regularization of Newton’s method, with algebraic explicit rules for computing the regularizing parameter. The convergence properties of this class of methods are analysed. We show that if the …
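
A minimal sketch of a quadratically regularized Newton step, solving (H + mu*I) d = -g; the increase/decrease rule for mu below is a generic placeholder, not the paper's explicit algebraic rule.

```python
import numpy as np

def reg_newton_step(H, g, mu):
    """Quadratically regularized Newton step: solve (H + mu*I) d = -g."""
    return np.linalg.solve(H + mu * np.eye(g.size), -g)

def update_mu(mu, actual, predicted):
    """Generic placeholder rule (the paper gives explicit algebraic
    rules instead): shrink mu on good agreement, grow it otherwise."""
    rho = actual / predicted
    if rho > 0.75:
        return mu / 2.0
    return mu * 4.0 if rho < 0.25 else mu
```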

Efficient tridiagonal preconditioner for the matrix-free truncated Newton method

In this report, we study an efficient tridiagonal preconditioner, based on numerical differentiation, applied to the matrix-free truncated Newton method for unconstrained optimization. It is proved that this preconditioner is positive definite for many practical problems. The efficiency of the resulting matrix-free truncated Newton method is demonstrated by results of extensive numerical experiments. Citation: Technical Report …
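
A sketch of how a tridiagonal Hessian approximation can be recovered matrix-free, assuming finite-difference Hessian-vector products and a Curtis-Powell-Reid colouring (columns i = 0, 1, 2 mod 3 of a tridiagonal matrix touch disjoint rows); the report's construction and positive-definiteness safeguards may differ.

```python
import numpy as np
from scipy.sparse import diags

def fd_hessvec(grad, x, v, eps=1e-6):
    """Matrix-free Hessian-vector product by forward-differencing grad."""
    return (grad(x + eps * v) - grad(x)) / eps

def tridiag_approx(grad, x, n):
    """Tridiagonal Hessian approximation from 3 Hessian-vector products.

    Three 'comb' vectors suffice because columns in the same colour
    class do not overlap; for a general Hessian the result is
    contaminated by off-tridiagonal entries.
    """
    main, off = np.zeros(n), np.zeros(n - 1)
    for c in range(3):
        v = np.zeros(n)
        v[c::3] = 1.0
        hv = fd_hessvec(grad, x, v)
        for j in range(c, n, 3):
            main[j] = hv[j]
            if j + 1 < n:
                off[j] = hv[j + 1]
    return diags([off, main, off], [-1, 0, 1]).tocsc()
```

Factorizing this matrix once per outer iteration and applying its inverse inside the conjugate-gradient inner solve is the kind of preconditioning the report studies; positive definiteness must of course be checked or enforced.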