Using Simplex Gradients of Nonsmooth Functions in Direct Search Methods

It has been shown recently that the efficiency of direct search methods that use opportunistic polling in positive spanning directions can be improved significantly by reordering the poll directions according to descent indicators built from simplex gradients. The purpose of this paper is twofold. First, we analyze the properties of simplex gradients of nonsmooth functions … Read more
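As a small, self-contained sketch (not taken from the paper), the snippet below shows how a simplex gradient is formed from a sample set by solving a linear system in the displacement vectors; the test function, step sizes, and function names are illustrative only.

```python
import numpy as np

def simplex_gradient(f, y0, steps):
    """Simplex gradient of f at y0 from the sample set {y0, y0 + s_1, ..., y0 + s_n}.

    `steps` is an (n, n) array whose rows s_i are the displacements; the simplex
    gradient g solves S g = delta_f, i.e. s_i . g = f(y0 + s_i) - f(y0)."""
    y0 = np.asarray(y0, dtype=float)
    S = np.asarray(steps, dtype=float)              # rows are the displacements s_i
    f0 = f(y0)
    delta_f = np.array([f(y0 + s) - f0 for s in S])
    return np.linalg.solve(S, delta_f)

# Example on a simple nonsmooth function, f(x) = |x_1| + (x_2 - 1)^2
f = lambda x: abs(x[0]) + (x[1] - 1.0) ** 2
g = simplex_gradient(f, [0.5, 0.0], 0.1 * np.eye(2))
print(g)   # roughly (1, -2): a usable descent indicator at (0.5, 0)
```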

Global Convergence of General Derivative-Free Trust-Region Algorithms to First and Second Order Critical Points

In this paper we prove global convergence to first- and second-order stationary points for a class of derivative-free trust-region methods for unconstrained optimization. These methods are based on the sequential minimization of linear or quadratic models built from evaluating the objective function at sample sets. The derivative-free models are required to satisfy Taylor-type bounds but, … Read more
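As a rough illustration of the ingredients named here (models built from samples, a trust-region acceptance test, radius updates), the following bare-bones sketch uses only linear models and omits the Taylor-type accuracy management that the analysis relies on; all names and parameter choices are illustrative, not the paper's algorithm.

```python
import numpy as np

def dfo_trust_region(f, x0, delta=1.0, max_iter=50, eta=0.1):
    """Bare-bones derivative-free trust-region sketch with linear models only.

    Each iteration builds a simplex gradient from samples scaled by the radius,
    minimizes the linear model on the ball ||s|| <= delta, and accepts the step
    when the achieved-versus-predicted reduction ratio exceeds eta."""
    x, n = np.asarray(x0, dtype=float), len(x0)
    for _ in range(max_iter):
        fx = f(x)
        S = delta * np.eye(n)                                  # sample displacements
        g = np.linalg.solve(S, [f(x + s) - fx for s in S])     # model (simplex) gradient
        if np.linalg.norm(g) < 1e-8:
            break
        s = -delta * g / np.linalg.norm(g)                     # model minimizer on the ball
        pred = -g @ s                                          # predicted decrease (> 0)
        rho = (fx - f(x + s)) / pred                           # achieved / predicted
        if rho >= eta:
            x, delta = x + s, 2.0 * delta                      # accept step, expand radius
        else:
            delta *= 0.5                                       # reject step, shrink radius
    return x

print(dfo_trust_region(lambda z: (z[0] - 3) ** 2 + 2 * (z[1] + 1) ** 2, [0.0, 0.0]))
# moves toward the minimizer (3, -1)
```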

Asynchronous parallel generating set search for linearly-constrained optimization

Generating set search (GSS) is a family of direct search methods that encompasses generalized pattern search and related methods. We describe an algorithm for asynchronous linearly-constrained GSS, which has some complexities that make it different from both the asynchronous bound-constrained case and the synchronous linearly-constrained case. The algorithm has been implemented in the … Read more

Discrete gradient method: a derivative free method for nonsmooth optimization

In this paper a new derivative-free method is developed for solving unconstrained nonsmooth optimization problems. This method is based on the notion of a discrete gradient. It is demonstrated that the discrete gradients can be used to approximate subgradients of a broad class of nonsmooth functions. It is also shown that the discrete gradients can … Read more
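The discrete-gradient construction itself is more involved than plain differencing; the sketch below only illustrates, with forward differences and illustrative names, the underlying idea that differences of function values taken near a kink recover subgradient information.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference gradient estimate; near a kink of f it approximates an
    element of the (Clarke) subdifferential at a nearby smooth point.  This is
    only a stand-in for the paper's discrete-gradient construction."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# f(x) = |x_1| + 0.5 * x_2^2 is nonsmooth along x_1 = 0
f = lambda x: abs(x[0]) + 0.5 * x[1] ** 2
for x1 in (-1e-3, 1e-3):                      # sample on both sides of the kink
    print(fd_gradient(f, [x1, 1.0]))          # first component ~ -1 or ~ +1; both lie in
                                              # the subdifferential [-1, 1] x {1} at the kink
```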

A Particle Swarm Pattern Search Method for Bound Constrained Nonlinear Optimization

In this paper we develop, analyze, and test a new algorithm for the global minimization of a function subject to simple bounds without the use of derivatives. The underlying algorithm is a pattern search method, more specifically a coordinate search method, which guarantees convergence to stationary points from arbitrary starting points. In the optional search … Read more
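For context, here is a minimal coordinate search poll step on a box, the kind of pattern search that underlies such a method; the particle swarm search step described in the abstract is deliberately omitted, and all names and parameters are illustrative.

```python
import numpy as np

def coordinate_search(f, x0, lb, ub, alpha=1.0, tol=1e-6, max_iter=1000):
    """Coordinate (pattern) search on a box: poll the 2n coordinate directions,
    move to the first feasible point that improves f (opportunistic polling),
    and halve the step size after an unsuccessful poll."""
    x = np.asarray(x0, dtype=float)
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if alpha < tol:
            break
        improved = False
        for d in np.vstack((np.eye(len(x)), -np.eye(len(x)))):   # +/- coordinate directions
            y = np.clip(x + alpha * d, lb, ub)                   # keep trial points in the box
            fy = f(y)
            if fy < fx:
                x, fx, improved = y, fy, True
                break                                            # opportunistic: first success wins
        if not improved:
            alpha *= 0.5                                         # unsuccessful poll: shrink step
    return x, fx

x, fx = coordinate_search(lambda z: (z[0] - 1) ** 2 + (z[1] + 2) ** 2,
                          x0=[0.0, 0.0], lb=[-5, -5], ub=[5, 5])
print(x, fx)   # close to (1, -2)
```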

Geometry of Sample Sets in Derivative Free Optimization. Part II: Polynomial Regression and Underdetermined Interpolation

In recent years, there has been a considerable amount of work on the development of numerical methods for derivative-free optimization problems. Some of this work relies on the management of the geometry of sets of sampling points for function evaluation and model building. In this paper, we continue the work developed in [Conn, … Read more
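A rough sketch of model building from a sample set: a quadratic is fitted by linear least squares, which gives polynomial regression when there are more points than coefficients and the minimum-norm coefficient vector when there are fewer. The latter is only a loose stand-in for the underdetermined (e.g. minimum-Frobenius-norm) models studied in this line of work, and all names here are illustrative.

```python
import numpy as np

def quadratic_model(points, values):
    """Fit a quadratic polynomial in the monomial basis 1, x_i, x_i*x_j (i <= j)
    to sampled data by linear least squares; np.linalg.lstsq handles both the
    overdetermined (regression) and underdetermined (minimum-norm) cases."""
    Y = np.asarray(points, dtype=float)
    n = Y.shape[1]
    cols = [np.ones(len(Y))]
    cols += [Y[:, i] for i in range(n)]
    cols += [Y[:, i] * Y[:, j] for i in range(n) for j in range(i, n)]
    M = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(M, np.asarray(values, dtype=float), rcond=None)
    return coef   # coefficients in the basis order listed above

# noisy samples of f(x) = 1 + 2 x_1 - x_2 + x_1^2
rng = np.random.default_rng(0)
Y = rng.uniform(-1, 1, size=(12, 2))
v = 1 + 2 * Y[:, 0] - Y[:, 1] + Y[:, 0] ** 2 + 0.01 * rng.standard_normal(12)
print(quadratic_model(Y, v))   # approximately [1, 2, -1, 1, 0, 0]
```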

A generating set search method exploiting curvature and sparsity

Generating Set Search methods are one of the few alternatives for optimizing high-fidelity functions with numerical noise. These methods are usually only efficient when the number of variables is relatively small. This paper presents a modification to an existing Generating Set Search method, which makes it aware of the sparsity structure of the Hessian. … Read more

Using Sampling and Simplex Derivatives in Pattern Search Methods

Pattern search methods can be made more efficient if past function evaluations are appropriately reused. In this paper we introduce a number of ways of reusing previous evaluations of the objective function, based on the computation of simplex derivatives (e.g., simplex gradients), to improve the efficiency of a pattern search iteration. At each iteration … Read more
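A sketch of the reordering idea, assuming a simplex gradient is already available (for instance computed from past evaluations as in the sketch after the first abstract above): poll directions are sorted by the angle they make with the negative simplex gradient, so the most promising directions are tried first. The function names and the poll set are illustrative.

```python
import numpy as np

def order_poll_directions(directions, simplex_gradient):
    """Order poll directions by the angle they make with the negative simplex
    gradient, so that the most promising descent directions are polled first."""
    D = np.asarray(directions, dtype=float)
    g = np.asarray(simplex_gradient, dtype=float)
    # larger -d.g / (|d| |g|), i.e. smaller angle with -g, means more promising
    scores = -(D @ g) / (np.linalg.norm(D, axis=1) * np.linalg.norm(g))
    return D[np.argsort(-scores)]

D = np.vstack((np.eye(2), -np.eye(2)))            # coordinate poll set
print(order_poll_directions(D, simplex_gradient=[1.0, -2.0]))
# polls (0, 1) first, then (-1, 0), then (1, 0), then (0, -1)
```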

Local Optimization Method with Global Multidimensional Search

This paper presents a new method for solving global optimization problems. We use a local technique based on the notion of discrete gradients for finding a cone of descent directions, and then we use a global cutting angle algorithm for finding a global minimum within the intersection of the cone and the feasible region. We present … Read more

Error Estimates and Poisedness in Multivariate Polynomial Interpolation

We show how to derive error estimates between a function and its interpolating polynomial and between their corresponding derivatives. The derivation is based on a new definition of well-poisedness for the interpolation set, directly connecting the accuracy of the error estimates with the geometry of the points in the set. This definition is equivalent to … Read more
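The notion of well-poisedness is defined via Lagrange polynomials; as a crude numerical stand-in (not the paper's definition), the condition number of the scaled interpolation matrix flags nearly degenerate sample sets, as sketched below for linear interpolation with illustrative names.

```python
import numpy as np

def linear_poisedness_proxy(points, radius=1.0):
    """Crude proxy for the poisedness of a sample set for linear interpolation:
    the condition number of the scaled matrix [1, (y_i - y_0)/radius].  Nearly
    affinely dependent points give a huge value; well-spread points a moderate
    one.  Lambda-poisedness via Lagrange polynomials is related but not identical."""
    Y = np.asarray(points, dtype=float)
    S = (Y - Y[0]) / radius
    M = np.column_stack((np.ones(len(Y)), S))
    return np.linalg.cond(M)

good = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]          # well spread in the unit ball
bad  = [[0.0, 0.0], [1.0, 0.0], [0.999, 1e-4]]       # nearly collinear
print(linear_poisedness_proxy(good), linear_poisedness_proxy(bad))
# the nearly collinear set yields a condition number several orders of magnitude larger
```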