A derivative-free Gauss-Newton method

We present DFO-GN, a derivative-free version of the Gauss-Newton method for solving nonlinear least-squares problems. As is common in derivative-free optimization, DFO-GN uses interpolation of function values to build a model of the objective, which is then used within a trust-region framework to give a globally convergent algorithm requiring $O(\epsilon^{-2})$ iterations to reach approximate first-order criticality … Read more
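As a concrete illustration of the interpolation-plus-trust-region idea, here is a minimal sketch (not the DFO-GN code itself; the coordinate stencil, acceptance thresholds, and crude truncation of the Gauss-Newton step are simplifications of mine):

```python
# Minimal sketch of a derivative-free Gauss-Newton iteration: estimate the
# residual Jacobian by interpolation, then take a trust-region GN step.
import numpy as np

def interp_jacobian(resid, xk, pts, rk):
    """Estimate the residual Jacobian from function values only:
    solve J @ (y_i - xk) = r(y_i) - r(xk) for all interpolation points."""
    D = np.array([y - xk for y in pts])          # (n, n) displacements
    R = np.array([resid(y) - rk for y in pts])   # (n, m) residual differences
    return np.linalg.lstsq(D, R, rcond=None)[0].T

def dfo_gn_sketch(resid, x0, delta=1.0, tol=1e-8, max_iter=100):
    xk = np.asarray(x0, float)
    n = xk.size
    for _ in range(max_iter):
        rk = resid(xk)
        pts = [xk + delta * e for e in np.eye(n)]   # simple coordinate stencil
        J = interp_jacobian(resid, xk, pts, rk)
        # Gauss-Newton step, crudely truncated to the trust-region boundary
        s = np.linalg.lstsq(J, -rk, rcond=None)[0]
        if np.linalg.norm(s) > delta:
            s *= delta / np.linalg.norm(s)
        pred = 0.5 * (rk @ rk - np.linalg.norm(rk + J @ s) ** 2)
        if pred <= tol:
            break
        rho = 0.5 * (rk @ rk - resid(xk + s) @ resid(xk + s)) / pred
        if rho > 0.1:                # accept; widen the region on strong steps
            xk = xk + s
            delta = min(2 * delta, 1e3) if rho > 0.7 else delta
        else:                        # reject; shrink the region
            delta *= 0.5
    return xk

# Example: Rosenbrock residuals r(x) = (10(x2 - x1^2), 1 - x1)
rosen = lambda x: np.array([10 * (x[1] - x[0] ** 2), 1 - x[0]])
print(dfo_gn_sketch(rosen, np.array([-1.2, 1.0])))
```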

Derivative-Free Robust Optimization by Outer Approximations

We develop an algorithm for the minimax problems that arise in robust optimization when objective function derivatives are unavailable. The algorithm extends inexact outer-approximation methods to sample a potentially infinite uncertainty set. Clarke stationarity of the algorithm's output is established, alongside desirable features of the model-based trust-region subproblems encountered. We … Read more
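A toy rendering of the outer-approximation loop may help fix ideas: minimize the worst case over a growing finite sample of the uncertainty set, then add the current worst-case scenario as a new cut. The inner coordinate search and the discretized uncertainty set below are stand-ins of mine, not the paper's machinery:

```python
# Schematic outer approximation for min_x max_{u in U} f(x, u) without
# derivatives; illustrative only.
import numpy as np

def coord_search(g, x0, step=0.5, tol=1e-6, max_iter=200):
    """Naive derivative-free minimizer: poll +/- coordinate directions."""
    x = np.asarray(x0, float)
    while step > tol and max_iter > 0:
        max_iter -= 1
        improved = False
        for d in np.vstack([np.eye(x.size), -np.eye(x.size)]):
            if g(x + step * d) < g(x):
                x = x + step * d
                improved = True
                break
        if not improved:
            step *= 0.5
    return x

def outer_approximation(f, x0, u_grid, n_rounds=10):
    """Alternate: minimize the max over a growing finite sample of U,
    then add the worst-case u for the current iterate to the sample."""
    samples = [u_grid[0]]
    x = np.asarray(x0, float)
    for _ in range(n_rounds):
        x = coord_search(lambda z: max(f(z, u) for u in samples), x)
        worst = max(u_grid, key=lambda u: f(x, u))   # new cut from U
        if any(np.allclose(worst, u) for u in samples):
            break                                    # sample is stable
        samples.append(worst)
    return x, samples

# Toy robust problem: f(x, u) = (x - u)^2 with u in [-1, 1]
f = lambda x, u: float((x[0] - u) ** 2)
x_star, S = outer_approximation(f, [3.0], u_grid=np.linspace(-1, 1, 21))
print(x_star, S)
```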

Manifold Sampling for Optimization of Nonconvex Functions that are Piecewise Linear Compositions of Smooth Components

We develop a manifold sampling algorithm for the minimization of a nonsmooth composite function $f := \psi + h \circ F$ when $\psi$ is smooth with known derivatives, $h$ is a known, nonsmooth, piecewise linear function, and $F$ is smooth but expensive to evaluate. The trust-region algorithm classifies points in the domain of $h$ as … Read more
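To make the classification step concrete, here is a toy sketch assuming a specific max-of-affine $h$ of my choosing: the "manifold" of a point is its set of (near-)active pieces, and each active piece contributes one generator $\nabla\psi + \nabla F^T a_i$ of the generalized gradient:

```python
import numpy as np

# Toy piecewise linear h(z) = max_i a_i @ z (my example, not the paper's
# general setting); the rows of A are the gradients of the affine pieces.
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])

def h(z):
    return float(np.max(A @ z))

def manifolds(z, tol=1e-8):
    """Indices of (near-)active pieces: the 'manifold' containing z."""
    vals = A @ z
    return np.flatnonzero(vals >= vals.max() - tol)

def fd_jacobian(F, x, eps=1e-6):
    """Forward-difference Jacobian of F (standing in for a model Jacobian)."""
    Fx = np.asarray(F(x))
    return np.column_stack([(np.asarray(F(x + eps * e)) - Fx) / eps
                            for e in np.eye(x.size)])

def master_gradients(psi_grad, F, x):
    """One generator of the generalized gradient per active piece at F(x)."""
    J = fd_jacobian(F, x)
    return [psi_grad(x) + J.T @ A[i] for i in manifolds(np.asarray(F(x)))]

# Example: psi(x) = 0.5||x||^2, F(x) = (x1 - 1, x2 + 1); at x = (1, -1) we
# have F(x) = 0, all three pieces of h are active, so three generators result.
psi_grad = lambda x: x
F = lambda x: np.array([x[0] - 1.0, x[1] + 1.0])
x = np.array([1.0, -1.0])
print(manifolds(F(x)), master_gradients(psi_grad, F, x))
```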

Direct Search Methods on Reductive Homogeneous Spaces

Direct search methods are mainly designed for use in problems with no equality constraints. However, there are many instances where the feasible set is of measure zero in the ambient space and no mesh point lies within it. There are methods for working with feasible sets that are (Riemannian) manifolds, but not all manifolds are … Read more

Direct search based on probabilistic feasible descent for bound and linearly constrained problems

Direct search is a methodology for derivative-free optimization whose iterations are characterized by evaluating the objective function using a set of polling directions. In deterministic direct search applied to smooth objectives, these directions must somehow conform to the geometry of the feasible region and typically consist of positive generators of approximate tangent cones (which then … Read more
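As a bare-bones illustration of feasible polling on a box (the probabilistic ingredient here is just a random subset of the coordinate directions, which positively generate the tangent cones of the box; the paper's analysis covers general linear constraints):

```python
import numpy as np

def direct_search_box(f, x0, l, u, step=1.0, tol=1e-6, k_dirs=2, seed=0):
    """Feasible direct search on the box l <= x <= u, polling a random
    subset of the +/- coordinate directions at each iteration."""
    rng = np.random.default_rng(seed)
    x = np.clip(np.asarray(x0, float), l, u)
    n = x.size
    D = np.vstack([np.eye(n), -np.eye(n)])    # positive spanning set; for a
                                              # box it also conforms to the
                                              # tangent cones at the boundary
    while step > tol:
        idx = rng.choice(len(D), size=min(k_dirs, len(D)), replace=False)
        for d in D[idx]:                      # probabilistic polling
            y = x + step * d
            if np.all(y >= l) and np.all(y <= u) and f(y) < f(x):
                x, success = y, True
                break
        else:
            success = False
        step = min(2.0 * step, 1.0) if success else 0.5 * step
    return x

# The unconstrained minimizer (0.3, -2) lies outside the box, so the
# constrained solution sits on the boundary at (0.3, -1).
f = lambda x: (x[0] - 0.3) ** 2 + (x[1] + 2.0) ** 2
print(direct_search_box(f, [1.0, 1.0],
                        l=np.array([0.0, -1.0]), u=np.array([2.0, 2.0])))
```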

Locally weighted regression models for surrogate-assisted design optimization

Locally weighted regression combines the advantages of polynomial regression and kernel smoothing. We present three ideas for appropriate and effective use of LOcally WEighted Scatterplot Smoothing (LOWESS) models for surrogate optimization. First, a method is proposed to reduce the computational cost of LOWESS models. Second, a local scaling coefficient is introduced to adapt LOWESS models … Read more
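A minimal locally weighted regression predictor, assuming a Gaussian kernel and a degree-1 local model (the paper's cost-reduction and local-scaling ideas are not reproduced here):

```python
import numpy as np

def lowess_predict(X, y, xq, bandwidth=0.5):
    """Predict f(xq) by fitting a local linear model whose sample weights
    decay with distance from the query point xq."""
    X, y, xq = np.asarray(X, float), np.asarray(y, float), np.asarray(xq, float)
    w = np.exp(-np.sum((X - xq) ** 2, axis=1) / (2 * bandwidth ** 2))
    Phi = np.hstack([np.ones((len(X), 1)), X - xq])   # [1, x - xq] basis
    sw = np.sqrt(w)                                    # weighted least squares
    coef, *_ = np.linalg.lstsq(sw[:, None] * Phi, sw * y, rcond=None)
    return coef[0]   # with a centered basis, the intercept is the prediction

# Sample a 1-D function with mild noise and query the surrogate
rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(40, 1))
y = np.sin(2 * X[:, 0]) + 0.05 * rng.standard_normal(40)
print(lowess_predict(X, y, xq=[0.5]), np.sin(1.0))   # prediction vs truth
```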

A progressive barrier derivative-free trust-region algorithm for constrained optimization

We study derivative-free constrained optimization problems and propose a trust-region method that builds linear or quadratic models around the best feasible and the best infeasible solutions found so far. These models are optimized within a trust region, and the progressive barrier methodology handles the constraints by progressively pushing the infeasible solutions toward the … Read more
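The progressive-barrier bookkeeping can be sketched in isolation: the two incumbents and a shrinking violation threshold `h_max` are the essential ingredients. The dominance test below is a deliberate simplification of mine, and the paper couples this bookkeeping with trust-region models of the objective and constraints:

```python
import numpy as np

def violation(c, x):
    """Aggregate violation h(x) of constraints c(x) <= 0."""
    return float(np.sum(np.maximum(c(x), 0.0) ** 2))

class ProgressiveBarrier:
    """Track the best feasible and best infeasible incumbents; reject any
    point whose violation exceeds h_max, and tighten h_max whenever a new
    infeasible incumbent is accepted."""

    def __init__(self, h_max=np.inf):
        self.h_max = h_max
        self.best_feas = None      # (f, x) with h = 0
        self.best_infeas = None    # (f, h, x) with 0 < h <= h_max

    def update(self, f, c, x):
        fx, hx = f(x), violation(c, x)
        if hx > self.h_max:
            return False                        # rejected by the barrier
        if hx == 0.0:
            if self.best_feas is None or fx < self.best_feas[0]:
                self.best_feas = (fx, x)
                return True
        # Simplified dominance test (lexicographic in (f, h)); the actual
        # progressive barrier uses a nondominance comparison.
        elif self.best_infeas is None or (fx, hx) < self.best_infeas[:2]:
            self.best_infeas = (fx, hx, x)
            self.h_max = hx                     # progressively tighten
            return True
        return False

# Objective f = ||x||^2 with one constraint x1 + x2 - 1 <= 0
pb = ProgressiveBarrier()
f = lambda x: float(x @ x)
c = lambda x: np.array([x[0] + x[1] - 1.0])
for x in [np.array([2.0, 2.0]), np.array([1.0, 0.5]), np.array([0.2, 0.2])]:
    print(x, pb.update(f, c, x), pb.h_max)
```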

Manifold Sampling for L1 Nonconvex Optimization

We present a new algorithm, called manifold sampling, for the unconstrained minimization of a nonsmooth composite function $h\circ F$ when $h$ has known structure. In particular, by classifying points in the domain of the nonsmooth function $h$ into manifolds, we adapt search directions within a trust-region framework based on knowledge of manifolds intersecting the current … Read more
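For $h = \|\cdot\|_1$ the manifolds are simply the sign patterns of $F(x)$: on each pattern, $h \circ F$ is smooth with gradient $\nabla F \, \mathrm{sign}(F)$. A toy classification is easy to demonstrate (illustrative only; the actual algorithm embeds this in a trust-region loop with model Jacobians):

```python
import numpy as np

def sign_pattern(z, tol=1e-8):
    """Manifold label of z for the L1 norm: -1, 0, or +1 per component."""
    s = np.sign(z)
    s[np.abs(z) <= tol] = 0      # components sitting on a kink get label 0
    return s

def l1_generators(J, z, tol=1e-8):
    """Gradients J^T s of the smooth pieces adjacent to z: a component at a
    kink (label 0) may take either sign, so enumerate both choices."""
    s = sign_pattern(z, tol)
    free = np.flatnonzero(s == 0)
    gens = []
    for bits in range(2 ** len(free)):
        si = s.copy()
        for k, i in enumerate(free):
            si[i] = 1.0 if (bits >> k) & 1 else -1.0
        gens.append(J.T @ si)
    return gens

# Example: F(x) = (x1, x2 - 1) at x = (0, 1): both components sit on kinks,
# so four smooth pieces meet and four generators result.
J = np.eye(2)                     # Jacobian of this affine F
z = np.array([0.0, 0.0])
for g in l1_generators(J, z):
    print(g)
```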

Global convergence of a derivative-free inexact restoration filter algorithm for nonlinear programming

In this work we present an algorithm for solving constrained optimization problems that does not make explicit use of the objective function derivatives. The algorithm mixes an inexact restoration framework with filter techniques, where the forbidden regions can be given by the flat or slanting filter rule. Each iteration is decomposed into two independent phases: … Read more
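The filter's "forbidden region" rule is easy to sketch on its own. The margin below follows the flat rule, and a slanting rule would replace $f_j - \gamma h_j$ with $f_j - \gamma h$ (the constant $\gamma$ and the bookkeeping are illustrative; the two-phase restoration and optimization steps are not reproduced):

```python
# A minimal filter over (f, h) pairs, where h is an infeasibility measure:
# a trial point is acceptable only if no stored pair dominates it within
# the margins.
gamma = 1e-3

class Filter:
    def __init__(self):
        self.entries = []            # list of (f_j, h_j) pairs

    def acceptable(self, f, h):
        """Flat-rule acceptance: improve f or h by a margin vs every entry."""
        return all(f <= fj - gamma * hj or h <= (1 - gamma) * hj
                   for fj, hj in self.entries)

    def add(self, f, h):
        # Drop entries the new pair dominates, then store it
        self.entries = [(fj, hj) for fj, hj in self.entries
                        if fj < f or hj < h]
        self.entries.append((f, h))

flt = Filter()
flt.add(10.0, 2.0)
print(flt.acceptable(9.0, 2.5))   # True: sufficiently better objective
print(flt.acceptable(11.0, 2.1))  # False: dominated by (10, 2)
```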

Randomized Derivative-Free Optimization of Noisy Convex Functions

We propose STARS, a randomized derivative-free algorithm for unconstrained optimization when the function evaluations are contaminated with random noise. STARS takes dynamic, noise-adjusted smoothing step-sizes that minimize the least-squares error between the true directional derivative of a noisy function and its finite difference approximation. We provide a convergence rate analysis of STARS for solving convex … Read more
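A schematic Gaussian-smoothing iteration in the spirit of STARS (the constants used below for the smoothing step $\mu$ and the stepsize $\eta$ are illustrative placeholders of mine, not the paper's noise-optimal choices):

```python
import numpy as np

def stars_sketch(f, x0, sigma, L1=10.0, n_iter=500, seed=0):
    """Forward differences along random Gaussian directions, with a
    smoothing stepsize mu that grows with the noise level sigma."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    n = x.size
    mu = (sigma ** 2 / L1 ** 2) ** 0.25 + 1e-12   # noise-adjusted smoothing step
    eta = 1.0 / (4 * L1 * (n + 4))                # conservative stepsize
    for _ in range(n_iter):
        u = rng.standard_normal(n)
        g = (f(x + mu * u) - f(x)) / mu * u       # noisy directional estimate
        x = x - eta * g
    return x

# Convex quadratic contaminated with additive noise of known std sigma
sigma = 1e-3
noise_rng = np.random.default_rng(1)
f = lambda x: float(x @ x) + sigma * noise_rng.standard_normal()
print(stars_sketch(f, np.array([2.0, -1.5]), sigma))
```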